00:00:00.001 Started by upstream project "autotest-per-patch" build number 126255
00:00:00.001 originally caused by:
00:00:00.001 Started by user sys_sgci
00:00:00.010 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.011 The recommended git tool is: git
00:00:00.011 using credential 00000000-0000-0000-0000-000000000002
00:00:00.013 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.027 Fetching changes from the remote Git repository
00:00:00.030 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.042 Using shallow fetch with depth 1
00:00:00.042 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.042 > git --version # timeout=10
00:00:00.061 > git --version # 'git version 2.39.2'
00:00:00.061 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.092 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.092 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:02.537 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:02.547 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:02.557 Checking out Revision 7caca6989ac753a10259529aadac5754060382af (FETCH_HEAD)
00:00:02.557 > git config core.sparsecheckout # timeout=10
00:00:02.568 > git read-tree -mu HEAD # timeout=10
00:00:02.586 > git checkout -f 7caca6989ac753a10259529aadac5754060382af # timeout=5
00:00:02.608 Commit message: "jenkins/jjb-config: Purge centos leftovers"
00:00:02.608 > git rev-list --no-walk 7caca6989ac753a10259529aadac5754060382af # timeout=10
00:00:02.748 [Pipeline] Start of Pipeline
00:00:02.764 [Pipeline] library
00:00:02.766 Loading library shm_lib@master
00:00:02.766 Library shm_lib@master is cached. Copying from home.
00:00:02.785 [Pipeline] node
00:00:02.797 Running on WFP50 in /var/jenkins/workspace/crypto-phy-autotest
00:00:02.799 [Pipeline] {
00:00:02.812 [Pipeline] catchError
00:00:02.814 [Pipeline] {
00:00:02.827 [Pipeline] wrap
00:00:02.839 [Pipeline] {
00:00:02.848 [Pipeline] stage
00:00:02.851 [Pipeline] { (Prologue)
00:00:03.053 [Pipeline] sh
00:00:03.337 + logger -p user.info -t JENKINS-CI
00:00:03.356 [Pipeline] echo
00:00:03.358 Node: WFP50
00:00:03.366 [Pipeline] sh
00:00:03.658 [Pipeline] setCustomBuildProperty
00:00:03.668 [Pipeline] echo
00:00:03.669 Cleanup processes
00:00:03.673 [Pipeline] sh
00:00:03.948 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:03.948 3329793 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:03.959 [Pipeline] sh
00:00:04.236 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:04.236 ++ grep -v 'sudo pgrep'
00:00:04.236 ++ awk '{print $1}'
00:00:04.236 + sudo kill -9
00:00:04.236 + true
00:00:04.248 [Pipeline] cleanWs
00:00:04.255 [WS-CLEANUP] Deleting project workspace...
00:00:04.255 [WS-CLEANUP] Deferred wipeout is used...
00:00:04.261 [WS-CLEANUP] done
00:00:04.265 [Pipeline] setCustomBuildProperty
00:00:04.278 [Pipeline] sh
00:00:04.558 + sudo git config --global --replace-all safe.directory '*'
00:00:04.629 [Pipeline] httpRequest
00:00:04.645 [Pipeline] echo
00:00:04.646 Sorcerer 10.211.164.101 is alive
00:00:04.654 [Pipeline] httpRequest
00:00:04.657 HttpMethod: GET
00:00:04.658 URL: http://10.211.164.101/packages/jbp_7caca6989ac753a10259529aadac5754060382af.tar.gz
00:00:04.659 Sending request to url: http://10.211.164.101/packages/jbp_7caca6989ac753a10259529aadac5754060382af.tar.gz
00:00:04.661 Response Code: HTTP/1.1 200 OK
00:00:04.661 Success: Status code 200 is in the accepted range: 200,404
00:00:04.662 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_7caca6989ac753a10259529aadac5754060382af.tar.gz
00:00:05.212 [Pipeline] sh
00:00:05.487 + tar --no-same-owner -xf jbp_7caca6989ac753a10259529aadac5754060382af.tar.gz
00:00:05.500 [Pipeline] httpRequest
00:00:05.512 [Pipeline] echo
00:00:05.514 Sorcerer 10.211.164.101 is alive
00:00:05.519 [Pipeline] httpRequest
00:00:05.523 HttpMethod: GET
00:00:05.524 URL: http://10.211.164.101/packages/spdk_406b3b1b5623aaa2c1d9028f91d64100a2de2b96.tar.gz
00:00:05.524 Sending request to url: http://10.211.164.101/packages/spdk_406b3b1b5623aaa2c1d9028f91d64100a2de2b96.tar.gz
00:00:05.534 Response Code: HTTP/1.1 200 OK
00:00:05.534 Success: Status code 200 is in the accepted range: 200,404
00:00:05.535 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_406b3b1b5623aaa2c1d9028f91d64100a2de2b96.tar.gz
00:00:47.421 [Pipeline] sh
00:00:47.698 + tar --no-same-owner -xf spdk_406b3b1b5623aaa2c1d9028f91d64100a2de2b96.tar.gz
00:00:51.912 [Pipeline] sh
00:00:52.196 + git -C spdk log --oneline -n5
00:00:52.196 406b3b1b5 util: allow NULL saddr/caddr for spdk_net_getaddr
00:00:52.196 1053f1b13 util: don't allow users to pass caddr/cport for listen sockets
00:00:52.196 0663932f5 util: add spdk_net_getaddr
00:00:52.196 9da437b46 util: move module/sock/sock_kernel.h contents to net.c
00:00:52.196 35c6d81e6 util: add spdk_net_get_interface_name
00:00:52.209 [Pipeline] }
00:00:52.227 [Pipeline] // stage
00:00:52.240 [Pipeline] stage
00:00:52.244 [Pipeline] { (Prepare)
00:00:52.267 [Pipeline] writeFile
00:00:52.288 [Pipeline] sh
00:00:52.573 + logger -p user.info -t JENKINS-CI
00:00:52.621 [Pipeline] sh
00:00:52.903 + logger -p user.info -t JENKINS-CI
00:00:52.917 [Pipeline] sh
00:00:53.199 + cat autorun-spdk.conf
00:00:53.199 SPDK_RUN_FUNCTIONAL_TEST=1
00:00:53.199 SPDK_TEST_BLOCKDEV=1
00:00:53.199 SPDK_TEST_ISAL=1
00:00:53.199 SPDK_TEST_CRYPTO=1
00:00:53.199 SPDK_TEST_REDUCE=1
00:00:53.199 SPDK_TEST_VBDEV_COMPRESS=1
00:00:53.199 SPDK_RUN_UBSAN=1
00:00:53.205 RUN_NIGHTLY=0
00:00:53.211 [Pipeline] readFile
00:00:53.239 [Pipeline] withEnv
00:00:53.241 [Pipeline] {
00:00:53.257 [Pipeline] sh
00:00:53.540 + set -ex
00:00:53.541 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]]
00:00:53.541 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:00:53.541 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:00:53.541 ++ SPDK_TEST_BLOCKDEV=1
00:00:53.541 ++ SPDK_TEST_ISAL=1
00:00:53.541 ++ SPDK_TEST_CRYPTO=1
00:00:53.541 ++ SPDK_TEST_REDUCE=1
00:00:53.541 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:00:53.541 ++ SPDK_RUN_UBSAN=1
00:00:53.541 ++ RUN_NIGHTLY=0
00:00:53.541 + case $SPDK_TEST_NVMF_NICS in
00:00:53.541 + DRIVERS=
00:00:53.541 + [[ -n '' ]]
00:00:53.541 + exit 0
00:00:53.549 [Pipeline] }
00:00:53.566 [Pipeline] // withEnv
00:00:53.571 [Pipeline] }
00:00:53.588 [Pipeline] // stage
00:00:53.598 [Pipeline] catchError
00:00:53.600 [Pipeline] {
00:00:53.614 [Pipeline] timeout
00:00:53.614 Timeout set to expire in 40 min
00:00:53.616 [Pipeline] {
00:00:53.628 [Pipeline] stage
00:00:53.629 [Pipeline] { (Tests)
00:00:53.643 [Pipeline] sh
00:00:53.927 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest
00:00:53.927 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest
00:00:53.927 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest
00:00:53.927 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]]
00:00:53.927 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:53.927 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output
00:00:53.927 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]]
00:00:53.927 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:00:53.927 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output
00:00:53.927 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:00:53.927 + [[ crypto-phy-autotest == pkgdep-* ]]
00:00:53.927 + cd /var/jenkins/workspace/crypto-phy-autotest
00:00:53.927 + source /etc/os-release
00:00:53.927 ++ NAME='Fedora Linux'
00:00:53.927 ++ VERSION='38 (Cloud Edition)'
00:00:53.927 ++ ID=fedora
00:00:53.927 ++ VERSION_ID=38
00:00:53.927 ++ VERSION_CODENAME=
00:00:53.927 ++ PLATFORM_ID=platform:f38
00:00:53.927 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:00:53.927 ++ ANSI_COLOR='0;38;2;60;110;180'
00:00:53.927 ++ LOGO=fedora-logo-icon
00:00:53.927 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:00:53.927 ++ HOME_URL=https://fedoraproject.org/
00:00:53.927 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:00:53.927 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:00:53.927 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:00:53.927 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:00:53.927 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:00:53.927 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:00:53.927 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:00:53.927 ++ SUPPORT_END=2024-05-14
00:00:53.927 ++ VARIANT='Cloud Edition'
00:00:53.927 ++ VARIANT_ID=cloud
00:00:53.927 + uname -a
00:00:53.927 Linux spdk-wfp-50 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux
00:00:53.927 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:00:57.212 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:00:57.212 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:00:57.212 Hugepages
00:00:57.212 node hugesize free / total
00:00:57.212 node0 1048576kB 0 / 0
00:00:57.212 node0 2048kB 0 / 0
00:00:57.212 node1 1048576kB 0 / 0
00:00:57.212 node1 2048kB 0 / 0
00:00:57.212
00:00:57.212 Type BDF Vendor Device NUMA Driver Device Block devices
00:00:57.212 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:00:57.212 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:00:57.212 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:00:57.212 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:00:57.212 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:00:57.212 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:00:57.212 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:00:57.212 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:00:57.470 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1
00:00:57.470 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:00:57.470 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:00:57.470 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:00:57.470 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:00:57.470 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:00:57.471 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:00:57.471 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:00:57.471 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:00:57.471 VMD 0000:85:05.5 8086 201d 1 vfio-pci - -
00:00:57.471 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - -
00:00:57.471 + rm -f /tmp/spdk-ld-path
00:00:57.471 + source autorun-spdk.conf
00:00:57.471 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:00:57.471 ++ SPDK_TEST_BLOCKDEV=1
00:00:57.471 ++ SPDK_TEST_ISAL=1
00:00:57.471 ++ SPDK_TEST_CRYPTO=1
00:00:57.471 ++ SPDK_TEST_REDUCE=1
00:00:57.471 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:00:57.471 ++ SPDK_RUN_UBSAN=1
00:00:57.471 ++ RUN_NIGHTLY=0
00:00:57.471 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:00:57.471 + [[ -n '' ]]
00:00:57.471 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:57.471 + for M in /var/spdk/build-*-manifest.txt
00:00:57.471 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:00:57.471 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:00:57.471 + for M in /var/spdk/build-*-manifest.txt
00:00:57.471 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:00:57.471 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:00:57.471 ++ uname
00:00:57.471 + [[ Linux == \L\i\n\u\x ]]
00:00:57.471 + sudo dmesg -T
00:00:57.471 + sudo dmesg --clear
00:00:57.471 + dmesg_pid=3330766
00:00:57.471 + [[ Fedora Linux == FreeBSD ]]
00:00:57.471 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:00:57.471 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:00:57.471 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:00:57.471 + [[ -x /usr/src/fio-static/fio ]]
00:00:57.471 + export FIO_BIN=/usr/src/fio-static/fio
00:00:57.471 + FIO_BIN=/usr/src/fio-static/fio
00:00:57.471 + sudo dmesg -Tw
00:00:57.471 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:00:57.471 + [[ ! -v VFIO_QEMU_BIN ]]
00:00:57.471 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:00:57.471 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:00:57.471 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:00:57.471 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:00:57.471 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:00:57.471 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:00:57.471 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:00:57.471 Test configuration:
00:00:57.471 SPDK_RUN_FUNCTIONAL_TEST=1
00:00:57.471 SPDK_TEST_BLOCKDEV=1
00:00:57.471 SPDK_TEST_ISAL=1
00:00:57.471 SPDK_TEST_CRYPTO=1
00:00:57.471 SPDK_TEST_REDUCE=1
00:00:57.471 SPDK_TEST_VBDEV_COMPRESS=1
00:00:57.471 SPDK_RUN_UBSAN=1
00:00:57.730 RUN_NIGHTLY=0 23:55:44 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:00:57.730 23:55:44 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:00:57.730 23:55:44 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:00:57.730 23:55:44 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:00:57.730 23:55:44 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:57.730 23:55:44 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:57.730 23:55:44 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:57.730 23:55:44 -- paths/export.sh@5 -- $ export PATH
00:00:57.730 23:55:44 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:57.730 23:55:44 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:00:57.730 23:55:44 -- common/autobuild_common.sh@444 -- $ date +%s
00:00:57.730 23:55:44 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1721080544.XXXXXX
00:00:57.730 23:55:44 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1721080544.Rmf9vq
00:00:57.730 23:55:44 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]]
00:00:57.730 23:55:44 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']'
00:00:57.730 23:55:44 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:00:57.730 23:55:44 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:00:57.730 23:55:44 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:00:57.730 23:55:44 -- common/autobuild_common.sh@460 -- $ get_config_params
00:00:57.730 23:55:44 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:00:57.730 23:55:44 -- common/autotest_common.sh@10 -- $ set +x
00:00:57.730 23:55:44 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:00:57.730 23:55:44 -- common/autobuild_common.sh@462 -- $ start_monitor_resources
00:00:57.730 23:55:44 -- pm/common@17 -- $ local monitor
00:00:57.730 23:55:44 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:57.730 23:55:44 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:57.730 23:55:44 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:57.730 23:55:44 -- pm/common@21 -- $ date +%s
00:00:57.730 23:55:44 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:57.730 23:55:44 -- pm/common@21 -- $ date +%s
00:00:57.730 23:55:44 -- pm/common@25 -- $ sleep 1
00:00:57.730 23:55:44 -- pm/common@21 -- $ date +%s
00:00:57.730 23:55:44 -- pm/common@21 -- $ date +%s
00:00:57.730 23:55:44 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721080544
00:00:57.730 23:55:44 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721080544
00:00:57.730 23:55:44 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721080544
00:00:57.730 23:55:44 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721080544
00:00:57.730 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721080544_collect-vmstat.pm.log
00:00:57.730 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721080544_collect-cpu-load.pm.log
00:00:57.730 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721080544_collect-cpu-temp.pm.log
00:00:57.730 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721080544_collect-bmc-pm.bmc.pm.log
00:00:58.668 23:55:45 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT
00:00:58.668 23:55:45 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:00:58.668 23:55:45 -- spdk/autobuild.sh@12 -- $ umask 022
00:00:58.668 23:55:45 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:58.668 23:55:45 -- spdk/autobuild.sh@16 -- $ date -u
00:00:58.668 Mon Jul 15 09:55:45 PM UTC 2024
00:00:58.668 23:55:45 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:00:58.668 v24.09-pre-219-g406b3b1b5
00:00:58.668 23:55:45 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:00:58.668 23:55:45 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:00:58.668 23:55:45 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:00:58.668 23:55:45 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']'
00:00:58.668 23:55:45 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:00:58.668 23:55:45 -- common/autotest_common.sh@10 -- $ set +x
00:00:58.668 ************************************
00:00:58.668 START TEST ubsan
00:00:58.668 ************************************
00:00:58.668 23:55:45 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan'
00:00:58.668 using ubsan
00:00:58.668
00:00:58.668 real 0m0.001s
00:00:58.668 user 0m0.001s
00:00:58.668 sys 0m0.000s
00:00:58.668 23:55:45 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable
00:00:58.668 23:55:45 ubsan -- common/autotest_common.sh@10 -- $ set +x
00:00:58.668 ************************************
00:00:58.668 END TEST ubsan
00:00:58.668 ************************************
00:00:58.927 23:55:45 -- common/autotest_common.sh@1142 -- $ return 0
00:00:58.927 23:55:45 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:00:58.927 23:55:45 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:00:58.927 23:55:45 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:00:58.927 23:55:45 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]]
00:00:58.927 23:55:45 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:00:58.927 23:55:45 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:00:58.927 23:55:45 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:00:58.927 23:55:45 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]]
00:00:58.927 23:55:45 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared
00:00:58.927 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk
00:00:58.927 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:00:59.516 Using 'verbs' RDMA provider
00:01:15.781 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done.
00:01:30.695 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:01:30.695 Creating mk/config.mk...done.
00:01:30.695 Creating mk/cc.flags.mk...done.
00:01:30.695 Type 'make' to build.
00:01:30.695 23:56:16 -- spdk/autobuild.sh@69 -- $ run_test make make -j72
00:01:30.695 23:56:16 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']'
00:01:30.695 23:56:16 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:01:30.695 23:56:16 -- common/autotest_common.sh@10 -- $ set +x
00:01:30.695 ************************************
00:01:30.695 START TEST make
00:01:30.695 ************************************
00:01:30.695 23:56:16 make -- common/autotest_common.sh@1123 -- $ make -j72
00:01:30.695 make[1]: Nothing to be done for 'all'.
00:02:09.459 The Meson build system
00:02:09.459 Version: 1.3.1
00:02:09.459 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk
00:02:09.459 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp
00:02:09.459 Build type: native build
00:02:09.459 Program cat found: YES (/usr/bin/cat)
00:02:09.459 Project name: DPDK
00:02:09.459 Project version: 24.03.0
00:02:09.459 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)")
00:02:09.459 C linker for the host machine: cc ld.bfd 2.39-16
00:02:09.459 Host machine cpu family: x86_64
00:02:09.459 Host machine cpu: x86_64
00:02:09.459 Message: ## Building in Developer Mode ##
00:02:09.459 Program pkg-config found: YES (/usr/bin/pkg-config)
00:02:09.459 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:02:09.459 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:02:09.459 Program python3 found: YES (/usr/bin/python3)
00:02:09.459 Program cat found: YES (/usr/bin/cat)
00:02:09.459 Compiler for C supports arguments -march=native: YES
00:02:09.459 Checking for size of "void *" : 8
00:02:09.459 Checking for size of "void *" : 8 (cached)
00:02:09.459 Compiler for C supports link arguments -Wl,--undefined-version: NO
00:02:09.459 Library m found: YES
00:02:09.459 Library numa found: YES
00:02:09.459 Has header "numaif.h" : YES
00:02:09.459 Library fdt found: NO
00:02:09.459 Library execinfo found: NO
00:02:09.459 Has header "execinfo.h" : YES
00:02:09.459 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:02:09.459 Run-time dependency libarchive found: NO (tried pkgconfig)
00:02:09.459 Run-time dependency libbsd found: NO (tried pkgconfig)
00:02:09.459 Run-time dependency jansson found: NO (tried pkgconfig)
00:02:09.459 Run-time dependency openssl found: YES 3.0.9
00:02:09.459 Run-time dependency libpcap found: YES 1.10.4
00:02:09.459 Has header "pcap.h" with dependency libpcap: YES
00:02:09.459 Compiler for C supports arguments -Wcast-qual: YES
00:02:09.459 Compiler for C supports arguments -Wdeprecated: YES
00:02:09.459 Compiler for C supports arguments -Wformat: YES
00:02:09.459 Compiler for C supports arguments -Wformat-nonliteral: NO
00:02:09.459 Compiler for C supports arguments -Wformat-security: NO
00:02:09.459 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:09.459 Compiler for C supports arguments -Wmissing-prototypes: YES
00:02:09.459 Compiler for C supports arguments -Wnested-externs: YES
00:02:09.459 Compiler for C supports arguments -Wold-style-definition: YES
00:02:09.459 Compiler for C supports arguments -Wpointer-arith: YES
00:02:09.459 Compiler for C supports arguments -Wsign-compare: YES
00:02:09.459 Compiler for C supports arguments -Wstrict-prototypes: YES
00:02:09.459 Compiler for C supports arguments -Wundef: YES
00:02:09.460 Compiler for C supports arguments -Wwrite-strings: YES
00:02:09.460 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:02:09.460 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:02:09.460 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:09.460 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:02:09.460 Program objdump found: YES (/usr/bin/objdump)
00:02:09.460 Compiler for C supports arguments -mavx512f: YES
00:02:09.460 Checking if "AVX512 checking" compiles: YES
00:02:09.460 Fetching value of define "__SSE4_2__" : 1
00:02:09.460 Fetching value of define "__AES__" : 1
00:02:09.460 Fetching value of define "__AVX__" : 1
00:02:09.460 Fetching value of define "__AVX2__" : 1
00:02:09.460 Fetching value of define "__AVX512BW__" : 1
00:02:09.460 Fetching value of define "__AVX512CD__" : 1
00:02:09.460 Fetching value of define "__AVX512DQ__" : 1
00:02:09.460 Fetching value of define "__AVX512F__" : 1
00:02:09.460 Fetching value of define "__AVX512VL__" : 1
00:02:09.460 Fetching value of define "__PCLMUL__" : 1
00:02:09.460 Fetching value of define "__RDRND__" : 1
00:02:09.460 Fetching value of define "__RDSEED__" : 1
00:02:09.460 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:02:09.460 Fetching value of define "__znver1__" : (undefined)
00:02:09.460 Fetching value of define "__znver2__" : (undefined)
00:02:09.460 Fetching value of define "__znver3__" : (undefined)
00:02:09.460 Fetching value of define "__znver4__" : (undefined)
00:02:09.460 Compiler for C supports arguments -Wno-format-truncation: YES
00:02:09.460 Message: lib/log: Defining dependency "log"
00:02:09.460 Message: lib/kvargs: Defining dependency "kvargs"
00:02:09.460 Message: lib/telemetry: Defining dependency "telemetry"
00:02:09.460 Checking for function "getentropy" : NO
00:02:09.460 Message: lib/eal: Defining dependency "eal"
00:02:09.460 Message: lib/ring: Defining dependency "ring"
00:02:09.460 Message: lib/rcu: Defining dependency "rcu"
00:02:09.460 Message: lib/mempool: Defining dependency "mempool"
00:02:09.460 Message: lib/mbuf: Defining dependency "mbuf"
00:02:09.460 Fetching value of define "__PCLMUL__" : 1 (cached)
00:02:09.460 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:09.460 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:09.460 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:09.460 Fetching value of define "__AVX512VL__" : 1 (cached)
00:02:09.460 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:02:09.460 Compiler for C supports arguments -mpclmul: YES
00:02:09.460 Compiler for C supports arguments -maes: YES
00:02:09.460 Compiler for C supports arguments -mavx512f: YES (cached)
00:02:09.460 Compiler for C supports arguments -mavx512bw: YES
00:02:09.460 Compiler for C supports arguments -mavx512dq: YES
00:02:09.460 Compiler for C supports arguments -mavx512vl: YES
00:02:09.460 Compiler for C supports arguments -mvpclmulqdq: YES
00:02:09.460 Compiler for C supports arguments -mavx2: YES
00:02:09.460 Compiler for C supports arguments -mavx: YES
00:02:09.460 Message: lib/net: Defining dependency "net"
00:02:09.460 Message: lib/meter: Defining dependency "meter"
00:02:09.460 Message: lib/ethdev: Defining dependency "ethdev"
00:02:09.460 Message: lib/pci: Defining dependency "pci"
00:02:09.460 Message: lib/cmdline: Defining dependency "cmdline"
00:02:09.460 Message: lib/hash: Defining dependency "hash"
00:02:09.460 Message: lib/timer: Defining dependency "timer"
00:02:09.460 Message: lib/compressdev: Defining dependency "compressdev"
00:02:09.460 Message: lib/cryptodev: Defining dependency "cryptodev"
00:02:09.460 Message: lib/dmadev: Defining dependency "dmadev"
00:02:09.460 Compiler for C supports arguments -Wno-cast-qual: YES
00:02:09.460 Message: lib/power: Defining dependency "power"
00:02:09.460 Message: lib/reorder: Defining dependency "reorder"
00:02:09.460 Message: lib/security: Defining dependency "security"
00:02:09.460 Has header "linux/userfaultfd.h" : YES
00:02:09.460 Has header "linux/vduse.h" : YES
00:02:09.460 Message: lib/vhost: Defining dependency "vhost"
00:02:09.460 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:02:09.460 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary"
00:02:09.460 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:02:09.460 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:02:09.460 Compiler for C supports arguments -std=c11: YES
00:02:09.460 Compiler for C supports arguments -Wno-strict-prototypes: YES
00:02:09.460 Compiler for C supports arguments -D_BSD_SOURCE: YES
00:02:09.460 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES
00:02:09.460 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES
00:02:09.460 Run-time dependency libmlx5 found: YES 1.24.44.0
00:02:09.460 Run-time dependency libibverbs found: YES 1.14.44.0
00:02:09.460 Library mtcr_ul found: NO
00:02:09.460 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES
00:02:09.460 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES
00:02:09.460 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES
00:02:09.460 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES
00:02:09.460 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES
00:02:09.460 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES
00:02:09.460 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES
00:02:09.460 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES
00:02:09.460 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES
00:02:09.460 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES
00:02:09.460 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES
00:02:09.460 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO
00:02:09.460 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO
00:02:09.460 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES
00:02:09.460 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES
00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES
00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES
00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES
00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES
00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES
00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES
00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES
00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES
00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES
00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:02:16.034 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, 
libibverbs: YES 00:02:16.034 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" 
with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:02:16.034 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:02:16.034 Configuring mlx5_autoconf.h using configuration 00:02:16.034 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:02:16.034 Run-time dependency libcrypto found: YES 3.0.9 00:02:16.034 Library IPSec_MB found: YES 00:02:16.034 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:02:16.034 Message: drivers/common/qat: Defining dependency "common_qat" 00:02:16.034 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:16.034 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:02:16.034 Library IPSec_MB found: YES 00:02:16.034 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:02:16.034 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:02:16.034 Compiler for C supports 
arguments -std=c11: YES (cached) 00:02:16.034 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:16.034 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:16.034 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:16.034 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:16.034 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:02:16.034 Run-time dependency libisal found: NO (tried pkgconfig) 00:02:16.034 Library libisal found: NO 00:02:16.034 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:02:16.034 Compiler for C supports arguments -std=c11: YES (cached) 00:02:16.034 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:16.034 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:16.034 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:16.034 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:16.034 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:02:16.034 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:02:16.034 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:02:16.034 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:02:16.034 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:02:16.034 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:02:16.034 Program doxygen found: YES (/usr/bin/doxygen) 00:02:16.034 Configuring doxy-api-html.conf using configuration 00:02:16.034 Configuring doxy-api-man.conf using configuration 00:02:16.034 Program mandb found: YES (/usr/bin/mandb) 00:02:16.034 Program sphinx-build found: NO 00:02:16.034 Configuring rte_build_config.h using configuration 00:02:16.034 Message: 00:02:16.034 ================= 00:02:16.034 Applications Enabled 00:02:16.034 ================= 00:02:16.034 
00:02:16.034 apps:
00:02:16.034
00:02:16.034
00:02:16.034 Message:
00:02:16.034 =================
00:02:16.034 Libraries Enabled
00:02:16.034 =================
00:02:16.034
00:02:16.034 libs:
00:02:16.034 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:02:16.034 net, meter, ethdev, pci, cmdline, hash, timer, compressdev,
00:02:16.034 cryptodev, dmadev, power, reorder, security, vhost,
00:02:16.034
00:02:16.034 Message:
00:02:16.034 ===============
00:02:16.034 Drivers Enabled
00:02:16.034 ===============
00:02:16.034
00:02:16.034 common:
00:02:16.034 mlx5, qat,
00:02:16.034 bus:
00:02:16.035 auxiliary, pci, vdev,
00:02:16.035 mempool:
00:02:16.035 ring,
00:02:16.035 dma:
00:02:16.035
00:02:16.035 net:
00:02:16.035
00:02:16.035 crypto:
00:02:16.035 ipsec_mb, mlx5,
00:02:16.035 compress:
00:02:16.035 isal, mlx5,
00:02:16.035 vdpa:
00:02:16.035
00:02:16.035
00:02:16.035 Message:
00:02:16.035 =================
00:02:16.035 Content Skipped
00:02:16.035 =================
00:02:16.035
00:02:16.035 apps:
00:02:16.035 dumpcap: explicitly disabled via build config
00:02:16.035 graph: explicitly disabled via build config
00:02:16.035 pdump: explicitly disabled via build config
00:02:16.035 proc-info: explicitly disabled via build config
00:02:16.035 test-acl: explicitly disabled via build config
00:02:16.035 test-bbdev: explicitly disabled via build config
00:02:16.035 test-cmdline: explicitly disabled via build config
00:02:16.035 test-compress-perf: explicitly disabled via build config
00:02:16.035 test-crypto-perf: explicitly disabled via build config
00:02:16.035 test-dma-perf: explicitly disabled via build config
00:02:16.035 test-eventdev: explicitly disabled via build config
00:02:16.035 test-fib: explicitly disabled via build config
00:02:16.035 test-flow-perf: explicitly disabled via build config
00:02:16.035 test-gpudev: explicitly disabled via build config
00:02:16.035 test-mldev: explicitly disabled via build config
00:02:16.035 test-pipeline: explicitly disabled via build config
00:02:16.035 test-pmd: explicitly disabled via build config
00:02:16.035 test-regex: explicitly disabled via build config
00:02:16.035 test-sad: explicitly disabled via build config
00:02:16.035 test-security-perf: explicitly disabled via build config
00:02:16.035
00:02:16.035 libs:
00:02:16.035 argparse: explicitly disabled via build config
00:02:16.035 metrics: explicitly disabled via build config
00:02:16.035 acl: explicitly disabled via build config
00:02:16.035 bbdev: explicitly disabled via build config
00:02:16.035 bitratestats: explicitly disabled via build config
00:02:16.035 bpf: explicitly disabled via build config
00:02:16.035 cfgfile: explicitly disabled via build config
00:02:16.035 distributor: explicitly disabled via build config
00:02:16.035 efd: explicitly disabled via build config
00:02:16.035 eventdev: explicitly disabled via build config
00:02:16.035 dispatcher: explicitly disabled via build config
00:02:16.035 gpudev: explicitly disabled via build config
00:02:16.035 gro: explicitly disabled via build config
00:02:16.035 gso: explicitly disabled via build config
00:02:16.035 ip_frag: explicitly disabled via build config
00:02:16.035 jobstats: explicitly disabled via build config
00:02:16.035 latencystats: explicitly disabled via build config
00:02:16.035 lpm: explicitly disabled via build config
00:02:16.035 member: explicitly disabled via build config
00:02:16.035 pcapng: explicitly disabled via build config
00:02:16.035 rawdev: explicitly disabled via build config
00:02:16.035 regexdev: explicitly disabled via build config
00:02:16.035 mldev: explicitly disabled via build config
00:02:16.035 rib: explicitly disabled via build config
00:02:16.035 sched: explicitly disabled via build config
00:02:16.035 stack: explicitly disabled via build config
00:02:16.035 ipsec: explicitly disabled via build config
00:02:16.035 pdcp: explicitly disabled via build config
00:02:16.035 fib: explicitly disabled via build config
00:02:16.035 port: explicitly disabled via build config 00:02:16.035 pdump: explicitly disabled via build config 00:02:16.035 table: explicitly disabled via build config 00:02:16.035 pipeline: explicitly disabled via build config 00:02:16.035 graph: explicitly disabled via build config 00:02:16.035 node: explicitly disabled via build config 00:02:16.035 00:02:16.035 drivers: 00:02:16.035 common/cpt: not in enabled drivers build config 00:02:16.035 common/dpaax: not in enabled drivers build config 00:02:16.035 common/iavf: not in enabled drivers build config 00:02:16.035 common/idpf: not in enabled drivers build config 00:02:16.035 common/ionic: not in enabled drivers build config 00:02:16.035 common/mvep: not in enabled drivers build config 00:02:16.035 common/octeontx: not in enabled drivers build config 00:02:16.035 bus/cdx: not in enabled drivers build config 00:02:16.035 bus/dpaa: not in enabled drivers build config 00:02:16.035 bus/fslmc: not in enabled drivers build config 00:02:16.035 bus/ifpga: not in enabled drivers build config 00:02:16.035 bus/platform: not in enabled drivers build config 00:02:16.035 bus/uacce: not in enabled drivers build config 00:02:16.035 bus/vmbus: not in enabled drivers build config 00:02:16.035 common/cnxk: not in enabled drivers build config 00:02:16.035 common/nfp: not in enabled drivers build config 00:02:16.035 common/nitrox: not in enabled drivers build config 00:02:16.035 common/sfc_efx: not in enabled drivers build config 00:02:16.035 mempool/bucket: not in enabled drivers build config 00:02:16.035 mempool/cnxk: not in enabled drivers build config 00:02:16.035 mempool/dpaa: not in enabled drivers build config 00:02:16.035 mempool/dpaa2: not in enabled drivers build config 00:02:16.035 mempool/octeontx: not in enabled drivers build config 00:02:16.035 mempool/stack: not in enabled drivers build config 00:02:16.035 dma/cnxk: not in enabled drivers build config 00:02:16.035 dma/dpaa: not in enabled drivers build config 
00:02:16.035 dma/dpaa2: not in enabled drivers build config 00:02:16.035 dma/hisilicon: not in enabled drivers build config 00:02:16.035 dma/idxd: not in enabled drivers build config 00:02:16.035 dma/ioat: not in enabled drivers build config 00:02:16.035 dma/skeleton: not in enabled drivers build config 00:02:16.035 net/af_packet: not in enabled drivers build config 00:02:16.035 net/af_xdp: not in enabled drivers build config 00:02:16.035 net/ark: not in enabled drivers build config 00:02:16.035 net/atlantic: not in enabled drivers build config 00:02:16.035 net/avp: not in enabled drivers build config 00:02:16.035 net/axgbe: not in enabled drivers build config 00:02:16.035 net/bnx2x: not in enabled drivers build config 00:02:16.035 net/bnxt: not in enabled drivers build config 00:02:16.035 net/bonding: not in enabled drivers build config 00:02:16.035 net/cnxk: not in enabled drivers build config 00:02:16.035 net/cpfl: not in enabled drivers build config 00:02:16.035 net/cxgbe: not in enabled drivers build config 00:02:16.035 net/dpaa: not in enabled drivers build config 00:02:16.035 net/dpaa2: not in enabled drivers build config 00:02:16.035 net/e1000: not in enabled drivers build config 00:02:16.035 net/ena: not in enabled drivers build config 00:02:16.035 net/enetc: not in enabled drivers build config 00:02:16.035 net/enetfec: not in enabled drivers build config 00:02:16.035 net/enic: not in enabled drivers build config 00:02:16.035 net/failsafe: not in enabled drivers build config 00:02:16.035 net/fm10k: not in enabled drivers build config 00:02:16.035 net/gve: not in enabled drivers build config 00:02:16.035 net/hinic: not in enabled drivers build config 00:02:16.035 net/hns3: not in enabled drivers build config 00:02:16.035 net/i40e: not in enabled drivers build config 00:02:16.035 net/iavf: not in enabled drivers build config 00:02:16.035 net/ice: not in enabled drivers build config 00:02:16.035 net/idpf: not in enabled drivers build config 00:02:16.035 
net/igc: not in enabled drivers build config 00:02:16.035 net/ionic: not in enabled drivers build config 00:02:16.035 net/ipn3ke: not in enabled drivers build config 00:02:16.035 net/ixgbe: not in enabled drivers build config 00:02:16.035 net/mana: not in enabled drivers build config 00:02:16.035 net/memif: not in enabled drivers build config 00:02:16.035 net/mlx4: not in enabled drivers build config 00:02:16.035 net/mlx5: not in enabled drivers build config 00:02:16.035 net/mvneta: not in enabled drivers build config 00:02:16.035 net/mvpp2: not in enabled drivers build config 00:02:16.035 net/netvsc: not in enabled drivers build config 00:02:16.035 net/nfb: not in enabled drivers build config 00:02:16.035 net/nfp: not in enabled drivers build config 00:02:16.035 net/ngbe: not in enabled drivers build config 00:02:16.035 net/null: not in enabled drivers build config 00:02:16.035 net/octeontx: not in enabled drivers build config 00:02:16.035 net/octeon_ep: not in enabled drivers build config 00:02:16.035 net/pcap: not in enabled drivers build config 00:02:16.035 net/pfe: not in enabled drivers build config 00:02:16.035 net/qede: not in enabled drivers build config 00:02:16.035 net/ring: not in enabled drivers build config 00:02:16.035 net/sfc: not in enabled drivers build config 00:02:16.035 net/softnic: not in enabled drivers build config 00:02:16.035 net/tap: not in enabled drivers build config 00:02:16.035 net/thunderx: not in enabled drivers build config 00:02:16.035 net/txgbe: not in enabled drivers build config 00:02:16.035 net/vdev_netvsc: not in enabled drivers build config 00:02:16.035 net/vhost: not in enabled drivers build config 00:02:16.035 net/virtio: not in enabled drivers build config 00:02:16.035 net/vmxnet3: not in enabled drivers build config 00:02:16.035 raw/*: missing internal dependency, "rawdev" 00:02:16.035 crypto/armv8: not in enabled drivers build config 00:02:16.035 crypto/bcmfs: not in enabled drivers build config 00:02:16.035 
crypto/caam_jr: not in enabled drivers build config 00:02:16.035 crypto/ccp: not in enabled drivers build config 00:02:16.035 crypto/cnxk: not in enabled drivers build config 00:02:16.035 crypto/dpaa_sec: not in enabled drivers build config 00:02:16.035 crypto/dpaa2_sec: not in enabled drivers build config 00:02:16.035 crypto/mvsam: not in enabled drivers build config 00:02:16.035 crypto/nitrox: not in enabled drivers build config 00:02:16.035 crypto/null: not in enabled drivers build config 00:02:16.035 crypto/octeontx: not in enabled drivers build config 00:02:16.035 crypto/openssl: not in enabled drivers build config 00:02:16.035 crypto/scheduler: not in enabled drivers build config 00:02:16.035 crypto/uadk: not in enabled drivers build config 00:02:16.035 crypto/virtio: not in enabled drivers build config 00:02:16.035 compress/nitrox: not in enabled drivers build config 00:02:16.035 compress/octeontx: not in enabled drivers build config 00:02:16.035 compress/zlib: not in enabled drivers build config 00:02:16.035 regex/*: missing internal dependency, "regexdev" 00:02:16.035 ml/*: missing internal dependency, "mldev" 00:02:16.035 vdpa/ifc: not in enabled drivers build config 00:02:16.035 vdpa/mlx5: not in enabled drivers build config 00:02:16.035 vdpa/nfp: not in enabled drivers build config 00:02:16.035 vdpa/sfc: not in enabled drivers build config 00:02:16.035 event/*: missing internal dependency, "eventdev" 00:02:16.035 baseband/*: missing internal dependency, "bbdev" 00:02:16.035 gpu/*: missing internal dependency, "gpudev" 00:02:16.035 00:02:16.035 00:02:16.295 Build targets in project: 115 00:02:16.295 00:02:16.295 DPDK 24.03.0 00:02:16.295 00:02:16.295 User defined options 00:02:16.295 buildtype : debug 00:02:16.295 default_library : shared 00:02:16.295 libdir : lib 00:02:16.295 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:02:16.295 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds 
-I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:02:16.295 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:02:16.295 cpu_instruction_set: native 00:02:16.295 disable_apps : test-sad,test-acl,test-dma-perf,test-pipeline,test-compress-perf,test-fib,test-flow-perf,test-crypto-perf,test-bbdev,test-eventdev,pdump,test-mldev,test-cmdline,graph,test-security-perf,test-pmd,test,proc-info,test-regex,dumpcap,test-gpudev 00:02:16.295 disable_libs : port,sched,rib,node,ipsec,distributor,gro,eventdev,pdcp,acl,member,latencystats,efd,stack,regexdev,rawdev,bpf,metrics,gpudev,pipeline,pdump,table,fib,dispatcher,mldev,gso,cfgfile,bitratestats,ip_frag,graph,lpm,jobstats,argparse,pcapng,bbdev 00:02:16.295 enable_docs : false 00:02:16.295 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:02:16.295 enable_kmods : false 00:02:16.295 max_lcores : 128 00:02:16.295 tests : false 00:02:16.295 00:02:16.295 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:16.870 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:02:16.870 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:16.870 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:16.870 [3/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:16.870 [4/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:16.870 [5/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:16.870 [6/378] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:16.870 [7/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:16.870 [8/378] Linking static target lib/librte_kvargs.a 00:02:16.870 [9/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:16.870 [10/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:16.870 [11/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:16.870 [12/378] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:16.870 [13/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:16.870 [14/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:16.870 [15/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:16.870 [16/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:17.129 [17/378] Linking static target lib/librte_log.a 00:02:17.129 [18/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:17.129 [19/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:17.412 [20/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:17.412 [21/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:17.412 [22/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:17.412 [23/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:17.412 [24/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:17.412 [25/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:17.412 [26/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.412 [27/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:17.412 [28/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:17.412 
[29/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:17.412 [30/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:17.412 [31/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:17.412 [32/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:17.412 [33/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:17.412 [34/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:17.412 [35/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:17.412 [36/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:17.412 [37/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:17.412 [38/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:17.412 [39/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:17.412 [40/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:17.412 [41/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:17.412 [42/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:17.412 [43/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:17.412 [44/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:17.412 [45/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:17.412 [46/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:17.412 [47/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:17.677 [48/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:17.677 [49/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:17.677 [50/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:17.677 [51/378] 
Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:17.677 [52/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:17.677 [53/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:17.677 [54/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:17.677 [55/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:17.677 [56/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:17.677 [57/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:17.677 [58/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:17.677 [59/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:17.677 [60/378] Linking static target lib/librte_telemetry.a 00:02:17.677 [61/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:17.677 [62/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:17.677 [63/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:17.677 [64/378] Linking static target lib/librte_ring.a 00:02:17.677 [65/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:17.677 [66/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:17.677 [67/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:17.677 [68/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:17.677 [69/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:17.677 [70/378] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:17.677 [71/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:17.677 [72/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:17.677 [73/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:17.677 
[74/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:17.677 [75/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:17.677 [76/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:17.677 [77/378] Linking static target lib/librte_pci.a 00:02:17.677 [78/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:17.677 [79/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:17.677 [80/378] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:17.677 [81/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:17.677 [82/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:17.677 [83/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:17.677 [84/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:17.677 [85/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:17.677 [86/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:17.677 [87/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:17.677 [88/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:17.677 [89/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:17.677 [90/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:17.677 [91/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:17.677 [92/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:17.677 [93/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:17.677 [94/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:17.677 [95/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:02:17.677 [96/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:17.677 [97/378] Compiling C 
object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:17.677 [98/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:17.940 [99/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:17.940 [100/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:17.940 [101/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:17.940 [102/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:17.940 [103/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:17.940 [104/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:17.940 [105/378] Linking static target lib/librte_mempool.a 00:02:17.940 [106/378] Linking static target lib/librte_net.a 00:02:17.940 [107/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:17.940 [108/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:17.940 [109/378] Linking static target lib/librte_meter.a 00:02:17.940 [110/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:17.940 [111/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:02:17.940 [112/378] Linking static target lib/librte_rcu.a 00:02:17.940 [113/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:17.940 [114/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:18.206 [115/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:18.206 [116/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.206 [117/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:18.206 [118/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:02:18.206 [119/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:18.206 [120/378] Compiling C object 
drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:18.206 [121/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:18.206 [122/378] Linking static target lib/librte_mbuf.a 00:02:18.206 [123/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:18.206 [124/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:18.206 [125/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:18.206 [126/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.206 [127/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:18.206 [128/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:18.206 [129/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:18.206 [130/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.206 [131/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:18.206 [132/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:02:18.206 [133/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:18.206 [134/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:18.206 [135/378] Linking static target lib/librte_cmdline.a 00:02:18.206 [136/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:18.206 [137/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:18.206 [138/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:18.206 [139/378] Linking static target lib/librte_timer.a 00:02:18.206 [140/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:18.206 [141/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:18.476 [142/378] Linking target lib/librte_log.so.24.1 00:02:18.476 
[143/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.476 [144/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:18.476 [145/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:18.476 [146/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:18.476 [147/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:18.476 [148/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:18.476 [149/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:18.476 [150/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:18.476 [151/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:18.476 [152/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:02:18.476 [153/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:18.476 [154/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:18.476 [155/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:18.476 [156/378] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:18.476 [157/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:18.476 [158/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:18.476 [159/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.476 [160/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:18.476 [161/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:02:18.476 [162/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.476 [163/378] Compiling C object 
drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:02:18.476 [164/378] Linking static target lib/librte_eal.a 00:02:18.476 [165/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:18.476 [166/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:18.476 [167/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:02:18.476 [168/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:18.476 [169/378] Linking static target lib/librte_compressdev.a 00:02:18.476 [170/378] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:18.476 [171/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:18.476 [172/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:18.476 [173/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.476 [174/378] Linking static target lib/librte_power.a 00:02:18.476 [175/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:18.476 [176/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:18.476 [177/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:18.476 [178/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:18.476 [179/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:18.737 [180/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:02:18.737 [181/378] Linking static target lib/librte_reorder.a 00:02:18.737 [182/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:18.737 [183/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:02:18.737 [184/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:18.737 [185/378] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:18.737 [186/378] 
Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:18.737 [187/378] Linking target lib/librte_kvargs.so.24.1 00:02:18.737 [188/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:18.737 [189/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:18.737 [190/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:18.737 [191/378] Linking target lib/librte_telemetry.so.24.1 00:02:18.737 [192/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:18.737 [193/378] Linking static target lib/librte_security.a 00:02:18.737 [194/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:18.737 [195/378] Linking static target lib/librte_dmadev.a 00:02:18.737 [196/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:02:18.997 [197/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:02:18.997 [198/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:02:18.997 [199/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:02:18.997 [200/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:18.997 [201/378] Linking static target lib/librte_hash.a 00:02:18.997 [202/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:02:18.997 [203/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:18.997 [204/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:02:18.997 [205/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:02:18.997 [206/378] Linking static target lib/librte_cryptodev.a 00:02:18.997 [207/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:02:18.997 [208/378] Compiling C object 
lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:18.997 [209/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:02:18.997 [210/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:02:18.997 [211/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:18.997 [212/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:02:18.997 [213/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:02:18.997 [214/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:18.998 [215/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:18.998 [216/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:02:18.998 [217/378] Linking static target drivers/librte_bus_auxiliary.a 00:02:18.998 [218/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:02:18.998 [219/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:02:18.998 [220/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:02:18.998 [221/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.998 [222/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:02:18.998 [223/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:18.998 [224/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:02:18.998 [225/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:18.998 [226/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:02:18.998 [227/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 
00:02:19.257 [228/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:19.257 [229/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:02:19.257 [230/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:19.257 [231/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:02:19.257 [232/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:19.257 [233/378] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:19.257 [234/378] Linking static target drivers/librte_bus_vdev.a 00:02:19.257 [235/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:02:19.257 [236/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:02:19.257 [237/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:02:19.257 [238/378] Linking static target drivers/librte_bus_pci.a 00:02:19.257 [239/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:02:19.257 [240/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:02:19.257 [241/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:19.257 [242/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:02:19.257 [243/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.257 [244/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.257 [245/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.257 [246/378] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:02:19.257 [247/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:02:19.257 [248/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:02:19.257 [249/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:02:19.257 [250/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:02:19.257 [251/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:02:19.257 [252/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:02:19.257 [253/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:02:19.257 [254/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.257 [255/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.257 [256/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:19.515 [257/378] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:19.515 [258/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.515 [259/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.515 [260/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:02:19.515 [261/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:02:19.515 [262/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:02:19.515 [263/378] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.515 [264/378] Compiling C object 
drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:02:19.515 [265/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:02:19.515 [266/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:02:19.515 [267/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:02:19.515 [268/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.515 [269/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:02:19.515 [270/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:19.774 [271/378] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:19.774 [272/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:19.774 [273/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:02:19.774 [274/378] Linking static target drivers/librte_mempool_ring.a 00:02:19.774 [275/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:02:19.774 [276/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.774 [277/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:02:19.774 [278/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:02:19.774 [279/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:02:19.774 [280/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:02:19.774 [281/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:02:19.774 [282/378] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:02:19.774 [283/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:02:19.774 [284/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:02:19.774 [285/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:02:19.774 [286/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:02:19.774 [287/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.774 [288/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:02:19.774 [289/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:02:19.774 [290/378] Linking static target drivers/libtmp_rte_common_mlx5.a 00:02:19.774 [291/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:02:19.774 [292/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:20.032 [293/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:02:20.032 [294/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:20.032 [295/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:20.032 [296/378] Linking static target drivers/librte_crypto_mlx5.a 00:02:20.032 [297/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:20.032 [298/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:02:20.032 [299/378] Linking static target lib/librte_ethdev.a 00:02:20.032 [300/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:02:20.032 [301/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:02:20.032 [302/378] Compiling C object 
drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:20.032 [303/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:20.032 [304/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:20.032 [305/378] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:20.032 [306/378] Linking static target drivers/librte_compress_mlx5.a 00:02:20.032 [307/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:20.032 [308/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:20.032 [309/378] Linking static target drivers/librte_compress_isal.a 00:02:20.032 [310/378] Linking static target drivers/librte_common_mlx5.a 00:02:20.291 [311/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:02:20.291 [312/378] Linking static target drivers/libtmp_rte_common_qat.a 00:02:20.549 [313/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:20.549 [314/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:02:20.549 [315/378] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:20.549 [316/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:20.549 [317/378] Linking static target drivers/librte_common_qat.a 00:02:20.808 [318/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:02:20.808 [319/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:02:21.067 [320/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:02:21.067 [321/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.067 [322/378] 
Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:21.067 [323/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:21.067 [324/378] Linking static target drivers/librte_crypto_ipsec_mb.a 00:02:21.634 [325/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:21.634 [326/378] Linking static target lib/librte_vhost.a 00:02:24.168 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.774 [328/378] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.062 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.965 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.965 [331/378] Linking target lib/librte_eal.so.24.1 00:02:31.965 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:02:31.965 [333/378] Linking target lib/librte_ring.so.24.1 00:02:31.965 [334/378] Linking target lib/librte_pci.so.24.1 00:02:31.965 [335/378] Linking target lib/librte_meter.so.24.1 00:02:31.965 [336/378] Linking target lib/librte_dmadev.so.24.1 00:02:31.965 [337/378] Linking target drivers/librte_bus_vdev.so.24.1 00:02:31.965 [338/378] Linking target lib/librte_timer.so.24.1 00:02:31.965 [339/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:02:32.224 [340/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:02:32.224 [341/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:02:32.224 [342/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:02:32.224 [343/378] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:02:32.224 [344/378] Generating symbol file 
lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:02:32.224 [345/378] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:02:32.224 [346/378] Linking target drivers/librte_bus_pci.so.24.1 00:02:32.224 [347/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:02:32.224 [348/378] Linking target lib/librte_rcu.so.24.1 00:02:32.224 [349/378] Linking target lib/librte_mempool.so.24.1 00:02:32.482 [350/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:02:32.482 [351/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:02:32.482 [352/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:02:32.482 [353/378] Linking target drivers/librte_mempool_ring.so.24.1 00:02:32.482 [354/378] Linking target lib/librte_mbuf.so.24.1 00:02:32.740 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:02:32.740 [356/378] Linking target lib/librte_reorder.so.24.1 00:02:32.740 [357/378] Linking target lib/librte_compressdev.so.24.1 00:02:32.740 [358/378] Linking target lib/librte_net.so.24.1 00:02:32.740 [359/378] Linking target lib/librte_cryptodev.so.24.1 00:02:32.998 [360/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:02:32.998 [361/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:02:32.998 [362/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:02:32.998 [363/378] Linking target lib/librte_security.so.24.1 00:02:32.998 [364/378] Linking target lib/librte_hash.so.24.1 00:02:32.998 [365/378] Linking target drivers/librte_compress_isal.so.24.1 00:02:32.998 [366/378] Linking target lib/librte_cmdline.so.24.1 00:02:32.998 [367/378] Linking target lib/librte_ethdev.so.24.1 00:02:33.256 [368/378] Generating symbol file 
lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:02:33.256 [369/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:02:33.256 [370/378] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:02:33.256 [371/378] Linking target drivers/librte_common_mlx5.so.24.1 00:02:33.256 [372/378] Linking target lib/librte_power.so.24.1 00:02:33.256 [373/378] Linking target lib/librte_vhost.so.24.1 00:02:33.515 [374/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:02:33.515 [375/378] Linking target drivers/librte_compress_mlx5.so.24.1 00:02:33.515 [376/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:02:33.515 [377/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:02:33.515 [378/378] Linking target drivers/librte_common_qat.so.24.1 00:02:33.515 INFO: autodetecting backend as ninja 00:02:33.515 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 72 00:02:34.892 CC lib/ut_mock/mock.o 00:02:34.893 CC lib/log/log.o 00:02:34.893 CC lib/log/log_flags.o 00:02:34.893 CC lib/log/log_deprecated.o 00:02:34.893 CC lib/ut/ut.o 00:02:35.152 LIB libspdk_log.a 00:02:35.152 LIB libspdk_ut.a 00:02:35.152 LIB libspdk_ut_mock.a 00:02:35.152 SO libspdk_log.so.7.0 00:02:35.152 SO libspdk_ut_mock.so.6.0 00:02:35.152 SO libspdk_ut.so.2.0 00:02:35.152 SYMLINK libspdk_ut_mock.so 00:02:35.152 SYMLINK libspdk_log.so 00:02:35.152 SYMLINK libspdk_ut.so 00:02:35.721 CXX lib/trace_parser/trace.o 00:02:35.721 CC lib/dma/dma.o 00:02:35.721 CC lib/util/bit_array.o 00:02:35.721 CC lib/util/base64.o 00:02:35.721 CC lib/util/cpuset.o 00:02:35.721 CC lib/util/crc16.o 00:02:35.721 CC lib/util/crc32.o 00:02:35.721 CC lib/ioat/ioat.o 00:02:35.721 CC lib/util/crc32c.o 00:02:35.721 CC lib/util/crc32_ieee.o 00:02:35.721 CC lib/util/crc64.o 00:02:35.721 CC lib/util/dif.o 00:02:35.721 CC 
lib/util/fd_group.o 00:02:35.721 CC lib/util/fd.o 00:02:35.721 CC lib/util/file.o 00:02:35.721 CC lib/util/hexlify.o 00:02:35.721 CC lib/util/iov.o 00:02:35.721 CC lib/util/math.o 00:02:35.721 CC lib/util/net.o 00:02:35.721 CC lib/util/pipe.o 00:02:35.721 CC lib/util/strerror_tls.o 00:02:35.721 CC lib/util/string.o 00:02:35.721 CC lib/util/uuid.o 00:02:35.721 CC lib/util/xor.o 00:02:35.721 CC lib/util/zipf.o 00:02:35.721 CC lib/vfio_user/host/vfio_user.o 00:02:35.721 CC lib/vfio_user/host/vfio_user_pci.o 00:02:35.721 LIB libspdk_dma.a 00:02:35.721 SO libspdk_dma.so.4.0 00:02:35.980 SYMLINK libspdk_dma.so 00:02:35.980 LIB libspdk_ioat.a 00:02:35.980 SO libspdk_ioat.so.7.0 00:02:35.980 LIB libspdk_vfio_user.a 00:02:35.980 SO libspdk_vfio_user.so.5.0 00:02:35.980 SYMLINK libspdk_ioat.so 00:02:35.980 SYMLINK libspdk_vfio_user.so 00:02:36.239 LIB libspdk_util.a 00:02:36.239 SO libspdk_util.so.9.1 00:02:36.498 SYMLINK libspdk_util.so 00:02:36.498 LIB libspdk_trace_parser.a 00:02:36.498 SO libspdk_trace_parser.so.5.0 00:02:36.755 SYMLINK libspdk_trace_parser.so 00:02:36.755 CC lib/conf/conf.o 00:02:36.755 CC lib/rdma_provider/common.o 00:02:36.755 CC lib/rdma_provider/rdma_provider_verbs.o 00:02:36.755 CC lib/reduce/reduce.o 00:02:36.755 CC lib/idxd/idxd.o 00:02:36.755 CC lib/idxd/idxd_user.o 00:02:36.755 CC lib/vmd/vmd.o 00:02:36.755 CC lib/env_dpdk/env.o 00:02:36.755 CC lib/idxd/idxd_kernel.o 00:02:36.755 CC lib/rdma_utils/rdma_utils.o 00:02:36.755 CC lib/json/json_parse.o 00:02:36.755 CC lib/vmd/led.o 00:02:36.755 CC lib/json/json_util.o 00:02:36.755 CC lib/env_dpdk/memory.o 00:02:36.755 CC lib/json/json_write.o 00:02:36.755 CC lib/env_dpdk/pci.o 00:02:36.755 CC lib/env_dpdk/init.o 00:02:36.755 CC lib/env_dpdk/threads.o 00:02:36.755 CC lib/env_dpdk/pci_ioat.o 00:02:36.755 CC lib/env_dpdk/pci_virtio.o 00:02:36.755 CC lib/env_dpdk/pci_idxd.o 00:02:36.755 CC lib/env_dpdk/pci_vmd.o 00:02:36.755 CC lib/env_dpdk/pci_event.o 00:02:36.755 CC lib/env_dpdk/sigbus_handler.o 
00:02:36.755 CC lib/env_dpdk/pci_dpdk.o 00:02:36.755 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:36.755 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:37.013 LIB libspdk_rdma_provider.a 00:02:37.013 LIB libspdk_conf.a 00:02:37.013 SO libspdk_rdma_provider.so.6.0 00:02:37.013 SO libspdk_conf.so.6.0 00:02:37.271 LIB libspdk_json.a 00:02:37.271 SYMLINK libspdk_rdma_provider.so 00:02:37.271 SYMLINK libspdk_conf.so 00:02:37.271 LIB libspdk_rdma_utils.a 00:02:37.271 SO libspdk_json.so.6.0 00:02:37.271 SO libspdk_rdma_utils.so.1.0 00:02:37.271 SYMLINK libspdk_json.so 00:02:37.271 SYMLINK libspdk_rdma_utils.so 00:02:37.271 LIB libspdk_idxd.a 00:02:37.530 SO libspdk_idxd.so.12.0 00:02:37.530 LIB libspdk_reduce.a 00:02:37.530 SYMLINK libspdk_idxd.so 00:02:37.530 SO libspdk_reduce.so.6.0 00:02:37.530 SYMLINK libspdk_reduce.so 00:02:37.530 CC lib/jsonrpc/jsonrpc_server.o 00:02:37.530 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:37.530 CC lib/jsonrpc/jsonrpc_client.o 00:02:37.530 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:37.788 LIB libspdk_jsonrpc.a 00:02:38.046 SO libspdk_jsonrpc.so.6.0 00:02:38.046 SYMLINK libspdk_jsonrpc.so 00:02:38.046 LIB libspdk_vmd.a 00:02:38.305 SO libspdk_vmd.so.6.0 00:02:38.305 LIB libspdk_env_dpdk.a 00:02:38.305 SYMLINK libspdk_vmd.so 00:02:38.305 SO libspdk_env_dpdk.so.14.1 00:02:38.305 CC lib/rpc/rpc.o 00:02:38.563 SYMLINK libspdk_env_dpdk.so 00:02:38.563 LIB libspdk_rpc.a 00:02:38.822 SO libspdk_rpc.so.6.0 00:02:38.822 SYMLINK libspdk_rpc.so 00:02:39.082 CC lib/trace/trace.o 00:02:39.082 CC lib/trace/trace_flags.o 00:02:39.082 CC lib/trace/trace_rpc.o 00:02:39.082 CC lib/notify/notify.o 00:02:39.082 CC lib/notify/notify_rpc.o 00:02:39.082 CC lib/keyring/keyring.o 00:02:39.082 CC lib/keyring/keyring_rpc.o 00:02:39.341 LIB libspdk_notify.a 00:02:39.341 LIB libspdk_trace.a 00:02:39.341 SO libspdk_notify.so.6.0 00:02:39.600 SO libspdk_trace.so.10.0 00:02:39.600 SYMLINK libspdk_notify.so 00:02:39.600 SYMLINK libspdk_trace.so 00:02:39.600 LIB libspdk_keyring.a 
00:02:39.859 SO libspdk_keyring.so.1.0 00:02:39.859 SYMLINK libspdk_keyring.so 00:02:39.859 CC lib/thread/thread.o 00:02:39.859 CC lib/thread/iobuf.o 00:02:39.859 CC lib/sock/sock.o 00:02:39.859 CC lib/sock/sock_rpc.o 00:02:40.426 LIB libspdk_sock.a 00:02:40.426 SO libspdk_sock.so.10.0 00:02:40.426 SYMLINK libspdk_sock.so 00:02:40.994 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:40.994 CC lib/nvme/nvme_ctrlr.o 00:02:40.994 CC lib/nvme/nvme_fabric.o 00:02:40.994 CC lib/nvme/nvme_ns_cmd.o 00:02:40.994 CC lib/nvme/nvme_ns.o 00:02:40.994 CC lib/nvme/nvme_pcie_common.o 00:02:40.994 CC lib/nvme/nvme_pcie.o 00:02:40.994 CC lib/nvme/nvme_qpair.o 00:02:40.994 CC lib/nvme/nvme.o 00:02:40.994 CC lib/nvme/nvme_transport.o 00:02:40.994 CC lib/nvme/nvme_quirks.o 00:02:40.994 CC lib/nvme/nvme_discovery.o 00:02:40.994 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:40.994 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:40.994 CC lib/nvme/nvme_tcp.o 00:02:40.994 CC lib/nvme/nvme_opal.o 00:02:40.994 CC lib/nvme/nvme_io_msg.o 00:02:40.994 CC lib/nvme/nvme_poll_group.o 00:02:40.994 CC lib/nvme/nvme_zns.o 00:02:40.994 CC lib/nvme/nvme_stubs.o 00:02:40.994 CC lib/nvme/nvme_auth.o 00:02:40.994 CC lib/nvme/nvme_cuse.o 00:02:40.994 CC lib/nvme/nvme_rdma.o 00:02:41.613 LIB libspdk_thread.a 00:02:41.613 SO libspdk_thread.so.10.1 00:02:41.613 SYMLINK libspdk_thread.so 00:02:41.877 CC lib/blob/blobstore.o 00:02:41.877 CC lib/blob/zeroes.o 00:02:41.877 CC lib/blob/request.o 00:02:41.877 CC lib/blob/blob_bs_dev.o 00:02:41.877 CC lib/accel/accel.o 00:02:41.877 CC lib/accel/accel_rpc.o 00:02:41.877 CC lib/accel/accel_sw.o 00:02:41.877 CC lib/virtio/virtio.o 00:02:41.877 CC lib/virtio/virtio_vhost_user.o 00:02:41.877 CC lib/virtio/virtio_vfio_user.o 00:02:41.877 CC lib/init/subsystem.o 00:02:41.877 CC lib/init/json_config.o 00:02:41.877 CC lib/virtio/virtio_pci.o 00:02:41.877 CC lib/init/subsystem_rpc.o 00:02:42.133 CC lib/init/rpc.o 00:02:42.390 LIB libspdk_init.a 00:02:42.391 SO libspdk_init.so.5.0 00:02:42.391 LIB 
libspdk_virtio.a 00:02:42.391 SYMLINK libspdk_init.so 00:02:42.391 SO libspdk_virtio.so.7.0 00:02:42.391 SYMLINK libspdk_virtio.so 00:02:42.647 LIB libspdk_accel.a 00:02:42.647 CC lib/event/app.o 00:02:42.647 CC lib/event/reactor.o 00:02:42.647 CC lib/event/log_rpc.o 00:02:42.647 CC lib/event/app_rpc.o 00:02:42.647 CC lib/event/scheduler_static.o 00:02:42.905 SO libspdk_accel.so.15.1 00:02:42.905 SYMLINK libspdk_accel.so 00:02:43.163 LIB libspdk_event.a 00:02:43.163 CC lib/bdev/bdev.o 00:02:43.163 CC lib/bdev/bdev_rpc.o 00:02:43.163 CC lib/bdev/bdev_zone.o 00:02:43.164 CC lib/bdev/part.o 00:02:43.164 CC lib/bdev/scsi_nvme.o 00:02:43.164 LIB libspdk_nvme.a 00:02:43.164 SO libspdk_event.so.14.0 00:02:43.421 SYMLINK libspdk_event.so 00:02:43.421 SO libspdk_nvme.so.13.1 00:02:43.679 SYMLINK libspdk_nvme.so 00:02:45.053 LIB libspdk_blob.a 00:02:45.053 SO libspdk_blob.so.11.0 00:02:45.053 SYMLINK libspdk_blob.so 00:02:45.619 CC lib/blobfs/blobfs.o 00:02:45.619 CC lib/blobfs/tree.o 00:02:45.619 CC lib/lvol/lvol.o 00:02:45.878 LIB libspdk_bdev.a 00:02:45.878 SO libspdk_bdev.so.15.1 00:02:46.136 SYMLINK libspdk_bdev.so 00:02:46.400 LIB libspdk_blobfs.a 00:02:46.400 SO libspdk_blobfs.so.10.0 00:02:46.400 CC lib/ftl/ftl_core.o 00:02:46.400 CC lib/ftl/ftl_debug.o 00:02:46.400 CC lib/ftl/ftl_init.o 00:02:46.400 CC lib/ftl/ftl_layout.o 00:02:46.400 CC lib/ftl/ftl_sb.o 00:02:46.400 CC lib/ftl/ftl_io.o 00:02:46.400 CC lib/ftl/ftl_l2p.o 00:02:46.400 CC lib/ftl/ftl_l2p_flat.o 00:02:46.400 CC lib/ftl/ftl_band.o 00:02:46.400 CC lib/ftl/ftl_nv_cache.o 00:02:46.400 CC lib/ftl/ftl_band_ops.o 00:02:46.400 CC lib/ftl/ftl_writer.o 00:02:46.400 CC lib/ftl/ftl_rq.o 00:02:46.400 CC lib/ftl/ftl_reloc.o 00:02:46.400 LIB libspdk_lvol.a 00:02:46.400 CC lib/ftl/ftl_l2p_cache.o 00:02:46.400 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:46.400 CC lib/ftl/ftl_p2l.o 00:02:46.400 CC lib/ftl/mngt/ftl_mngt.o 00:02:46.400 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:46.400 CC lib/ftl/mngt/ftl_mngt_bdev.o 
00:02:46.400 CC lib/nbd/nbd.o 00:02:46.400 CC lib/scsi/dev.o 00:02:46.400 CC lib/scsi/lun.o 00:02:46.400 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:46.400 CC lib/ublk/ublk.o 00:02:46.400 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:46.400 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:46.400 CC lib/scsi/port.o 00:02:46.400 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:46.400 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:46.400 CC lib/nbd/nbd_rpc.o 00:02:46.400 CC lib/scsi/scsi.o 00:02:46.400 CC lib/ublk/ublk_rpc.o 00:02:46.400 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:46.400 CC lib/nvmf/ctrlr_discovery.o 00:02:46.400 CC lib/scsi/scsi_pr.o 00:02:46.400 CC lib/nvmf/ctrlr.o 00:02:46.400 CC lib/scsi/scsi_bdev.o 00:02:46.400 CC lib/scsi/scsi_rpc.o 00:02:46.400 CC lib/nvmf/ctrlr_bdev.o 00:02:46.400 CC lib/scsi/task.o 00:02:46.400 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:46.400 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:46.400 CC lib/nvmf/subsystem.o 00:02:46.401 CC lib/nvmf/nvmf.o 00:02:46.401 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:46.401 CC lib/ftl/utils/ftl_conf.o 00:02:46.401 CC lib/nvmf/nvmf_rpc.o 00:02:46.401 CC lib/nvmf/transport.o 00:02:46.401 CC lib/nvmf/tcp.o 00:02:46.401 CC lib/ftl/utils/ftl_mempool.o 00:02:46.401 CC lib/ftl/utils/ftl_bitmap.o 00:02:46.401 CC lib/nvmf/stubs.o 00:02:46.401 CC lib/nvmf/mdns_server.o 00:02:46.401 CC lib/ftl/utils/ftl_md.o 00:02:46.401 CC lib/ftl/utils/ftl_property.o 00:02:46.401 CC lib/nvmf/rdma.o 00:02:46.401 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:46.401 CC lib/nvmf/auth.o 00:02:46.401 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:46.401 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:46.401 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:46.401 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:46.401 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:46.401 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:02:46.401 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:46.401 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:46.401 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:46.401 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:46.401 
CC lib/ftl/base/ftl_base_dev.o 00:02:46.401 SYMLINK libspdk_blobfs.so 00:02:46.401 CC lib/ftl/base/ftl_base_bdev.o 00:02:46.401 SO libspdk_lvol.so.10.0 00:02:46.661 SYMLINK libspdk_lvol.so 00:02:46.661 CC lib/ftl/ftl_trace.o 00:02:47.229 LIB libspdk_scsi.a 00:02:47.229 SO libspdk_scsi.so.9.0 00:02:47.229 LIB libspdk_ublk.a 00:02:47.229 SO libspdk_ublk.so.3.0 00:02:47.487 SYMLINK libspdk_scsi.so 00:02:47.487 SYMLINK libspdk_ublk.so 00:02:47.487 LIB libspdk_nbd.a 00:02:47.745 SO libspdk_nbd.so.7.0 00:02:47.745 LIB libspdk_ftl.a 00:02:47.745 SYMLINK libspdk_nbd.so 00:02:47.745 CC lib/iscsi/conn.o 00:02:47.745 CC lib/iscsi/iscsi.o 00:02:47.745 CC lib/iscsi/init_grp.o 00:02:47.745 CC lib/iscsi/md5.o 00:02:47.745 CC lib/iscsi/param.o 00:02:47.745 CC lib/iscsi/portal_grp.o 00:02:47.745 CC lib/iscsi/iscsi_subsystem.o 00:02:47.745 CC lib/iscsi/tgt_node.o 00:02:47.745 CC lib/vhost/vhost.o 00:02:47.745 CC lib/vhost/vhost_rpc.o 00:02:47.745 CC lib/iscsi/iscsi_rpc.o 00:02:47.745 CC lib/vhost/vhost_scsi.o 00:02:47.745 CC lib/iscsi/task.o 00:02:47.745 CC lib/vhost/vhost_blk.o 00:02:47.745 CC lib/vhost/rte_vhost_user.o 00:02:48.002 SO libspdk_ftl.so.9.0 00:02:48.260 SYMLINK libspdk_ftl.so 00:02:48.826 LIB libspdk_nvmf.a 00:02:48.826 SO libspdk_nvmf.so.19.0 00:02:48.826 LIB libspdk_vhost.a 00:02:48.826 SO libspdk_vhost.so.8.0 00:02:49.084 SYMLINK libspdk_vhost.so 00:02:49.084 SYMLINK libspdk_nvmf.so 00:02:49.084 LIB libspdk_iscsi.a 00:02:49.342 SO libspdk_iscsi.so.8.0 00:02:49.599 SYMLINK libspdk_iscsi.so 00:02:50.165 CC module/env_dpdk/env_dpdk_rpc.o 00:02:50.165 LIB libspdk_env_dpdk_rpc.a 00:02:50.165 CC module/accel/ioat/accel_ioat.o 00:02:50.165 CC module/sock/posix/posix.o 00:02:50.165 CC module/accel/ioat/accel_ioat_rpc.o 00:02:50.165 CC module/keyring/file/keyring.o 00:02:50.165 CC module/keyring/file/keyring_rpc.o 00:02:50.165 CC module/blob/bdev/blob_bdev.o 00:02:50.165 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:50.165 CC module/accel/dsa/accel_dsa.o 
00:02:50.165 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:02:50.165 CC module/accel/dsa/accel_dsa_rpc.o 00:02:50.165 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:50.165 CC module/accel/error/accel_error_rpc.o 00:02:50.165 CC module/accel/error/accel_error.o 00:02:50.165 CC module/scheduler/gscheduler/gscheduler.o 00:02:50.165 CC module/accel/iaa/accel_iaa_rpc.o 00:02:50.165 CC module/accel/iaa/accel_iaa.o 00:02:50.165 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:50.165 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:50.165 CC module/keyring/linux/keyring.o 00:02:50.165 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:50.165 CC module/keyring/linux/keyring_rpc.o 00:02:50.165 SO libspdk_env_dpdk_rpc.so.6.0 00:02:50.423 SYMLINK libspdk_env_dpdk_rpc.so 00:02:50.423 LIB libspdk_keyring_linux.a 00:02:50.423 LIB libspdk_scheduler_dpdk_governor.a 00:02:50.423 LIB libspdk_keyring_file.a 00:02:50.423 LIB libspdk_scheduler_gscheduler.a 00:02:50.423 LIB libspdk_accel_ioat.a 00:02:50.423 SO libspdk_scheduler_dpdk_governor.so.4.0 00:02:50.423 SO libspdk_keyring_linux.so.1.0 00:02:50.423 SO libspdk_scheduler_gscheduler.so.4.0 00:02:50.423 LIB libspdk_accel_error.a 00:02:50.423 SO libspdk_accel_ioat.so.6.0 00:02:50.423 SO libspdk_keyring_file.so.1.0 00:02:50.423 LIB libspdk_accel_iaa.a 00:02:50.423 LIB libspdk_scheduler_dynamic.a 00:02:50.680 LIB libspdk_accel_dsa.a 00:02:50.680 SO libspdk_accel_error.so.2.0 00:02:50.680 LIB libspdk_blob_bdev.a 00:02:50.680 SYMLINK libspdk_keyring_linux.so 00:02:50.680 SO libspdk_accel_iaa.so.3.0 00:02:50.680 SO libspdk_scheduler_dynamic.so.4.0 00:02:50.680 SYMLINK libspdk_scheduler_gscheduler.so 00:02:50.680 SYMLINK libspdk_accel_ioat.so 00:02:50.680 SYMLINK libspdk_keyring_file.so 00:02:50.680 SO libspdk_blob_bdev.so.11.0 00:02:50.680 SO libspdk_accel_dsa.so.5.0 00:02:50.680 SYMLINK libspdk_scheduler_dpdk_governor.so 00:02:50.680 SYMLINK libspdk_accel_error.so 
00:02:50.680 SYMLINK libspdk_accel_iaa.so 00:02:50.680 SYMLINK libspdk_scheduler_dynamic.so 00:02:50.680 SYMLINK libspdk_blob_bdev.so 00:02:50.680 SYMLINK libspdk_accel_dsa.so 00:02:50.938 LIB libspdk_sock_posix.a 00:02:51.195 SO libspdk_sock_posix.so.6.0 00:02:51.195 SYMLINK libspdk_sock_posix.so 00:02:51.195 CC module/bdev/compress/vbdev_compress.o 00:02:51.195 CC module/bdev/compress/vbdev_compress_rpc.o 00:02:51.195 CC module/bdev/gpt/gpt.o 00:02:51.195 CC module/bdev/error/vbdev_error.o 00:02:51.195 CC module/bdev/error/vbdev_error_rpc.o 00:02:51.195 CC module/bdev/raid/bdev_raid.o 00:02:51.195 CC module/bdev/raid/bdev_raid_rpc.o 00:02:51.195 CC module/bdev/gpt/vbdev_gpt.o 00:02:51.195 CC module/bdev/raid/bdev_raid_sb.o 00:02:51.195 CC module/bdev/raid/raid0.o 00:02:51.195 CC module/bdev/raid/raid1.o 00:02:51.195 CC module/bdev/raid/concat.o 00:02:51.195 CC module/bdev/null/bdev_null_rpc.o 00:02:51.195 CC module/bdev/null/bdev_null.o 00:02:51.195 CC module/bdev/aio/bdev_aio.o 00:02:51.195 CC module/bdev/aio/bdev_aio_rpc.o 00:02:51.195 CC module/bdev/split/vbdev_split.o 00:02:51.195 CC module/bdev/malloc/bdev_malloc.o 00:02:51.196 CC module/bdev/passthru/vbdev_passthru.o 00:02:51.196 CC module/bdev/split/vbdev_split_rpc.o 00:02:51.196 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:51.196 CC module/bdev/nvme/bdev_nvme.o 00:02:51.196 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:51.196 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:51.196 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:02:51.196 CC module/bdev/crypto/vbdev_crypto.o 00:02:51.196 CC module/bdev/nvme/bdev_mdns_client.o 00:02:51.196 CC module/bdev/nvme/nvme_rpc.o 00:02:51.196 CC module/bdev/nvme/vbdev_opal.o 00:02:51.196 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:51.196 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:51.196 CC module/bdev/lvol/vbdev_lvol.o 00:02:51.196 CC module/bdev/delay/vbdev_delay.o 00:02:51.196 CC module/bdev/iscsi/bdev_iscsi.o 00:02:51.196 CC module/bdev/lvol/vbdev_lvol_rpc.o 
00:02:51.196 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:51.196 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:51.196 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:51.196 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:51.196 CC module/blobfs/bdev/blobfs_bdev.o 00:02:51.196 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:51.196 CC module/bdev/ftl/bdev_ftl.o 00:02:51.196 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:51.196 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:51.196 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:51.196 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:51.454 LIB libspdk_accel_dpdk_compressdev.a 00:02:51.454 LIB libspdk_blobfs_bdev.a 00:02:51.454 SO libspdk_accel_dpdk_compressdev.so.3.0 00:02:51.454 SO libspdk_blobfs_bdev.so.6.0 00:02:51.454 LIB libspdk_bdev_error.a 00:02:51.454 LIB libspdk_bdev_passthru.a 00:02:51.454 LIB libspdk_bdev_aio.a 00:02:51.712 SYMLINK libspdk_blobfs_bdev.so 00:02:51.712 LIB libspdk_bdev_null.a 00:02:51.712 SO libspdk_bdev_error.so.6.0 00:02:51.712 SO libspdk_bdev_passthru.so.6.0 00:02:51.712 SYMLINK libspdk_accel_dpdk_compressdev.so 00:02:51.712 SO libspdk_bdev_aio.so.6.0 00:02:51.712 LIB libspdk_bdev_crypto.a 00:02:51.712 LIB libspdk_bdev_split.a 00:02:51.712 SO libspdk_bdev_null.so.6.0 00:02:51.712 LIB libspdk_bdev_delay.a 00:02:51.712 SO libspdk_bdev_crypto.so.6.0 00:02:51.712 SYMLINK libspdk_bdev_error.so 00:02:51.712 LIB libspdk_bdev_malloc.a 00:02:51.712 SYMLINK libspdk_bdev_passthru.so 00:02:51.712 SO libspdk_bdev_split.so.6.0 00:02:51.712 SYMLINK libspdk_bdev_aio.so 00:02:51.712 SO libspdk_bdev_delay.so.6.0 00:02:51.712 LIB libspdk_bdev_zone_block.a 00:02:51.712 SO libspdk_bdev_malloc.so.6.0 00:02:51.712 SYMLINK libspdk_bdev_null.so 00:02:51.712 SYMLINK libspdk_bdev_split.so 00:02:51.712 LIB libspdk_bdev_compress.a 00:02:51.712 SO libspdk_bdev_zone_block.so.6.0 00:02:51.712 SYMLINK libspdk_bdev_crypto.so 00:02:51.712 LIB libspdk_accel_dpdk_cryptodev.a 00:02:51.712 SYMLINK libspdk_bdev_delay.so 00:02:51.712 
SO libspdk_bdev_compress.so.6.0 00:02:51.712 SYMLINK libspdk_bdev_malloc.so 00:02:51.712 LIB libspdk_bdev_lvol.a 00:02:51.712 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:02:51.712 SYMLINK libspdk_bdev_zone_block.so 00:02:51.712 LIB libspdk_bdev_iscsi.a 00:02:51.712 SO libspdk_bdev_lvol.so.6.0 00:02:51.712 LIB libspdk_bdev_gpt.a 00:02:51.970 SO libspdk_bdev_iscsi.so.6.0 00:02:51.970 SYMLINK libspdk_bdev_compress.so 00:02:51.970 SO libspdk_bdev_gpt.so.6.0 00:02:51.970 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:02:51.970 SYMLINK libspdk_bdev_lvol.so 00:02:51.970 LIB libspdk_bdev_virtio.a 00:02:51.970 SYMLINK libspdk_bdev_iscsi.so 00:02:51.970 SYMLINK libspdk_bdev_gpt.so 00:02:51.970 SO libspdk_bdev_virtio.so.6.0 00:02:51.970 LIB libspdk_bdev_ftl.a 00:02:51.970 SO libspdk_bdev_ftl.so.6.0 00:02:51.970 SYMLINK libspdk_bdev_virtio.so 00:02:52.228 SYMLINK libspdk_bdev_ftl.so 00:02:52.228 LIB libspdk_bdev_raid.a 00:02:52.487 SO libspdk_bdev_raid.so.6.0 00:02:52.746 SYMLINK libspdk_bdev_raid.so 00:02:53.685 LIB libspdk_bdev_nvme.a 00:02:53.685 SO libspdk_bdev_nvme.so.7.0 00:02:53.944 SYMLINK libspdk_bdev_nvme.so 00:02:54.512 CC module/event/subsystems/keyring/keyring.o 00:02:54.512 CC module/event/subsystems/iobuf/iobuf.o 00:02:54.512 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:54.512 CC module/event/subsystems/vmd/vmd.o 00:02:54.512 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:54.512 CC module/event/subsystems/sock/sock.o 00:02:54.512 CC module/event/subsystems/scheduler/scheduler.o 00:02:54.512 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:54.769 LIB libspdk_event_vhost_blk.a 00:02:54.769 LIB libspdk_event_iobuf.a 00:02:54.769 LIB libspdk_event_scheduler.a 00:02:54.769 LIB libspdk_event_vmd.a 00:02:54.769 LIB libspdk_event_sock.a 00:02:54.769 SO libspdk_event_vhost_blk.so.3.0 00:02:54.769 SO libspdk_event_scheduler.so.4.0 00:02:54.769 SO libspdk_event_iobuf.so.3.0 00:02:54.769 SO libspdk_event_vmd.so.6.0 00:02:54.769 SO libspdk_event_sock.so.5.0 
00:02:54.769 LIB libspdk_event_keyring.a 00:02:54.769 SYMLINK libspdk_event_vhost_blk.so 00:02:54.769 SYMLINK libspdk_event_scheduler.so 00:02:55.028 SYMLINK libspdk_event_iobuf.so 00:02:55.028 SYMLINK libspdk_event_vmd.so 00:02:55.028 SYMLINK libspdk_event_sock.so 00:02:55.028 SO libspdk_event_keyring.so.1.0 00:02:55.028 SYMLINK libspdk_event_keyring.so 00:02:55.287 CC module/event/subsystems/accel/accel.o 00:02:55.545 LIB libspdk_event_accel.a 00:02:55.545 SO libspdk_event_accel.so.6.0 00:02:55.545 SYMLINK libspdk_event_accel.so 00:02:55.804 CC module/event/subsystems/bdev/bdev.o 00:02:56.063 LIB libspdk_event_bdev.a 00:02:56.322 SO libspdk_event_bdev.so.6.0 00:02:56.322 SYMLINK libspdk_event_bdev.so 00:02:56.635 CC module/event/subsystems/scsi/scsi.o 00:02:56.635 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:56.635 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:56.635 CC module/event/subsystems/nbd/nbd.o 00:02:56.635 CC module/event/subsystems/ublk/ublk.o 00:02:56.892 LIB libspdk_event_ublk.a 00:02:56.892 LIB libspdk_event_scsi.a 00:02:56.892 LIB libspdk_event_nbd.a 00:02:56.892 SO libspdk_event_ublk.so.3.0 00:02:56.892 SO libspdk_event_scsi.so.6.0 00:02:56.892 SO libspdk_event_nbd.so.6.0 00:02:56.892 LIB libspdk_event_nvmf.a 00:02:56.892 SYMLINK libspdk_event_ublk.so 00:02:56.892 SO libspdk_event_nvmf.so.6.0 00:02:56.892 SYMLINK libspdk_event_nbd.so 00:02:56.892 SYMLINK libspdk_event_scsi.so 00:02:57.150 SYMLINK libspdk_event_nvmf.so 00:02:57.409 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:57.409 CC module/event/subsystems/iscsi/iscsi.o 00:02:57.668 LIB libspdk_event_vhost_scsi.a 00:02:57.668 LIB libspdk_event_iscsi.a 00:02:57.668 SO libspdk_event_vhost_scsi.so.3.0 00:02:57.668 SO libspdk_event_iscsi.so.6.0 00:02:57.668 SYMLINK libspdk_event_vhost_scsi.so 00:02:57.668 SYMLINK libspdk_event_iscsi.so 00:02:57.927 SO libspdk.so.6.0 00:02:57.927 SYMLINK libspdk.so 00:02:58.186 CXX app/trace/trace.o 00:02:58.186 TEST_HEADER include/spdk/accel.h 
00:02:58.186 TEST_HEADER include/spdk/accel_module.h 00:02:58.186 TEST_HEADER include/spdk/assert.h 00:02:58.186 TEST_HEADER include/spdk/base64.h 00:02:58.186 TEST_HEADER include/spdk/bdev.h 00:02:58.186 TEST_HEADER include/spdk/barrier.h 00:02:58.186 CC app/spdk_nvme_discover/discovery_aer.o 00:02:58.186 TEST_HEADER include/spdk/bdev_module.h 00:02:58.186 TEST_HEADER include/spdk/bdev_zone.h 00:02:58.186 TEST_HEADER include/spdk/bit_array.h 00:02:58.186 TEST_HEADER include/spdk/bit_pool.h 00:02:58.186 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:58.186 CC app/trace_record/trace_record.o 00:02:58.186 CC app/spdk_nvme_identify/identify.o 00:02:58.186 TEST_HEADER include/spdk/blobfs.h 00:02:58.186 TEST_HEADER include/spdk/blob_bdev.h 00:02:58.186 CC app/spdk_lspci/spdk_lspci.o 00:02:58.186 TEST_HEADER include/spdk/conf.h 00:02:58.186 TEST_HEADER include/spdk/blob.h 00:02:58.186 TEST_HEADER include/spdk/config.h 00:02:58.186 TEST_HEADER include/spdk/cpuset.h 00:02:58.186 TEST_HEADER include/spdk/crc16.h 00:02:58.186 TEST_HEADER include/spdk/crc64.h 00:02:58.186 TEST_HEADER include/spdk/crc32.h 00:02:58.186 TEST_HEADER include/spdk/dif.h 00:02:58.186 TEST_HEADER include/spdk/dma.h 00:02:58.186 CC app/spdk_top/spdk_top.o 00:02:58.186 TEST_HEADER include/spdk/endian.h 00:02:58.186 TEST_HEADER include/spdk/env_dpdk.h 00:02:58.186 TEST_HEADER include/spdk/env.h 00:02:58.186 TEST_HEADER include/spdk/event.h 00:02:58.186 CC test/rpc_client/rpc_client_test.o 00:02:58.186 TEST_HEADER include/spdk/fd_group.h 00:02:58.186 TEST_HEADER include/spdk/file.h 00:02:58.186 TEST_HEADER include/spdk/ftl.h 00:02:58.186 TEST_HEADER include/spdk/fd.h 00:02:58.186 CC app/spdk_nvme_perf/perf.o 00:02:58.186 TEST_HEADER include/spdk/gpt_spec.h 00:02:58.186 TEST_HEADER include/spdk/hexlify.h 00:02:58.186 TEST_HEADER include/spdk/histogram_data.h 00:02:58.186 TEST_HEADER include/spdk/idxd.h 00:02:58.186 TEST_HEADER include/spdk/idxd_spec.h 00:02:58.186 TEST_HEADER include/spdk/ioat.h 
00:02:58.186 TEST_HEADER include/spdk/ioat_spec.h 00:02:58.186 TEST_HEADER include/spdk/init.h 00:02:58.186 TEST_HEADER include/spdk/iscsi_spec.h 00:02:58.186 TEST_HEADER include/spdk/json.h 00:02:58.186 TEST_HEADER include/spdk/jsonrpc.h 00:02:58.186 TEST_HEADER include/spdk/keyring.h 00:02:58.186 TEST_HEADER include/spdk/keyring_module.h 00:02:58.186 TEST_HEADER include/spdk/log.h 00:02:58.186 TEST_HEADER include/spdk/likely.h 00:02:58.186 TEST_HEADER include/spdk/lvol.h 00:02:58.186 TEST_HEADER include/spdk/memory.h 00:02:58.186 TEST_HEADER include/spdk/mmio.h 00:02:58.186 TEST_HEADER include/spdk/nbd.h 00:02:58.186 TEST_HEADER include/spdk/net.h 00:02:58.186 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:58.186 TEST_HEADER include/spdk/notify.h 00:02:58.454 TEST_HEADER include/spdk/nvme.h 00:02:58.454 TEST_HEADER include/spdk/nvme_intel.h 00:02:58.454 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:58.454 CC app/spdk_dd/spdk_dd.o 00:02:58.454 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:58.454 TEST_HEADER include/spdk/nvme_spec.h 00:02:58.454 TEST_HEADER include/spdk/nvme_zns.h 00:02:58.454 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:58.454 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:58.454 TEST_HEADER include/spdk/nvmf_spec.h 00:02:58.454 TEST_HEADER include/spdk/nvmf.h 00:02:58.454 TEST_HEADER include/spdk/nvmf_transport.h 00:02:58.454 TEST_HEADER include/spdk/opal_spec.h 00:02:58.454 TEST_HEADER include/spdk/pci_ids.h 00:02:58.454 TEST_HEADER include/spdk/opal.h 00:02:58.454 TEST_HEADER include/spdk/pipe.h 00:02:58.454 TEST_HEADER include/spdk/queue.h 00:02:58.454 TEST_HEADER include/spdk/reduce.h 00:02:58.454 TEST_HEADER include/spdk/rpc.h 00:02:58.454 TEST_HEADER include/spdk/scheduler.h 00:02:58.454 TEST_HEADER include/spdk/scsi.h 00:02:58.454 TEST_HEADER include/spdk/sock.h 00:02:58.454 TEST_HEADER include/spdk/stdinc.h 00:02:58.454 TEST_HEADER include/spdk/string.h 00:02:58.454 TEST_HEADER include/spdk/scsi_spec.h 00:02:58.454 TEST_HEADER 
include/spdk/thread.h 00:02:58.454 TEST_HEADER include/spdk/trace_parser.h 00:02:58.454 TEST_HEADER include/spdk/tree.h 00:02:58.454 TEST_HEADER include/spdk/ublk.h 00:02:58.454 TEST_HEADER include/spdk/util.h 00:02:58.454 TEST_HEADER include/spdk/trace.h 00:02:58.454 TEST_HEADER include/spdk/uuid.h 00:02:58.454 TEST_HEADER include/spdk/version.h 00:02:58.454 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:58.454 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:58.454 TEST_HEADER include/spdk/vhost.h 00:02:58.454 TEST_HEADER include/spdk/vmd.h 00:02:58.454 TEST_HEADER include/spdk/xor.h 00:02:58.454 TEST_HEADER include/spdk/zipf.h 00:02:58.454 CXX test/cpp_headers/accel.o 00:02:58.454 CXX test/cpp_headers/accel_module.o 00:02:58.454 CXX test/cpp_headers/assert.o 00:02:58.454 CXX test/cpp_headers/barrier.o 00:02:58.454 CXX test/cpp_headers/base64.o 00:02:58.454 CXX test/cpp_headers/bdev.o 00:02:58.454 CXX test/cpp_headers/bdev_zone.o 00:02:58.454 CXX test/cpp_headers/bdev_module.o 00:02:58.454 CC app/iscsi_tgt/iscsi_tgt.o 00:02:58.454 CXX test/cpp_headers/bit_array.o 00:02:58.454 CXX test/cpp_headers/bit_pool.o 00:02:58.454 CXX test/cpp_headers/blob_bdev.o 00:02:58.454 CXX test/cpp_headers/blobfs_bdev.o 00:02:58.454 CXX test/cpp_headers/blob.o 00:02:58.454 CXX test/cpp_headers/blobfs.o 00:02:58.454 CXX test/cpp_headers/config.o 00:02:58.454 CXX test/cpp_headers/conf.o 00:02:58.454 CXX test/cpp_headers/cpuset.o 00:02:58.454 CXX test/cpp_headers/crc32.o 00:02:58.454 CXX test/cpp_headers/crc16.o 00:02:58.454 CXX test/cpp_headers/crc64.o 00:02:58.454 CXX test/cpp_headers/dif.o 00:02:58.454 CXX test/cpp_headers/endian.o 00:02:58.454 CXX test/cpp_headers/env_dpdk.o 00:02:58.454 CXX test/cpp_headers/dma.o 00:02:58.454 CXX test/cpp_headers/env.o 00:02:58.454 CXX test/cpp_headers/fd_group.o 00:02:58.454 CXX test/cpp_headers/event.o 00:02:58.454 CXX test/cpp_headers/fd.o 00:02:58.454 CXX test/cpp_headers/file.o 00:02:58.454 CXX test/cpp_headers/ftl.o 00:02:58.454 CXX 
test/cpp_headers/gpt_spec.o 00:02:58.454 CXX test/cpp_headers/hexlify.o 00:02:58.454 CXX test/cpp_headers/histogram_data.o 00:02:58.454 CXX test/cpp_headers/idxd.o 00:02:58.454 CXX test/cpp_headers/idxd_spec.o 00:02:58.454 CXX test/cpp_headers/ioat.o 00:02:58.454 CXX test/cpp_headers/init.o 00:02:58.454 CXX test/cpp_headers/ioat_spec.o 00:02:58.454 CC app/nvmf_tgt/nvmf_main.o 00:02:58.454 CXX test/cpp_headers/iscsi_spec.o 00:02:58.454 CXX test/cpp_headers/json.o 00:02:58.454 CXX test/cpp_headers/jsonrpc.o 00:02:58.454 CXX test/cpp_headers/keyring.o 00:02:58.454 CC app/spdk_tgt/spdk_tgt.o 00:02:58.454 CXX test/cpp_headers/keyring_module.o 00:02:58.454 CC test/app/stub/stub.o 00:02:58.454 CC app/fio/nvme/fio_plugin.o 00:02:58.454 CC test/thread/poller_perf/poller_perf.o 00:02:58.454 CC examples/ioat/verify/verify.o 00:02:58.454 CC examples/util/zipf/zipf.o 00:02:58.454 CC examples/ioat/perf/perf.o 00:02:58.454 CC test/app/jsoncat/jsoncat.o 00:02:58.454 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:58.454 CC test/app/histogram_perf/histogram_perf.o 00:02:58.454 CC test/env/memory/memory_ut.o 00:02:58.714 CC test/env/vtophys/vtophys.o 00:02:58.714 LINK spdk_lspci 00:02:58.714 CC test/app/bdev_svc/bdev_svc.o 00:02:58.714 CC app/fio/bdev/fio_plugin.o 00:02:58.714 CC test/dma/test_dma/test_dma.o 00:02:58.714 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:58.714 CC test/env/pci/pci_ut.o 00:02:58.714 LINK interrupt_tgt 00:02:58.714 CC test/env/mem_callbacks/mem_callbacks.o 00:02:58.714 CXX test/cpp_headers/likely.o 00:02:58.714 LINK spdk_trace_record 00:02:58.714 LINK rpc_client_test 00:02:58.714 CXX test/cpp_headers/log.o 00:02:58.714 CXX test/cpp_headers/lvol.o 00:02:58.978 CXX test/cpp_headers/memory.o 00:02:58.978 LINK spdk_nvme_discover 00:02:58.978 LINK jsoncat 00:02:58.978 CXX test/cpp_headers/mmio.o 00:02:58.978 CXX test/cpp_headers/nbd.o 00:02:58.978 LINK histogram_perf 00:02:58.978 CXX test/cpp_headers/net.o 00:02:58.978 CXX test/cpp_headers/nvme.o 
00:02:58.978 CXX test/cpp_headers/nvme_intel.o 00:02:58.978 CXX test/cpp_headers/notify.o 00:02:58.978 CXX test/cpp_headers/nvme_ocssd.o 00:02:58.978 LINK zipf 00:02:58.978 CXX test/cpp_headers/nvme_spec.o 00:02:58.978 LINK iscsi_tgt 00:02:58.978 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:58.978 CXX test/cpp_headers/nvme_zns.o 00:02:58.978 CXX test/cpp_headers/nvmf_cmd.o 00:02:58.978 LINK poller_perf 00:02:58.978 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:58.978 CXX test/cpp_headers/nvmf.o 00:02:58.978 LINK nvmf_tgt 00:02:58.978 CXX test/cpp_headers/nvmf_spec.o 00:02:58.978 LINK ioat_perf 00:02:58.978 LINK verify 00:02:58.978 CXX test/cpp_headers/nvmf_transport.o 00:02:58.978 LINK stub 00:02:58.978 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:58.978 CXX test/cpp_headers/opal.o 00:02:58.978 CXX test/cpp_headers/opal_spec.o 00:02:58.978 LINK vtophys 00:02:58.978 CXX test/cpp_headers/pci_ids.o 00:02:58.978 LINK spdk_tgt 00:02:58.978 CXX test/cpp_headers/pipe.o 00:02:58.978 CXX test/cpp_headers/queue.o 00:02:58.978 LINK env_dpdk_post_init 00:02:59.237 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:59.238 CXX test/cpp_headers/reduce.o 00:02:59.238 CXX test/cpp_headers/rpc.o 00:02:59.238 CXX test/cpp_headers/scheduler.o 00:02:59.238 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:59.238 CXX test/cpp_headers/scsi.o 00:02:59.238 CXX test/cpp_headers/sock.o 00:02:59.238 CXX test/cpp_headers/stdinc.o 00:02:59.238 CXX test/cpp_headers/scsi_spec.o 00:02:59.238 CXX test/cpp_headers/string.o 00:02:59.238 CXX test/cpp_headers/thread.o 00:02:59.238 CXX test/cpp_headers/trace.o 00:02:59.238 CXX test/cpp_headers/trace_parser.o 00:02:59.238 CXX test/cpp_headers/tree.o 00:02:59.238 CXX test/cpp_headers/ublk.o 00:02:59.238 CXX test/cpp_headers/util.o 00:02:59.238 CXX test/cpp_headers/version.o 00:02:59.238 CXX test/cpp_headers/uuid.o 00:02:59.238 CXX test/cpp_headers/vfio_user_pci.o 00:02:59.238 LINK bdev_svc 00:02:59.238 CXX test/cpp_headers/vhost.o 00:02:59.238 CXX 
test/cpp_headers/vfio_user_spec.o 00:02:59.238 CXX test/cpp_headers/vmd.o 00:02:59.238 CXX test/cpp_headers/xor.o 00:02:59.238 CXX test/cpp_headers/zipf.o 00:02:59.238 LINK spdk_dd 00:02:59.495 LINK spdk_trace 00:02:59.495 LINK spdk_nvme 00:02:59.495 LINK nvme_fuzz 00:02:59.495 LINK test_dma 00:02:59.495 LINK spdk_bdev 00:02:59.495 CC examples/vmd/lsvmd/lsvmd.o 00:02:59.495 CC examples/idxd/perf/perf.o 00:02:59.495 CC examples/vmd/led/led.o 00:02:59.495 LINK pci_ut 00:02:59.495 CC examples/sock/hello_world/hello_sock.o 00:02:59.755 CC test/event/reactor_perf/reactor_perf.o 00:02:59.755 CC test/event/event_perf/event_perf.o 00:02:59.755 CC test/event/reactor/reactor.o 00:02:59.755 CC test/event/app_repeat/app_repeat.o 00:02:59.755 CC examples/thread/thread/thread_ex.o 00:02:59.755 LINK mem_callbacks 00:02:59.755 CC test/event/scheduler/scheduler.o 00:02:59.755 LINK spdk_nvme_perf 00:02:59.755 LINK spdk_nvme_identify 00:02:59.755 LINK led 00:02:59.755 LINK reactor 00:02:59.755 LINK lsvmd 00:02:59.755 LINK event_perf 00:02:59.755 LINK reactor_perf 00:02:59.755 LINK spdk_top 00:02:59.755 LINK hello_sock 00:02:59.755 LINK app_repeat 00:02:59.755 LINK vhost_fuzz 00:03:00.012 CC app/vhost/vhost.o 00:03:00.012 LINK thread 00:03:00.012 LINK idxd_perf 00:03:00.012 LINK scheduler 00:03:00.012 CC test/nvme/reserve/reserve.o 00:03:00.012 CC test/nvme/reset/reset.o 00:03:00.012 CC test/nvme/err_injection/err_injection.o 00:03:00.012 CC test/nvme/compliance/nvme_compliance.o 00:03:00.012 CC test/nvme/sgl/sgl.o 00:03:00.012 CC test/nvme/fdp/fdp.o 00:03:00.012 CC test/nvme/connect_stress/connect_stress.o 00:03:00.269 CC test/nvme/simple_copy/simple_copy.o 00:03:00.269 CC test/nvme/fused_ordering/fused_ordering.o 00:03:00.269 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:00.269 LINK vhost 00:03:00.269 CC test/nvme/boot_partition/boot_partition.o 00:03:00.269 CC test/nvme/cuse/cuse.o 00:03:00.269 CC test/nvme/overhead/overhead.o 00:03:00.269 CC test/nvme/aer/aer.o 00:03:00.269 CC 
test/nvme/startup/startup.o 00:03:00.269 CC test/nvme/e2edp/nvme_dp.o 00:03:00.269 CC test/accel/dif/dif.o 00:03:00.269 CC test/blobfs/mkfs/mkfs.o 00:03:00.269 LINK memory_ut 00:03:00.269 CC test/lvol/esnap/esnap.o 00:03:00.269 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:00.269 CC examples/nvme/abort/abort.o 00:03:00.269 LINK fused_ordering 00:03:00.269 CC examples/nvme/hotplug/hotplug.o 00:03:00.269 CC examples/nvme/reconnect/reconnect.o 00:03:00.269 CC examples/nvme/arbitration/arbitration.o 00:03:00.269 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:00.269 CC examples/nvme/hello_world/hello_world.o 00:03:00.269 LINK connect_stress 00:03:00.269 LINK err_injection 00:03:00.269 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:00.269 LINK doorbell_aers 00:03:00.270 LINK reset 00:03:00.270 LINK reserve 00:03:00.528 LINK overhead 00:03:00.528 LINK sgl 00:03:00.528 LINK simple_copy 00:03:00.528 LINK aer 00:03:00.528 LINK nvme_dp 00:03:00.528 LINK mkfs 00:03:00.528 CC examples/accel/perf/accel_perf.o 00:03:00.528 LINK fdp 00:03:00.528 LINK boot_partition 00:03:00.528 LINK startup 00:03:00.528 LINK nvme_compliance 00:03:00.528 CC examples/blob/cli/blobcli.o 00:03:00.528 CC examples/blob/hello_world/hello_blob.o 00:03:00.528 LINK hotplug 00:03:00.528 LINK cmb_copy 00:03:00.528 LINK pmr_persistence 00:03:00.528 LINK hello_world 00:03:00.786 LINK arbitration 00:03:00.786 LINK reconnect 00:03:00.786 LINK dif 00:03:00.786 LINK abort 00:03:00.786 LINK nvme_manage 00:03:01.045 LINK accel_perf 00:03:01.045 LINK hello_blob 00:03:01.045 LINK iscsi_fuzz 00:03:01.045 LINK blobcli 00:03:01.303 CC test/bdev/bdevio/bdevio.o 00:03:01.561 LINK cuse 00:03:01.561 CC examples/bdev/bdevperf/bdevperf.o 00:03:01.822 CC examples/bdev/hello_world/hello_bdev.o 00:03:01.822 LINK bdevio 00:03:02.081 LINK hello_bdev 00:03:02.648 LINK bdevperf 00:03:03.581 CC examples/nvmf/nvmf/nvmf.o 00:03:04.148 LINK nvmf 00:03:05.525 LINK esnap 00:03:05.784 00:03:05.784 real 1m36.323s 00:03:05.784 user 
18m12.950s 00:03:05.784 sys 4m24.465s 00:03:05.784 23:57:52 make -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:03:05.784 23:57:52 make -- common/autotest_common.sh@10 -- $ set +x 00:03:05.784 ************************************ 00:03:05.784 END TEST make 00:03:05.784 ************************************ 00:03:05.784 23:57:52 -- common/autotest_common.sh@1142 -- $ return 0 00:03:05.784 23:57:52 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:03:05.784 23:57:52 -- pm/common@29 -- $ signal_monitor_resources TERM 00:03:05.784 23:57:52 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:03:05.784 23:57:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:05.784 23:57:52 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:03:05.784 23:57:52 -- pm/common@44 -- $ pid=3330812 00:03:05.784 23:57:52 -- pm/common@50 -- $ kill -TERM 3330812 00:03:05.784 23:57:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:05.784 23:57:52 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:03:05.784 23:57:52 -- pm/common@44 -- $ pid=3330814 00:03:05.784 23:57:52 -- pm/common@50 -- $ kill -TERM 3330814 00:03:05.784 23:57:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:05.784 23:57:52 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:03:05.784 23:57:52 -- pm/common@44 -- $ pid=3330816 00:03:05.784 23:57:52 -- pm/common@50 -- $ kill -TERM 3330816 00:03:05.784 23:57:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:05.784 23:57:52 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:03:05.784 23:57:52 -- pm/common@44 -- $ pid=3330844 00:03:05.784 23:57:52 -- pm/common@50 -- $ sudo -E kill -TERM 3330844 00:03:06.042 23:57:52 -- 
spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:03:06.042 23:57:52 -- nvmf/common.sh@7 -- # uname -s 00:03:06.042 23:57:52 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:06.043 23:57:52 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:06.043 23:57:52 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:06.043 23:57:52 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:06.043 23:57:52 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:06.043 23:57:52 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:06.043 23:57:52 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:06.043 23:57:52 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:06.043 23:57:52 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:06.043 23:57:52 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:06.043 23:57:52 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:03:06.043 23:57:52 -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:03:06.043 23:57:52 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:06.043 23:57:52 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:06.043 23:57:52 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:06.043 23:57:52 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:03:06.043 23:57:52 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:03:06.043 23:57:52 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:06.043 23:57:52 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:06.043 23:57:52 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:06.043 23:57:52 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:03:06.043 23:57:52 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:03:06.043 23:57:52 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:03:06.043 23:57:52 -- paths/export.sh@5 -- # export PATH
00:03:06.043 23:57:52 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:03:06.043 23:57:52 -- nvmf/common.sh@47 -- # : 0
00:03:06.043 23:57:52 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID
00:03:06.043 23:57:52 -- nvmf/common.sh@49 -- # build_nvmf_app_args
00:03:06.043 23:57:52 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']'
00:03:06.043 23:57:52 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF)
00:03:06.043 23:57:52 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}")
00:03:06.043 23:57:52 -- nvmf/common.sh@33 -- # '[' -n '' ']'
00:03:06.043 23:57:52 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']'
00:03:06.043 23:57:52 -- nvmf/common.sh@51 -- # have_pci_nics=0
00:03:06.043 23:57:52 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']'
00:03:06.043 23:57:52 -- spdk/autotest.sh@32 -- # uname -s
00:03:06.043 23:57:52 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']'
00:03:06.043 23:57:52 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h'
00:03:06.043 23:57:52 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps
00:03:06.043 23:57:52 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t'
00:03:06.043 23:57:52 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps
00:03:06.043 23:57:52 -- spdk/autotest.sh@44 -- # modprobe nbd
00:03:06.043 23:57:52 -- spdk/autotest.sh@46 -- # type -P udevadm
00:03:06.043 23:57:52 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm
00:03:06.043 23:57:52 -- spdk/autotest.sh@48 -- # udevadm_pid=3398224
00:03:06.043 23:57:52 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property
00:03:06.043 23:57:52 -- spdk/autotest.sh@53 -- # start_monitor_resources
00:03:06.043 23:57:52 -- pm/common@17 -- # local monitor
00:03:06.043 23:57:52 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}"
00:03:06.043 23:57:52 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}"
00:03:06.043 23:57:52 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}"
00:03:06.043 23:57:52 -- pm/common@21 -- # date +%s
00:03:06.043 23:57:52 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}"
00:03:06.043 23:57:52 -- pm/common@21 -- # date +%s
00:03:06.043 23:57:52 -- pm/common@25 -- # sleep 1
00:03:06.043 23:57:52 -- pm/common@21 -- # date +%s
00:03:06.043 23:57:52 -- pm/common@21 -- # date +%s
00:03:06.043 23:57:52 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721080672
00:03:06.043 23:57:52 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721080672
00:03:06.043 23:57:52 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721080672
00:03:06.043 23:57:52 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721080672
00:03:06.043 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721080672_collect-vmstat.pm.log
00:03:06.043 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721080672_collect-cpu-load.pm.log
00:03:06.043 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721080672_collect-cpu-temp.pm.log
00:03:06.043 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721080672_collect-bmc-pm.bmc.pm.log
00:03:06.977 23:57:53 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT
00:03:06.977 23:57:53 -- spdk/autotest.sh@57 -- # timing_enter autotest
00:03:06.977 23:57:53 -- common/autotest_common.sh@722 -- # xtrace_disable
00:03:06.977 23:57:53 -- common/autotest_common.sh@10 -- # set +x
00:03:06.977 23:57:53 -- spdk/autotest.sh@59 -- # create_test_list
00:03:06.978 23:57:53 -- common/autotest_common.sh@746 -- # xtrace_disable
00:03:06.978 23:57:53 -- common/autotest_common.sh@10 -- # set +x
00:03:06.978 23:57:53 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh
00:03:06.978 23:57:53 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk
00:03:06.978 23:57:53 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk
00:03:06.978 23:57:53 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:03:06.978 23:57:53 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:03:06.978 23:57:53 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod
00:03:06.978 23:57:53 -- common/autotest_common.sh@1455 -- # uname
00:03:07.236 23:57:53 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']'
00:03:07.236 23:57:53 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf
00:03:07.236 23:57:53 -- common/autotest_common.sh@1475 -- # uname
00:03:07.236 23:57:53 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]]
00:03:07.236 23:57:53 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk
00:03:07.236 23:57:53 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc
00:03:07.236 23:57:53 -- spdk/autotest.sh@72 -- # hash lcov
00:03:07.236 23:57:53 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]]
00:03:07.236 23:57:53 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS=
00:03:07.236 --rc lcov_branch_coverage=1
00:03:07.236 --rc lcov_function_coverage=1
00:03:07.236 --rc genhtml_branch_coverage=1
00:03:07.236 --rc genhtml_function_coverage=1
00:03:07.236 --rc genhtml_legend=1
00:03:07.236 --rc geninfo_all_blocks=1
00:03:07.236 '
00:03:07.236 23:57:53 -- spdk/autotest.sh@80 -- # LCOV_OPTS='
00:03:07.236 --rc lcov_branch_coverage=1
00:03:07.236 --rc lcov_function_coverage=1
00:03:07.236 --rc genhtml_branch_coverage=1
00:03:07.236 --rc genhtml_function_coverage=1
00:03:07.236 --rc genhtml_legend=1
00:03:07.236 --rc geninfo_all_blocks=1
00:03:07.236 '
00:03:07.236 23:57:53 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov
00:03:07.236 --rc lcov_branch_coverage=1
00:03:07.236 --rc lcov_function_coverage=1
00:03:07.236 --rc genhtml_branch_coverage=1
00:03:07.236 --rc genhtml_function_coverage=1
00:03:07.236 --rc genhtml_legend=1
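(Editor's aside: the LCOV_OPTS/LCOV exports traced above configure branch and function coverage before a baseline capture. The sketch below is a hypothetical re-creation, not SPDK's actual code; the values are copied from the log, and the lcov invocation is only echoed so the sketch runs on hosts without lcov installed.)

```shell
# Assemble the same lcov settings the log shows being exported; the exact
# composition of LCOV from LCOV_OPTS is an assumption for illustration.
LCOV_OPTS='
 --rc lcov_branch_coverage=1
 --rc lcov_function_coverage=1
 --rc genhtml_branch_coverage=1
 --rc genhtml_function_coverage=1
 --rc genhtml_legend=1
 --rc geninfo_all_blocks=1
'
LCOV="lcov $LCOV_OPTS --no-external"
# -c -i captures an all-zero "Baseline" .info before any tests run, so a
# post-test capture can be combined with it; -d points at the build tree
# containing the .gcno files. Echoed rather than executed in this sketch.
echo $LCOV -q -c -i -t Baseline -d ./spdk -o cov_base.info
```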
00:03:07.236 --rc geninfo_all_blocks=1
00:03:07.236 --no-external'
00:03:07.236 23:57:53 -- spdk/autotest.sh@81 -- # LCOV='lcov
00:03:07.236 --rc lcov_branch_coverage=1
00:03:07.236 --rc lcov_function_coverage=1
00:03:07.236 --rc genhtml_branch_coverage=1
00:03:07.236 --rc genhtml_function_coverage=1
00:03:07.236 --rc genhtml_legend=1
00:03:07.236 --rc geninfo_all_blocks=1
00:03:07.236 --no-external'
00:03:07.236 23:57:53 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v
00:03:07.236 lcov: LCOV version 1.14
00:03:07.236 23:57:54 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info
00:03:19.439 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found
00:03:19.439 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno
00:03:29.494 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found
00:03:29.494 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno
00:03:29.495 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found
00:03:29.495 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno
00:03:34.768 23:58:20 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup
00:03:34.768 23:58:20 -- common/autotest_common.sh@722 -- # xtrace_disable
00:03:34.768 23:58:20 -- common/autotest_common.sh@10 -- # set +x
00:03:34.768 23:58:20 -- spdk/autotest.sh@91 -- # rm -f
00:03:34.768 23:58:20 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:03:38.054 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:03:38.054 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:03:38.054 0000:5e:00.0 (8086 0b60): Already using the nvme driver
00:03:38.054 0000:00:04.7 (8086 2021): Already using the ioatdma driver
00:03:38.054 0000:00:04.6 (8086 2021): Already using the ioatdma driver
00:03:38.054 0000:00:04.5 (8086 2021): Already using the ioatdma driver
00:03:38.054 0000:00:04.4 (8086 2021): Already using the ioatdma driver
00:03:38.054 0000:00:04.3 (8086 2021): Already using the ioatdma driver
00:03:38.054 0000:00:04.2 (8086 2021): Already using the ioatdma driver
00:03:38.054 0000:00:04.1 (8086 2021): Already using the ioatdma driver
00:03:38.055 0000:00:04.0 (8086 2021): Already using the ioatdma driver
00:03:38.055 0000:80:04.7 (8086 2021): Already using the ioatdma driver
00:03:38.055 0000:80:04.6 (8086 2021): Already using the ioatdma driver
00:03:38.055 0000:80:04.5 (8086 2021): Already using the ioatdma driver
00:03:38.055 0000:80:04.4 (8086 2021): Already using the ioatdma driver
00:03:38.055 0000:80:04.3 (8086 2021): Already using the ioatdma driver
00:03:38.055 0000:80:04.2 (8086 2021): Already using the ioatdma driver
00:03:38.055 0000:80:04.1 (8086 2021): Already using the ioatdma driver
00:03:38.055 0000:80:04.0 (8086 2021): Already using the ioatdma driver
00:03:38.055 23:58:24 -- spdk/autotest.sh@96 -- # get_zoned_devs
00:03:38.055 23:58:24 -- common/autotest_common.sh@1669 -- # zoned_devs=()
00:03:38.055 23:58:24 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs
00:03:38.055 23:58:24 -- common/autotest_common.sh@1670 -- # local nvme bdf
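(Editor's aside: the get_zoned_devs / is_block_zoned trace in this part of the log checks each NVMe block device's sysfs `queue/zoned` attribute so zoned namespaces can be excluded from destructive tests. Below is a minimal stand-alone sketch of that check, not SPDK's actual helper; the SYSFS root and device names are fabricated under /tmp so the demo runs without real NVMe hardware.)

```shell
# Sketch of the is_block_zoned idea: a block device counts as zoned when
# /sys/block/<dev>/queue/zoned reads something other than "none".
# SYSFS is parameterized (and faked here) purely for the demo.
SYSFS=/tmp/sysfs-demo
is_block_zoned() {
    local device=$1
    # A missing attribute means the kernel treats the device as non-zoned.
    [[ -e $SYSFS/block/$device/queue/zoned ]] || return 1
    [[ $(<"$SYSFS/block/$device/queue/zoned") != none ]]
}
# Fabricated sysfs layout: one conventional and one host-managed device.
mkdir -p "$SYSFS/block/nvme0n1/queue" "$SYSFS/block/nvme1n2/queue"
echo none > "$SYSFS/block/nvme0n1/queue/zoned"
echo host-managed > "$SYSFS/block/nvme1n2/queue/zoned"
is_block_zoned nvme0n1 || echo "nvme0n1 not zoned"
is_block_zoned nvme1n2 && echo "nvme1n2 zoned"
```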
00:03:38.055 23:58:24 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme*
00:03:38.055 23:58:24 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1
00:03:38.055 23:58:24 -- common/autotest_common.sh@1662 -- # local device=nvme0n1
00:03:38.055 23:58:24 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]]
00:03:38.055 23:58:24 -- common/autotest_common.sh@1665 -- # [[ none != none ]]
00:03:38.055 23:58:24 -- spdk/autotest.sh@98 -- # (( 0 > 0 ))
00:03:38.055 23:58:24 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*)
00:03:38.055 23:58:24 -- spdk/autotest.sh@112 -- # [[ -z '' ]]
00:03:38.055 23:58:24 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1
00:03:38.055 23:58:24 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt
00:03:38.055 23:58:24 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1
00:03:38.055 No valid GPT data, bailing
00:03:38.055 23:58:24 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1
00:03:38.055 23:58:24 -- scripts/common.sh@391 -- # pt=
00:03:38.055 23:58:24 -- scripts/common.sh@392 -- # return 1
00:03:38.055 23:58:24 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1
00:03:38.055 1+0 records in
00:03:38.055 1+0 records out
00:03:38.055 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00236585 s, 443 MB/s
00:03:38.055 23:58:24 -- spdk/autotest.sh@118 -- # sync
00:03:38.055 23:58:24 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes
00:03:38.055 23:58:24 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null'
00:03:38.055 23:58:24 -- common/autotest_common.sh@22 -- # reap_spdk_processes
00:03:43.326 23:58:30 -- spdk/autotest.sh@124 -- # uname -s
00:03:43.326 23:58:30 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']'
00:03:43.326 23:58:30 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh
00:03:43.326 23:58:30 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:03:43.326 23:58:30 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:43.326 23:58:30 -- common/autotest_common.sh@10 -- # set +x
00:03:43.326 ************************************
00:03:43.326 START TEST setup.sh
00:03:43.326 ************************************
00:03:43.326 23:58:30 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh
00:03:43.326 * Looking for test storage...
00:03:43.326 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup
00:03:43.326 23:58:30 setup.sh -- setup/test-setup.sh@10 -- # uname -s
00:03:43.326 23:58:30 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]]
00:03:43.326 23:58:30 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh
00:03:43.326 23:58:30 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:03:43.326 23:58:30 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:43.326 23:58:30 setup.sh -- common/autotest_common.sh@10 -- # set +x
00:03:43.326 ************************************
00:03:43.326 START TEST acl
00:03:43.326 ************************************
00:03:43.326 23:58:30 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh
00:03:43.585 * Looking for test storage...
00:03:43.585 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup
00:03:43.585 23:58:30 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs
00:03:43.585 23:58:30 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=()
00:03:43.585 23:58:30 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs
00:03:43.585 23:58:30 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf
00:03:43.585 23:58:30 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme*
00:03:43.585 23:58:30 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1
00:03:43.585 23:58:30 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1
00:03:43.585 23:58:30 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]]
00:03:43.585 23:58:30 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]]
00:03:43.585 23:58:30 setup.sh.acl -- setup/acl.sh@12 -- # devs=()
00:03:43.585 23:58:30 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs
00:03:43.585 23:58:30 setup.sh.acl -- setup/acl.sh@13 -- # drivers=()
00:03:43.585 23:58:30 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers
00:03:43.585 23:58:30 setup.sh.acl -- setup/acl.sh@51 -- # setup reset
00:03:43.585 23:58:30 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]]
00:03:43.585 23:58:30 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:03:47.774 23:58:34 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs
00:03:47.774 23:58:34 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver
00:03:47.774 23:58:34 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:03:47.774 23:58:34 setup.sh.acl -- setup/acl.sh@15 -- # setup output status
00:03:47.774 23:58:34 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]]
00:03:47.774 23:58:34 setup.sh.acl -- setup/common.sh@10 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 Hugepages 00:03:51.962 node hugesize free / total 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 00:03:51.962 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 
23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- 
# [[ 0000:00:04.7 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 
00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:85:05.5 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d7:05.5 == *:*:*.* ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:51.962 23:58:38 setup.sh.acl -- 
setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:51.962 23:58:38 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:03:51.962 23:58:38 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:51.962 23:58:38 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:51.962 23:58:38 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:51.962 ************************************ 00:03:51.962 START TEST denied 00:03:51.962 ************************************ 00:03:51.962 23:58:38 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied 00:03:51.962 23:58:38 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0' 00:03:51.962 23:58:38 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:03:51.962 23:58:38 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:03:51.962 23:58:38 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:03:51.962 23:58:38 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:56.190 0000:5e:00.0 (8086 0b60): Skipping denied controller at 0000:5e:00.0 00:03:56.190 23:58:42 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:03:56.190 23:58:42 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:03:56.190 23:58:42 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:03:56.190 23:58:42 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:03:56.190 23:58:42 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:03:56.190 23:58:42 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:56.190 23:58:42 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:56.190 23:58:42 setup.sh.acl.denied 
-- setup/acl.sh@41 -- # setup reset 00:03:56.190 23:58:42 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:56.190 23:58:42 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:01.463 00:04:01.463 real 0m9.313s 00:04:01.463 user 0m3.033s 00:04:01.463 sys 0m5.591s 00:04:01.463 23:58:47 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:01.463 23:58:47 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:01.463 ************************************ 00:04:01.463 END TEST denied 00:04:01.463 ************************************ 00:04:01.463 23:58:47 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:04:01.463 23:58:47 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:01.463 23:58:47 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:01.463 23:58:47 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:01.463 23:58:47 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:01.463 ************************************ 00:04:01.463 START TEST allowed 00:04:01.463 ************************************ 00:04:01.463 23:58:47 setup.sh.acl.allowed -- common/autotest_common.sh@1123 -- # allowed 00:04:01.463 23:58:47 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:04:01.463 23:58:47 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:01.463 23:58:47 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:04:01.464 23:58:47 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:01.464 23:58:47 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:08.035 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:04:08.035 23:58:54 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:04:08.035 23:58:54 
setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:04:08.035 23:58:54 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:04:08.035 23:58:54 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:08.035 23:58:54 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:12.227 00:04:12.227 real 0m10.439s 00:04:12.227 user 0m2.680s 00:04:12.227 sys 0m5.233s 00:04:12.227 23:58:58 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:12.227 23:58:58 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:12.227 ************************************ 00:04:12.227 END TEST allowed 00:04:12.227 ************************************ 00:04:12.227 23:58:58 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:04:12.227 00:04:12.227 real 0m28.141s 00:04:12.227 user 0m8.720s 00:04:12.227 sys 0m16.523s 00:04:12.227 23:58:58 setup.sh.acl -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:12.227 23:58:58 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:12.227 ************************************ 00:04:12.227 END TEST acl 00:04:12.227 ************************************ 00:04:12.227 23:58:58 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:12.227 23:58:58 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:12.227 23:58:58 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:12.227 23:58:58 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:12.227 23:58:58 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:12.227 ************************************ 00:04:12.227 START TEST hugepages 00:04:12.227 ************************************ 00:04:12.227 23:58:58 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:12.227 * Looking for test storage... 00:04:12.227 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:12.227 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:12.227 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:12.227 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:12.227 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:12.227 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:12.227 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:12.227 23:58:58 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:12.227 23:58:58 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:04:12.227 23:58:58 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:04:12.227 23:58:58 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:04:12.227 23:58:58 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:12.227 23:58:58 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:12.227 23:58:58 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:12.227 23:58:58 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:04:12.227 23:58:58 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:12.227 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.227 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 76579032 kB' 'MemAvailable: 79878688 kB' 'Buffers: 12176 kB' 'Cached: 9619248 kB' 'SwapCached: 0 kB' 'Active: 6679952 kB' 'Inactive: 3456260 
kB' 'Active(anon): 6286368 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 508156 kB' 'Mapped: 187908 kB' 'Shmem: 5781580 kB' 'KReclaimable: 206912 kB' 'Slab: 541680 kB' 'SReclaimable: 206912 kB' 'SUnreclaim: 334768 kB' 'KernelStack: 16176 kB' 'PageTables: 8440 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438188 kB' 'Committed_AS: 7704444 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200872 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 
23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 
23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # 
continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- 
setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 
00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@17 -- # 
default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:12.228 23:58:58 setup.sh.hugepages -- 
setup/hugepages.sh@41 -- # echo 0 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:12.228 23:58:58 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:04:12.228 23:58:58 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:12.228 23:58:58 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:12.228 23:58:58 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:12.228 ************************************ 00:04:12.228 START TEST default_setup 00:04:12.228 ************************************ 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 
00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:04:12.228 23:58:58 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:15.517 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:15.517 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:15.517 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:15.517 0000:00:04.6 
(8086 2021): ioatdma -> vfio-pci 00:04:15.517 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:15.517 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:15.517 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:15.517 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:15.776 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:15.776 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:15.776 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:15.776 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:15.776 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:15.776 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:15.776 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:15.776 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:15.776 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:15.776 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:18.313 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:04:18.313 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:18.313 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:04:18.313 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:04:18.313 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:04:18.313 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:04:18.313 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:04:18.313 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:04:18.313 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:18.313 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:18.313 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:18.314 23:59:05 
setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78719124 kB' 'MemAvailable: 82018668 kB' 'Buffers: 12176 kB' 'Cached: 9619372 kB' 'SwapCached: 0 kB' 'Active: 6696480 kB' 'Inactive: 3456260 kB' 'Active(anon): 6302896 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 524028 kB' 'Mapped: 188096 kB' 'Shmem: 5781704 kB' 'KReclaimable: 206688 kB' 'Slab: 540136 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333448 kB' 'KernelStack: 16464 kB' 'PageTables: 8924 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7723372 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201112 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 
kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 
00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 
-- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.314 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.314 
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.315
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.315
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [trace condensed: the read loop skipped the remaining /proc/meminfo keys (CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted) until [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] matched] 00:04:18.315
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:18.315
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:18.315
23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:04:18.315
23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:18.315
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:18.315
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:18.315
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:18.315
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:18.315
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:18.315
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:18.315
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:18.315
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:18.315
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:18.315
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78718836 kB' 'MemAvailable: 82018380 kB' 'Buffers: 12176 kB' 'Cached: 9619376 kB' 'SwapCached: 0 kB' 'Active: 6695636 kB' 'Inactive: 3456260 kB' 'Active(anon): 6302052 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523528 kB' 'Mapped: 188004 kB' 'Shmem: 5781708 kB' 'KReclaimable: 206688 kB' 'Slab: 540320 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333632 kB' 'KernelStack: 16208 kB' 'PageTables: 8524 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7724884 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:18.315
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.315
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.315
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [trace condensed: the read loop skipped every key from MemTotal through HugePages_Rsvd until [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] matched] 00:04:18.316
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:18.316
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:18.316
23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:04:18.316
23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:18.316
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:18.316
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:18.316
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:18.316
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:18.316
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:18.316
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:18.316
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:18.316
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:18.316
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:18.316
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.316
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.316
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78718184 kB' 'MemAvailable: 82017728 kB' 'Buffers: 12176 kB' 'Cached: 9619392 kB' 'SwapCached: 0 kB' 'Active: 6695700 kB' 'Inactive: 3456260 kB' 'Active(anon): 6302116 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523548 kB' 'Mapped: 188004 kB' 'Shmem: 5781724 kB' 'KReclaimable: 206688 kB' 'Slab: 540320 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333632 kB' 'KernelStack: 16384 kB' 'PageTables: 8496 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7724344 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:18.316
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [trace condensed: the read loop is skipping keys MemTotal through PageTables on the way to HugePages_Rsvd; the scan continues in the next log segment (elapsed time ticks from 00:04:18.316 to 00:04:18.579 partway through)] 00:04:18.579
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- #
IFS=': ' 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.579 
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.579 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.580 23:59:05 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r 
var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:18.580 nr_hugepages=1024 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:18.580 resv_hugepages=0 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:18.580 surplus_hugepages=0 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:18.580 anon_hugepages=0 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- 
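The trace above is one pass of `setup/common.sh`'s `get_meminfo` loop: it splits each `/proc/meminfo` record on `': '` into a key and a value, `continue`s past every key that is not the one requested, and echoes the value (here `0` for `HugePages_Rsvd`) once it matches. A minimal self-contained sketch of that pattern, assuming a simplified meminfo sample (the helper name `get_meminfo` and the field names come from the trace; the sample values are illustrative):

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern traced in the log: scan "Key: value kB"
# records, skip non-matching keys, print the value of the requested key.
get_meminfo() {
    local get=$1 var val _
    # IFS=': ' splits on both the colon and the space, as in the trace.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done
    echo 0   # key absent: report zero, matching the script's defaults
}

# Illustrative /proc/meminfo excerpt (values taken from the log's printf).
mem_sample=$'MemTotal: 92293472 kB\nHugePages_Total: 1024\nHugePages_Rsvd: 0'

total=$(get_meminfo HugePages_Total <<<"$mem_sample")
rsvd=$(get_meminfo HugePages_Rsvd <<<"$mem_sample")
echo "total=$total rsvd=$rsvd"
```

The log's later check `(( 1024 == nr_hugepages + surp + resv ))` then consumes these parsed values to confirm the pool is fully allocated with no surplus or reserved pages.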
setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78714884 kB' 'MemAvailable: 82014428 kB' 'Buffers: 12176 kB' 'Cached: 9619416 kB' 'SwapCached: 0 kB' 'Active: 6699864 kB' 'Inactive: 3456260 kB' 'Active(anon): 6306280 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527748 kB' 'Mapped: 188508 kB' 'Shmem: 5781748 kB' 'KReclaimable: 206688 kB' 'Slab: 540320 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333632 kB' 'KernelStack: 16336 
kB' 'PageTables: 8740 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7729716 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201080 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.580 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # 
[[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 
00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.581 
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.581 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.582 23:59:05 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( 
no_nodes > 0 )) 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36524224 kB' 'MemUsed: 11592716 kB' 'SwapCached: 0 kB' 'Active: 5348388 kB' 'Inactive: 3372048 kB' 'Active(anon): 5190484 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8481468 kB' 'Mapped: 88064 kB' 'AnonPages: 242120 kB' 
'Shmem: 4951516 kB' 'KernelStack: 9048 kB' 'PageTables: 4624 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126424 kB' 'Slab: 338060 kB' 'SReclaimable: 126424 kB' 'SUnreclaim: 211636 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 
23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.582 23:59:05 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.582 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.583 23:59:05 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup 
-- setup/common.sh@31 -- # read -r var val _ 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- 
setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:18.583 node0=1024 expecting 1024 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:18.583 00:04:18.583 real 0m6.696s 00:04:18.583 user 0m1.680s 00:04:18.583 sys 0m2.753s 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:18.583 23:59:05 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:04:18.583 ************************************ 00:04:18.583 END TEST default_setup 00:04:18.583 ************************************ 00:04:18.583 23:59:05 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:18.583 23:59:05 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:04:18.583 23:59:05 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:18.583 23:59:05 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:18.583 23:59:05 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:18.583 ************************************ 00:04:18.583 START TEST per_node_1G_alloc 00:04:18.583 ************************************ 00:04:18.583 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc 00:04:18.583 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:04:18.583 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:04:18.583 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:18.583 23:59:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:04:18.583 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:04:18.583 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:18.584 23:59:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:18.584 23:59:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:22.820 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:22.820 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:22.820 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:22.820 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:22.820 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:22.820 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:22.820 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:22.820 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:22.820 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:22.820 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:22.820 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:22.820 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:22.820 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:22.820 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:22.820 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:22.820 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:22.820 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:22.820 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 
00:04:22.820 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 
00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:22.820 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:22.821 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78726284 kB' 'MemAvailable: 82025828 kB' 'Buffers: 12176 kB' 'Cached: 9619504 kB' 'SwapCached: 0 kB' 'Active: 6693492 kB' 'Inactive: 3456260 kB' 'Active(anon): 6299908 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521184 kB' 'Mapped: 187016 kB' 'Shmem: 5781836 kB' 'KReclaimable: 206688 kB' 'Slab: 540352 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333664 kB' 'KernelStack: 16096 kB' 'PageTables: 7936 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7714660 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB'
00:04:22.821 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:22.821 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
00:04:22.821 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:22.821 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78726684 kB' 'MemAvailable: 82026228 kB' 'Buffers: 12176 kB' 'Cached: 9619516 kB' 'SwapCached: 0 kB' 'Active: 6694108 kB' 'Inactive: 3456260 kB' 'Active(anon): 6300524 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521908 kB' 'Mapped: 186968 kB' 'Shmem: 5781848 kB' 'KReclaimable: 206688 kB' 'Slab: 540416 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333728 kB' 'KernelStack: 16160 kB' 'PageTables: 8172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7716544 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200840 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB'
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:22.822 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:22.823 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:22.823 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
00:04:22.823 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:22.823 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 --
# read -r var val _ 00:04:22.823 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.823 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.823 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.823 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.823 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.823 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.823 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.823 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.823 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.823 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.823 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.823 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78727504 kB' 'MemAvailable: 82027048 kB' 'Buffers: 12176 kB' 'Cached: 9619532 kB' 'SwapCached: 0 kB' 'Active: 6694076 kB' 'Inactive: 3456260 kB' 'Active(anon): 6300492 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 
kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 522424 kB' 'Mapped: 186976 kB' 'Shmem: 5781864 kB' 'KReclaimable: 206688 kB' 'Slab: 540416 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333728 kB' 'KernelStack: 16224 kB' 'PageTables: 8000 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7717688 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.824 23:59:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.824 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 
23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.825 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:22.826 nr_hugepages=1024 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:22.826 resv_hugepages=0 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:22.826 surplus_hugepages=0 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:22.826 anon_hugepages=0 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 
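The trace records above show `get_meminfo` scanning a meminfo snapshot one `Key: Value` line at a time, skipping every field until the requested key (here `HugePages_Rsvd`, then `HugePages_Total`) matches, and echoing its value. A minimal stand-alone sketch of that parsing pattern, reconstructed from the trace rather than taken from the actual `setup/common.sh` (function body and fallback behavior are assumptions):

```shell
#!/usr/bin/env bash
# Sketch of the per-field parse visible in the trace: split each
# "Key: Value" line on ':' and space, keep only the requested key.
# This get_meminfo is a hypothetical reconstruction, not the real
# setup/common.sh implementation.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # Matches the [[ $var == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] checks
        # in the trace; non-matching fields just "continue".
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done
    echo 0   # assumed fallback when the key is absent
}

# Feed it a few lines in /proc/meminfo format, as the trace's printf does:
printf '%s\n' 'HugePages_Total: 1024' 'HugePages_Free: 1024' \
    'HugePages_Rsvd: 0' 'HugePages_Surp: 0' | get_meminfo HugePages_Surp
```

On a live system the same function would read `/proc/meminfo` (or a per-node `meminfo` under `/sys/devices/system/node/`), which is why the trace walks through every field from `MemTotal` down to `HugePages_Surp` before returning.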
00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78726040 kB' 'MemAvailable: 82025584 kB' 'Buffers: 12176 kB' 'Cached: 9619556 kB' 'SwapCached: 0 kB' 'Active: 6694080 kB' 'Inactive: 3456260 kB' 'Active(anon): 6300496 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521836 kB' 'Mapped: 186968 kB' 'Shmem: 5781888 kB' 'KReclaimable: 206688 kB' 'Slab: 540416 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333728 kB' 'KernelStack: 16336 kB' 'PageTables: 8324 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7717708 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 
kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.826 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.827 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- 
setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # 
mem=("${mem[@]#Node +([0-9]) }") 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37579036 kB' 'MemUsed: 10537904 kB' 'SwapCached: 0 kB' 'Active: 5346152 kB' 'Inactive: 3372048 kB' 'Active(anon): 5188248 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8481484 kB' 'Mapped: 87580 kB' 'AnonPages: 239820 kB' 'Shmem: 4951532 kB' 'KernelStack: 9016 kB' 'PageTables: 4612 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126424 kB' 'Slab: 338112 kB' 'SReclaimable: 126424 kB' 'SUnreclaim: 211688 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.828 23:59:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.828 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41146468 kB' 'MemUsed: 3030064 kB' 'SwapCached: 0 kB' 'Active: 1348276 kB' 'Inactive: 84212 kB' 'Active(anon): 1112596 kB' 'Inactive(anon): 0 kB' 'Active(file): 235680 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1150292 kB' 'Mapped: 99388 kB' 'AnonPages: 282320 kB' 'Shmem: 830400 kB' 'KernelStack: 7288 kB' 'PageTables: 4000 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 80264 kB' 'Slab: 202304 kB' 'SReclaimable: 80264 kB' 'SUnreclaim: 122040 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.829 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.830 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:22.831 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:22.831 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:22.831 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:22.831 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:22.831 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # 
sorted_s[nodes_sys[node]]=1 00:04:22.831 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:22.831 node0=512 expecting 512 00:04:22.831 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:22.831 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:22.831 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:22.831 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:22.831 node1=512 expecting 512 00:04:22.831 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:22.831 00:04:22.831 real 0m3.914s 00:04:22.831 user 0m1.501s 00:04:22.831 sys 0m2.514s 00:04:22.831 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:22.831 23:59:09 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:22.831 ************************************ 00:04:22.831 END TEST per_node_1G_alloc 00:04:22.831 ************************************ 00:04:22.831 23:59:09 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:22.831 23:59:09 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:04:22.831 23:59:09 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:22.831 23:59:09 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:22.831 23:59:09 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:22.831 ************************************ 00:04:22.831 START TEST even_2G_alloc 00:04:22.831 ************************************ 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc 00:04:22.831 
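The trace above repeats one pattern many times: `get_meminfo` picks either `/proc/meminfo` or the per-node `/sys/devices/system/node/node<N>/meminfo` file, strips the `Node <N> ` prefix, and scans `key: value` pairs until the requested key (here `HugePages_Surp`) matches. A minimal, self-contained sketch of that pattern — illustrative only, not SPDK's exact `setup/common.sh`, and using a temp file instead of the real sysfs path — looks like this:

```shell
#!/usr/bin/env bash
# Hypothetical reimplementation of the get_meminfo lookup seen in the trace.
shopt -s extglob   # needed for the +([0-9]) pattern below

get_meminfo() {
    local get=$1 mem_f=$2 line var val _
    while IFS= read -r line; do
        # Per-node meminfo files prefix every line with "Node <N> ".
        line=${line##Node +([0-9]) }
        # Split "key: value [unit]" the same way the trace does (IFS=': ').
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            printf '%s\n' "$val"
            return 0
        fi
    done < "$mem_f"
    echo 0   # key absent: report 0, matching the trace's fallback
}

# Demo against a fabricated per-node meminfo snippet:
tmp=$(mktemp)
printf '%s\n' \
    'Node 1 MemTotal:       44176532 kB' \
    'Node 1 HugePages_Total:   512' \
    'Node 1 HugePages_Free:    512' \
    'Node 1 HugePages_Surp:      0' > "$tmp"
get_meminfo HugePages_Surp "$tmp"   # prints: 0
get_meminfo MemTotal "$tmp"         # prints: 44176532
rm -f "$tmp"
```

The real script reads the whole file with `mapfile` and strips the `Node <N> ` prefix on the array in one expansion; the per-line variant here is equivalent for lookup purposes.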
23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:22.831 23:59:09 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:22.831 23:59:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:26.123 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:26.123 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:26.123 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:26.123 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:26.123 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:26.123 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:26.123 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:26.123 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:26.123 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:26.123 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:26.123 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:26.123 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:26.123 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:26.123 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 
00:04:26.123 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:26.385 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:26.385 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:26.385 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:26.385 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- 
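The `even_2G_alloc` setup above requests `size=2097152` (kB) with the 2048 kB default hugepage size, i.e. 1024 pages, and distributes them evenly across the 2 NUMA nodes, which is why the earlier test ends with `node0=512 expecting 512` and `node1=512 expecting 512`. A short sketch of that arithmetic — the function name and structure here are illustrative, not SPDK's exact `hugepages.sh` helpers:

```shell
#!/usr/bin/env bash
# Illustrative even per-node hugepage split, assuming the 2048 kB
# Hugepagesize reported in the log.
default_hugepage_kb=2048

even_alloc_per_node() {
    local size_kb=$1 no_nodes=$2
    local nr_hugepages=$(( size_kb / default_hugepage_kb ))  # 2097152/2048 = 1024
    local per_node=$(( nr_hugepages / no_nodes ))            # 1024/2 = 512
    local n
    for (( n = 0; n < no_nodes; n++ )); do
        echo "node${n}=${per_node} expecting ${per_node}"
    done
}

even_alloc_per_node 2097152 2
# prints:
# node0=512 expecting 512
# node1=512 expecting 512
```

The subsequent trace verifies this by reading `HugePages_Total`/`HugePages_Free` back out of each node's meminfo and comparing against the expected 512.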
setup/common.sh@25 -- # [[ -n '' ]] 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.385 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78706428 kB' 'MemAvailable: 82005972 kB' 'Buffers: 12176 kB' 'Cached: 9619668 kB' 'SwapCached: 0 kB' 'Active: 6696664 kB' 'Inactive: 3456260 kB' 'Active(anon): 6303080 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523836 kB' 'Mapped: 187120 kB' 'Shmem: 5782000 kB' 'KReclaimable: 206688 kB' 'Slab: 540156 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333468 kB' 'KernelStack: 16176 kB' 'PageTables: 8244 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7716576 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 
23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.386 23:59:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.386 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 
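The get_meminfo call traced above (ending in `echo 0` / `return 0`, so `anon=0`) walks /proc/meminfo line by line with `IFS=': '` and `read -r var val _` until the requested key matches. A self-contained sketch of the same parsing pattern (hypothetical function name; the real setup/common.sh additionally strips `Node N` prefixes and can target a per-node meminfo file) is:

```shell
# Hypothetical sketch of the meminfo lookup pattern seen in the trace.
# Prints the numeric value for one field of a meminfo-style file;
# the trailing "kB" unit, when present, is discarded into "_".
get_meminfo_value() {
    local get=$1 file=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < "$file"
    return 1   # field not found
}
```

On the node in this log, `get_meminfo_value HugePages_Free` would report the 1024 free 2 MiB pages shown in the captured /proc/meminfo snapshot.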
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78707212 kB' 'MemAvailable: 82006756 kB' 'Buffers: 12176 kB' 'Cached: 9619672 kB' 'SwapCached: 0 kB' 'Active: 6695452 kB' 'Inactive: 3456260 kB' 'Active(anon): 6301868 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523132 kB' 'Mapped: 186976 kB' 'Shmem: 5782004 kB' 'KReclaimable: 206688 kB' 'Slab: 540140 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333452 kB' 'KernelStack: 16096 kB' 'PageTables: 8000 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7718084 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.387 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 
23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
[[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.388 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:26.389 23:59:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78705980 kB' 'MemAvailable: 82005524 kB' 'Buffers: 12176 kB' 'Cached: 9619672 kB' 'SwapCached: 0 kB' 'Active: 6695800 kB' 'Inactive: 3456260 kB' 'Active(anon): 6302216 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523548 kB' 'Mapped: 186976 kB' 'Shmem: 5782004 kB' 'KReclaimable: 206688 kB' 'Slab: 540140 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333452 kB' 'KernelStack: 16208 kB' 'PageTables: 8144 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7717944 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201064 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.389 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 
23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.390 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.391 23:59:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.391 23:59:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:26.391 nr_hugepages=1024 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:26.391 resv_hugepages=0 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:26.391 surplus_hugepages=0 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:26.391 anon_hugepages=0 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:26.391 23:59:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.391 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78704564 kB' 'MemAvailable: 82004108 kB' 'Buffers: 12176 kB' 'Cached: 9619712 kB' 'SwapCached: 0 kB' 'Active: 6696064 kB' 'Inactive: 3456260 kB' 'Active(anon): 6302480 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523628 kB' 'Mapped: 186976 kB' 'Shmem: 5782044 kB' 'KReclaimable: 206688 kB' 'Slab: 540140 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333452 kB' 'KernelStack: 16304 kB' 'PageTables: 8432 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7716636 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201144 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 
'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:26.391
[trace condensed: setup/common.sh@31-32 repeats the IFS=': ' read/continue loop once per /proc/meminfo key, from MemTotal through Unaccepted, while scanning for HugePages_Total] 00:04:26.664 23:59:13 setup.sh.hugepages.even_2G_alloc --
setup/common.sh@31 -- # IFS=': ' 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- 
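The `get_meminfo` pattern traced above (mapfile a meminfo file into an array, then an `IFS=': ' read` loop that echoes the value once the requested key matches) can be sketched roughly as follows. This is a minimal reconstruction inferred from the trace, not the real `setup/common.sh`; the `MEMINFO_FILE` override is a testing convenience added here (the real script hardcodes `/proc/meminfo` at common.sh@22).

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo helper inferred from the trace above:
# load a meminfo file into an array, then scan it with an IFS=': '
# read loop, echoing the value of the first matching key.
shopt -s extglob

get_meminfo() {
    local get=$1 node=${2:-}
    # MEMINFO_FILE is an override added for this sketch; the traced
    # script always starts from /proc/meminfo (common.sh@22).
    local mem_f=${MEMINFO_FILE:-/proc/meminfo}
    # Per-node counters come from sysfs when a node is given (common.sh@23-24).
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    # Per-node meminfo lines carry a "Node N " prefix; strip it (common.sh@29).
    mem=("${mem[@]#Node +([0-9]) }")
    local var val _ line
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "$val"   # e.g. "1024" for HugePages_Total in the trace
            return 0
        fi
    done
    return 1
}
```

Splitting on `IFS=': '` turns a line like `HugePages_Total: 1024` into `var=HugePages_Total` and `val=1024`, which is exactly the per-key comparison the trace repeats before each `continue`.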
setup/common.sh@18 -- # local node=0 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.665 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37566592 kB' 'MemUsed: 10550348 kB' 'SwapCached: 0 kB' 'Active: 5345836 kB' 'Inactive: 3372048 kB' 'Active(anon): 5187932 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8481488 kB' 'Mapped: 87588 kB' 'AnonPages: 239524 kB' 'Shmem: 4951536 kB' 'KernelStack: 9016 kB' 'PageTables: 4428 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126424 kB' 'Slab: 337980 kB' 'SReclaimable: 126424 kB' 'SUnreclaim: 211556 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:26.665
[trace condensed: setup/common.sh@31-32 repeats the read/continue loop over each node0 meminfo key (MemTotal through AnonHugePages and beyond) while scanning for HugePages_Surp; trace continues] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc --
setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@20 -- # local mem_f mem 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41140524 kB' 'MemUsed: 3036008 kB' 'SwapCached: 0 kB' 'Active: 1350352 kB' 'Inactive: 84212 kB' 'Active(anon): 1114672 kB' 'Inactive(anon): 0 kB' 'Active(file): 235680 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1150444 kB' 'Mapped: 99388 kB' 'AnonPages: 284184 kB' 'Shmem: 830552 kB' 'KernelStack: 7288 kB' 'PageTables: 4052 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 80264 kB' 'Slab: 202160 kB' 'SReclaimable: 80264 kB' 'SUnreclaim: 121896 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.666 23:59:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.666 23:59:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.666 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:26.667 node0=512 expecting 512 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:26.667 23:59:13 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:26.667 node1=512 expecting 512 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:26.667 00:04:26.667 real 0m3.981s 00:04:26.667 user 0m1.535s 00:04:26.667 sys 0m2.552s 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:26.667 23:59:13 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:26.667 ************************************ 00:04:26.667 END TEST even_2G_alloc 00:04:26.667 ************************************ 00:04:26.667 23:59:13 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:26.667 23:59:13 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:26.667 23:59:13 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:26.667 23:59:13 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:26.667 23:59:13 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:26.667 ************************************ 00:04:26.667 START TEST odd_alloc 00:04:26.667 ************************************ 00:04:26.667 23:59:13 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc 00:04:26.667 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:26.667 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:04:26.667 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # 
get_test_nr_hugepages_per_node 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:04:26.668 23:59:13 
setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:26.668 23:59:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:30.865 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:30.865 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:30.865 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:30.865 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:30.865 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:30.865 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:30.865 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:30.865 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:30.865 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:30.865 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:30.865 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:30.865 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:30.865 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:30.865 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:30.865 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:30.865 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:30.865 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:30.865 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:30.865 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- 
setup/hugepages.sh@91 -- # local sorted_s 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78698004 kB' 'MemAvailable: 81997548 kB' 'Buffers: 12176 kB' 'Cached: 9619824 kB' 'SwapCached: 0 kB' 'Active: 6700000 kB' 'Inactive: 3456260 kB' 'Active(anon): 6306416 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 
3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526996 kB' 'Mapped: 187640 kB' 'Shmem: 5782156 kB' 'KReclaimable: 206688 kB' 'Slab: 540396 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333708 kB' 'KernelStack: 16224 kB' 'PageTables: 8380 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7720140 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.865 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.865 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.865 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:30.866 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78694044 kB' 'MemAvailable: 81993588 kB' 'Buffers: 12176 kB' 'Cached: 9619828 kB' 'SwapCached: 0 kB' 'Active: 6701648 kB' 'Inactive: 3456260 kB' 'Active(anon): 6308064 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 529700 kB' 'Mapped: 187492 kB' 'Shmem: 5782160 kB' 'KReclaimable: 206688 kB' 'Slab: 540372 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333684 kB' 'KernelStack: 16192 kB' 'PageTables: 8272 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7725168 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200908 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.866 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- 
# continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 
23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.867 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78694660 kB' 'MemAvailable: 81994204 kB' 'Buffers: 12176 kB' 'Cached: 9619848 kB' 'SwapCached: 0 kB' 'Active: 6701940 kB' 'Inactive: 
3456260 kB' 'Active(anon): 6308356 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 529524 kB' 'Mapped: 187884 kB' 'Shmem: 5782180 kB' 'KReclaimable: 206688 kB' 'Slab: 540372 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333684 kB' 'KernelStack: 16320 kB' 'PageTables: 8284 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7725180 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201004 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.868 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d 
]] 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.869 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 
23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:04:30.870 nr_hugepages=1025 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:30.870 resv_hugepages=0 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:30.870 surplus_hugepages=0 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:30.870 anon_hugepages=0 00:04:30.870 
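The long runs of `continue` in the trace above all come from one pattern in `setup/common.sh`: `get_meminfo` reads `/proc/meminfo` line by line with `IFS=': '`, skips every field until the requested one (here `HugePages_Surp`, then `HugePages_Rsvd`), and echoes its value. A minimal sketch of that lookup, with an optional source-file parameter added here purely so it can be exercised against a fixture instead of the live `/proc/meminfo`:

```shell
#!/usr/bin/env bash
# Sketch of the field lookup driving the trace above; the real helper is
# get_meminfo in setup/common.sh. The second (file) argument is an addition
# for testability, not part of the original script.
get_meminfo_sketch() {
    local get=$1 src=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        # Every non-matching field produces one "continue" line in the trace
        [[ $var == "$get" ]] || continue
        echo "$val"          # value only; the "kB" unit lands in $_
        return 0
    done < "$src"
    return 1
}
```

With `IFS=': '`, a line such as `HugePages_Total: 1025` splits into `var=HugePages_Total` and `val=1025`, which is why the trace compares each `var` against the glob-escaped target before echoing.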
23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:30.870 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78694932 kB' 'MemAvailable: 81994476 kB' 'Buffers: 12176 kB' 'Cached: 9619848 kB' 'SwapCached: 0 kB' 'Active: 6696224 kB' 'Inactive: 3456260 kB' 'Active(anon): 6302640 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523804 kB' 'Mapped: 186980 kB' 'Shmem: 5782180 kB' 
'KReclaimable: 206688 kB' 'Slab: 540372 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333684 kB' 'KernelStack: 16464 kB' 'PageTables: 8468 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7717588 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201032 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.871 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.871 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [elided: identical compare-and-continue iterations repeat for each remaining /proc/meminfo key, Buffers through Unaccepted, until HugePages_Total is reached]
00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local 
mem_f mem 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37560828 kB' 'MemUsed: 10556112 kB' 'SwapCached: 0 kB' 'Active: 5345840 kB' 'Inactive: 3372048 kB' 'Active(anon): 5187936 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8481488 kB' 'Mapped: 87600 kB' 'AnonPages: 239680 kB' 'Shmem: 4951536 kB' 'KernelStack: 9112 kB' 'PageTables: 4632 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126424 kB' 'Slab: 338016 kB' 'SReclaimable: 126424 kB' 'SUnreclaim: 211592 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:30.872 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [elided: identical compare-and-continue iterations over the node0 meminfo keys while scanning for HugePages_Surp] 00:04:30.873 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.873 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.873 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.873 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.873 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.873 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.873 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.873 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41136012 kB' 'MemUsed: 3040520 
kB' 'SwapCached: 0 kB' 'Active: 1350944 kB' 'Inactive: 84212 kB' 'Active(anon): 1115264 kB' 'Inactive(anon): 0 kB' 'Active(file): 235680 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1150596 kB' 'Mapped: 99388 kB' 'AnonPages: 284632 kB' 'Shmem: 830704 kB' 'KernelStack: 7320 kB' 'PageTables: 4136 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 80264 kB' 'Slab: 202356 kB' 'SReclaimable: 80264 kB' 'SUnreclaim: 122092 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.874 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:30.875 node0=512 expecting 
513 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:30.875 node1=513 expecting 512 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:30.875 00:04:30.875 real 0m3.964s 00:04:30.875 user 0m1.545s 00:04:30.875 sys 0m2.527s 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:30.875 23:59:17 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:30.875 ************************************ 00:04:30.875 END TEST odd_alloc 00:04:30.875 ************************************ 00:04:30.875 23:59:17 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:30.875 23:59:17 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:30.875 23:59:17 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:30.875 23:59:17 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:30.875 23:59:17 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:30.875 ************************************ 00:04:30.875 START TEST custom_alloc 00:04:30.875 ************************************ 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@170 -- # nodes_hp=() 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@83 -- # : 256 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:30.875 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:30.876 23:59:17 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:30.876 23:59:17 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:30.876 23:59:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:34.168 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:34.168 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:34.168 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:34.168 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:34.168 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:34.168 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:34.168 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:34.168 0000:00:04.3 (8086 2021): Already using 
the vfio-pci driver 00:04:34.168 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:34.168 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:34.168 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:34.168 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:34.168 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:34.168 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:34.168 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:34.168 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:34.168 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:34.168 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:34.168 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:34.433 23:59:21 
setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77662956 kB' 'MemAvailable: 80962500 kB' 'Buffers: 12176 kB' 'Cached: 9619972 kB' 'SwapCached: 0 kB' 'Active: 6697968 kB' 'Inactive: 3456260 kB' 'Active(anon): 6304384 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 524864 kB' 'Mapped: 187092 kB' 'Shmem: 5782304 kB' 'KReclaimable: 206688 kB' 'Slab: 540208 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333520 kB' 'KernelStack: 16144 kB' 'PageTables: 8136 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7716940 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 
'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.433 23:59:21 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.433 23:59:21 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.433 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 
23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 
00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.434 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77663652 kB' 'MemAvailable: 80963196 kB' 'Buffers: 12176 kB' 'Cached: 9619976 kB' 'SwapCached: 0 kB' 'Active: 6697140 kB' 'Inactive: 3456260 kB' 'Active(anon): 6303556 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 524556 kB' 'Mapped: 187004 kB' 'Shmem: 5782308 kB' 'KReclaimable: 206688 kB' 'Slab: 540204 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333516 kB' 'KernelStack: 16144 kB' 'PageTables: 8128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7716956 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.435 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.435 23:59:21 [... identical IFS=': ' / read -r var val _ / [[ key == HugePages_Surp ]] / continue iterations repeated for the remaining /proc/meminfo keys (SUnreclaim through HugePages_Free), none matching ...] 00:04:34.436 23:59:21
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 
00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77662972 kB' 'MemAvailable: 80962516 kB' 'Buffers: 12176 kB' 'Cached: 9619992 kB' 'SwapCached: 0 kB' 'Active: 6697164 kB' 'Inactive: 3456260 kB' 'Active(anon): 6303580 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 524560 kB' 'Mapped: 187004 kB' 'Shmem: 5782324 kB' 'KReclaimable: 206688 kB' 'Slab: 540204 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333516 kB' 'KernelStack: 16144 kB' 'PageTables: 8128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7727776 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.436 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.436 23:59:21 
[... identical IFS=': ' / read -r var val _ / [[ key == HugePages_Rsvd ]] / continue iterations repeated for each remaining /proc/meminfo key (MemFree through HugePages_Free), none matching ...] 00:04:34.438 23:59:21
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:34.438 nr_hugepages=1536 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:34.438 resv_hugepages=0 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:34.438 surplus_hugepages=0 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:34.438 anon_hugepages=0 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:34.438 23:59:21 
setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77659444 kB' 'MemAvailable: 80958988 kB' 'Buffers: 12176 kB' 'Cached: 9620016 kB' 'SwapCached: 0 kB' 'Active: 6697440 kB' 'Inactive: 3456260 kB' 'Active(anon): 6303856 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 524892 kB' 'Mapped: 187004 kB' 'Shmem: 5782348 kB' 'KReclaimable: 206688 kB' 'Slab: 540204 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333516 kB' 'KernelStack: 16144 kB' 'PageTables: 8104 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7717760 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 
1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.438 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 
23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.439 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.440 23:59:21 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:34.440 23:59:21 
setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37566704 kB' 'MemUsed: 10550236 kB' 'SwapCached: 0 kB' 'Active: 5345356 kB' 'Inactive: 3372048 kB' 'Active(anon): 5187452 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8481488 kB' 'Mapped: 87616 kB' 'AnonPages: 239188 kB' 'Shmem: 4951536 kB' 'KernelStack: 8904 kB' 'PageTables: 3924 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126424 kB' 'Slab: 337996 kB' 'SReclaimable: 126424 kB' 'SUnreclaim: 211572 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.440 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.701 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.702 
23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.702 23:59:21 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
continue
00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
[... identical setup/common.sh@31-32 IFS/read/continue xtrace repeated for every remaining meminfo field (Shmem ... HugePages_Free) ...]
00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:34.702 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:34.703 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:34.703 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:34.703 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1
00:04:34.703 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:04:34.703 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:34.703 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:34.703 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:34.703 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:34.703 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:34.703 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:34.703 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:34.703 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:34.703 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 40093188 kB' 'MemUsed: 4083344 kB' 'SwapCached: 0 kB' 'Active: 1352056 kB' 'Inactive: 84212 kB' 'Active(anon): 1116376 kB' 'Inactive(anon): 0 kB' 'Active(file): 235680 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1150752 kB' 'Mapped: 99388 kB' 'AnonPages: 285492 kB' 'Shmem: 830860 kB' 'KernelStack: 7272 kB' 'PageTables: 3952 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 80264 kB' 'Slab: 202208 kB' 'SReclaimable: 80264 kB' 'SUnreclaim: 121944 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:34.703 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:34.703 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
[... identical setup/common.sh@31-32 IFS/read/continue xtrace repeated for every remaining meminfo field (MemFree ... HugePages_Free) ...]
00:04:34.704 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:34.704 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:04:34.704 23:59:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:04:34.704 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:34.704 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:34.704 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:34.704 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:34.704 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:34.704 node0=512 expecting 512
00:04:34.704 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:34.704 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:34.704 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:34.704 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:04:34.704 node1=1024 expecting 1024
00:04:34.704 23:59:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:04:34.704
00:04:34.704 real 0m3.892s
00:04:34.704 user 0m1.444s
00:04:34.704 sys 0m2.548s
00:04:34.704 23:59:21 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable
00:04:34.704 23:59:21 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:04:34.704 ************************************
00:04:34.704 END TEST custom_alloc
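Most of the xtrace above is setup/common.sh's get_meminfo helper scanning a meminfo file field by field and returning 0 for the requested counter. As a minimal standalone sketch of that behavior (an illustration only, not SPDK's helper verbatim; the sed/awk pipeline here is this note's own simplification of the traced mapfile/read loop):

```shell
#!/usr/bin/env bash
# Sketch of the traced get_meminfo: print one field's numeric value from
# /proc/meminfo, or from the per-node sysfs copy when a node id is given.
# Prints 0 when the field is absent (the "echo 0" seen in the trace).
get_meminfo() {
	local get=$1 node=$2
	local mem_f=/proc/meminfo
	if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi
	# Per-node files prefix every line with "Node N "; drop that prefix,
	# then match the requested field and keep only its numeric value.
	sed 's/^Node [0-9]* //' "$mem_f" |
		awk -F': *' -v f="$get" '$1 == f { print $2 + 0; found = 1 }
		                         END { if (!found) print 0 }'
}

get_meminfo HugePages_Total    # huge pages on the whole system
get_meminfo HugePages_Surp 1   # surplus huge pages on NUMA node 1
```

The second call mirrors the `get_meminfo HugePages_Surp 1` invocation traced at setup/hugepages.sh@117 above, which reads /sys/devices/system/node/node1/meminfo rather than /proc/meminfo.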
************************************
00:04:34.704 23:59:21 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0
00:04:34.704 23:59:21 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:34.704 23:59:21 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:04:34.704 23:59:21 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:34.704 23:59:21 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:34.704 ************************************
00:04:34.704 START TEST no_shrink_alloc
00:04:34.704 ************************************
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:34.704 23:59:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:04:38.901 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:04:38.901 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:04:38.901 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:38.901 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver
00:04:38.901 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:38.901 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:38.901 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:38.901 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:38.901 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:38.901 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:38.901 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:38.901 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:38.901 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:38.901 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:38.901 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:38.901 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:38.901 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:38.901 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:38.901 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78703260 kB' 'MemAvailable: 82002804 kB' 'Buffers: 12176 kB' 'Cached: 9620132 kB' 'SwapCached: 0 kB' 'Active: 6698636 kB' 'Inactive: 3456260 kB' 'Active(anon): 6305052 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525348 kB' 'Mapped: 187180 kB' 'Shmem: 5782464 kB' 'KReclaimable: 206688 kB' 'Slab: 540416 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333728 kB' 'KernelStack: 16160 kB' 'PageTables: 8204 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7717644 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB'
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[... identical setup/common.sh@31-32 IFS/read/continue xtrace repeated for each subsequent meminfo field ...]
00:04:38.901 23:59:25
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.901 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 
23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:38.902 23:59:25 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78705320 kB' 'MemAvailable: 82004864 kB' 'Buffers: 12176 kB' 'Cached: 9620136 kB' 'SwapCached: 0 kB' 'Active: 6697888 kB' 'Inactive: 3456260 kB' 'Active(anon): 6304304 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525056 kB' 'Mapped: 187052 kB' 'Shmem: 5782468 kB' 'KReclaimable: 206688 kB' 'Slab: 540336 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333648 kB' 'KernelStack: 16144 kB' 'PageTables: 8136 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7717664 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:38.902 
23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 
23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:38.902 23:59:25 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78705756 kB' 'MemAvailable: 82005300 kB' 'Buffers: 12176 kB' 'Cached: 9620152 kB' 'SwapCached: 0 kB' 'Active: 6697904 kB' 'Inactive: 3456260 kB' 'Active(anon): 6304320 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525060 kB' 'Mapped: 187052 kB' 'Shmem: 5782484 kB' 'KReclaimable: 206688 kB' 'Slab: 540336 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333648 kB' 'KernelStack: 16144 kB' 'PageTables: 8136 kB' 'SecPageTables: 0 kB' 
'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7717684 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.902 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 
23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@33 -- # return 0 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:38.903 nr_hugepages=1024 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:38.903 resv_hugepages=0 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:38.903 surplus_hugepages=0 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:38.903 anon_hugepages=0 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node 
+([0-9]) }") 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78704748 kB' 'MemAvailable: 82004292 kB' 'Buffers: 12176 kB' 'Cached: 9620180 kB' 'SwapCached: 0 kB' 'Active: 6697984 kB' 'Inactive: 3456260 kB' 'Active(anon): 6304400 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525096 kB' 'Mapped: 187052 kB' 'Shmem: 5782512 kB' 'KReclaimable: 206688 kB' 'Slab: 540336 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333648 kB' 'KernelStack: 16160 kB' 'PageTables: 8188 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7717708 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:38.903 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[trace condensed: the setup/common.sh@32 test, @32 continue, @31 IFS=': ', @31 read -r var val _ cycle repeats identically for every remaining /proc/meminfo key (MemAvailable through Unaccepted) until HugePages_Total is reached]
00:04:38.904 23:59:25
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36520544 kB' 'MemUsed: 11596396 kB' 'SwapCached: 0 kB' 'Active: 5344656 kB' 'Inactive: 3372048 kB' 'Active(anon): 5186752 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8481488 kB' 'Mapped: 87664 kB' 'AnonPages: 238296 kB' 'Shmem: 4951536 kB' 'KernelStack: 8824 kB' 'PageTables: 3936 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126424 kB' 'Slab: 338092 kB' 'SReclaimable: 126424 kB' 'SUnreclaim: 211668 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32
-- # continue
[trace condensed: the same setup/common.sh@32 test / continue / @31 read cycle repeats for each node0 meminfo key (MemFree through WritebackTmp) against HugePages_Surp]
00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- #
[[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.904 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.905 23:59:25 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:38.905 node0=1024 expecting 1024 00:04:38.905 23:59:25 
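The loop traced above is setup/common.sh's `get_meminfo` (@31-33): it splits each /proc/meminfo line on `IFS=': '`, `continue`s past every key that does not match the requested one, and finally echoes the value (0 here for HugePages_Surp). A minimal self-contained sketch of that same pattern, not the SPDK script itself, using a trimmed sample in place of the live /proc/meminfo:

```shell
# Sketch of the setup/common.sh@31-33 scan: split on ': ', compare the
# key, echo the matching value, fall back to 0 when the key is absent.
get_meminfo_field() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    echo 0
}

# Trimmed stand-in for /proc/meminfo so the example is deterministic.
sample='MemTotal: 92293472 kB
HugePages_Total: 1024
HugePages_Free: 1024
HugePages_Surp: 0'

surp=$(printf '%s\n' "$sample" | get_meminfo_field HugePages_Surp)
total=$(printf '%s\n' "$sample" | get_meminfo_field HugePages_Total)
```

With HugePages_Surp=0, hugepages.sh@117 adds nothing to `nodes_test[node]`, which is why the scan ends in `node0=1024 expecting 1024`.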
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:38.905 23:59:25 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:42.190 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:42.190 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:42.190 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:42.190 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:42.190 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:42.190 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:42.190 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:42.190 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:42.190 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:42.190 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:42.190 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:42.190 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:42.190 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:42.190 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:42.190 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:42.190 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:42.190 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:42.190 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 
00:04:42.190 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:42.190 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- 
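The INFO line above is setup.sh comparing the NRHUGE=512 request against the existing per-node reservation with CLEAR_HUGE=no, and keeping the larger reservation. A hedged sketch of that decision with this run's values hard-coded (the real script reads the count from the per-node sysfs hugepages files rather than taking it as an argument):

```shell
# Sketch: setup.sh's "already allocated" decision, with the values seen
# in this run (512 requested, 1024 reserved on node0) passed in directly.
check_hugepages() {
    local node=$1 want=$2 have=$3
    if (( have >= want )); then
        echo "Requested ${want} hugepages but ${have} already allocated on node${node}"
    else
        echo "node${node}: need $(( want - have )) more hugepages"
    fi
}

msg=$(check_hugepages 0 512 1024)
echo "$msg"
```

Because 1024 >= 512, the run falls through to `verify_nr_hugepages` (hugepages.sh@204) instead of reallocating.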
setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78704620 kB' 'MemAvailable: 82004164 kB' 'Buffers: 12176 kB' 'Cached: 9620260 kB' 'SwapCached: 0 kB' 'Active: 6699152 kB' 'Inactive: 3456260 kB' 'Active(anon): 6305568 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525732 kB' 'Mapped: 187168 kB' 'Shmem: 5782592 kB' 'KReclaimable: 206688 kB' 'Slab: 540556 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333868 kB' 'KernelStack: 16144 kB' 'PageTables: 8156 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7718160 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.495 23:59:29 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.495 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:42.496 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 
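The `anon=0` assignment above comes from hugepages.sh@96-97: AnonHugePages is only read from meminfo because transparent hugepages were not disabled, which the guard checks by pattern-matching the bracketed mode in `transparent_hugepage/enabled` (shown earlier as `always [madvise] never`). A sketch of that guard on a sample string rather than the live sysfs file:

```shell
# Sketch of the hugepages.sh@96 guard: the enabled file reads like
# "always [madvise] never" with the active mode in brackets, so matching
# the literal substring "[never]" detects THP being switched off.
thp='always [madvise] never'   # sample content, not read from sysfs here
thp_active=no
if [[ $thp != *"[never]"* ]]; then
    thp_active=yes
    echo "THP not disabled; AnonHugePages counter is meaningful"
fi
```

Here the active mode is `[madvise]`, so the check passes and the subsequent `get_meminfo AnonHugePages` scan runs, returning 0 kB.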
'MemTotal: 92293472 kB' 'MemFree: 78705612 kB' 'MemAvailable: 82005156 kB' 'Buffers: 12176 kB' 'Cached: 9620260 kB' 'SwapCached: 0 kB' 'Active: 6698868 kB' 'Inactive: 3456260 kB' 'Active(anon): 6305284 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525976 kB' 'Mapped: 187524 kB' 'Shmem: 5782592 kB' 'KReclaimable: 206688 kB' 'Slab: 540528 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333840 kB' 'KernelStack: 16160 kB' 'PageTables: 8200 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7719268 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200888 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.497 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:42.498 23:59:29 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78701832 kB' 'MemAvailable: 82001376 kB' 'Buffers: 12176 kB' 'Cached: 9620280 kB' 'SwapCached: 0 kB' 'Active: 6702512 kB' 'Inactive: 3456260 kB' 'Active(anon): 6308928 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 529048 kB' 'Mapped: 187524 kB' 'Shmem: 5782612 kB' 'KReclaimable: 206688 kB' 'Slab: 540528 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333840 kB' 'KernelStack: 16144 kB' 'PageTables: 8144 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7722988 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200872 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 
2097152 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.498 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.499 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:42.500 nr_hugepages=1024 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:42.500 resv_hugepages=0 00:04:42.500 23:59:29 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:42.500 surplus_hugepages=0 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:42.500 anon_hugepages=0 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78698008 kB' 'MemAvailable: 81997552 kB' 'Buffers: 12176 kB' 'Cached: 9620280 kB' 'SwapCached: 0 
kB' 'Active: 6698900 kB' 'Inactive: 3456260 kB' 'Active(anon): 6305316 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525984 kB' 'Mapped: 187432 kB' 'Shmem: 5782612 kB' 'KReclaimable: 206688 kB' 'Slab: 540528 kB' 'SReclaimable: 206688 kB' 'SUnreclaim: 333840 kB' 'KernelStack: 16176 kB' 'PageTables: 8304 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7753704 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200872 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1037732 kB' 'DirectMap2M: 18561024 kB' 'DirectMap1G: 81788928 kB' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.500 23:59:29 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.500 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 
23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
[[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.501 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.502 23:59:29 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:42.502 
23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # 
mem_f=/sys/devices/system/node/node0/meminfo 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36528840 kB' 'MemUsed: 11588100 kB' 'SwapCached: 0 kB' 'Active: 5345236 kB' 'Inactive: 3372048 kB' 'Active(anon): 5187332 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8481492 kB' 'Mapped: 87632 kB' 'AnonPages: 238992 kB' 'Shmem: 4951540 kB' 'KernelStack: 8872 kB' 'PageTables: 4128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126424 kB' 'Slab: 338228 kB' 'SReclaimable: 126424 kB' 'SUnreclaim: 211804 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.502 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
[[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:42.503 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:42.504 node0=1024 expecting 1024 00:04:42.504 23:59:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:42.504 00:04:42.504 real 0m7.828s 00:04:42.504 user 0m3.071s 00:04:42.504 sys 0m4.964s 00:04:42.504 23:59:29 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:42.504 23:59:29 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:42.504 
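The trace above shows setup/common.sh's `get_meminfo` walking a meminfo file field by field with `IFS=': '` and `read -r var val _` until the requested key matches, then echoing its value. A minimal standalone sketch of that parsing technique (the function name follows the trace; the sample meminfo text is invented for illustration, not taken from this run):

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo parsing traced above: split each
# "Key: value" line on ':' and whitespace, and emit the value for
# the requested key. Sample data below is illustrative only.
get_meminfo() {
	local get=$1 var val _
	while IFS=': ' read -r var val _; do
		if [[ $var == "$get" ]]; then
			echo "$val"
			return 0
		fi
	done
	return 1
}

sample='MemTotal: 48116940 kB
HugePages_Total: 1024
HugePages_Free: 1024
HugePages_Surp: 0'

get_meminfo HugePages_Surp <<< "$sample"   # prints 0
```

In the real helper the input is `/proc/meminfo` or `/sys/devices/system/node/node<N>/meminfo`, which is why the trace above repeats the `[[ key == pattern ]] / continue` pair once per meminfo field.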
************************************ 00:04:42.504 END TEST no_shrink_alloc 00:04:42.504 ************************************ 00:04:42.504 23:59:29 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:42.504 23:59:29 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:04:42.504 23:59:29 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:42.504 23:59:29 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:42.504 23:59:29 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:42.504 23:59:29 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:42.504 23:59:29 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:42.504 23:59:29 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:42.504 23:59:29 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:42.504 23:59:29 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:42.504 23:59:29 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:42.504 23:59:29 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:42.504 23:59:29 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:42.504 23:59:29 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:42.504 23:59:29 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:42.504 00:04:42.504 real 0m30.954s 00:04:42.504 user 0m11.013s 00:04:42.504 sys 0m18.349s 00:04:42.504 23:59:29 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:42.504 23:59:29 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:42.504 ************************************ 00:04:42.504 END TEST hugepages 
00:04:42.504 ************************************ 00:04:42.763 23:59:29 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:42.763 23:59:29 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:42.763 23:59:29 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:42.763 23:59:29 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:42.763 23:59:29 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:42.763 ************************************ 00:04:42.763 START TEST driver 00:04:42.763 ************************************ 00:04:42.763 23:59:29 setup.sh.driver -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:42.763 * Looking for test storage... 00:04:42.763 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:42.763 23:59:29 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:04:42.763 23:59:29 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:42.763 23:59:29 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:48.043 23:59:34 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:48.043 23:59:34 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:48.043 23:59:34 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:48.043 23:59:34 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:48.043 ************************************ 00:04:48.043 START TEST guess_driver 00:04:48.043 ************************************ 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:48.043 23:59:34 
setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_groups 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 216 > 0 )) 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:48.043 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:48.043 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:48.043 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:48.043 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:48.043 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:48.043 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:48.043 
insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:48.043 Looking for driver=vfio-pci 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:04:48.043 23:59:34 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:52.234 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:04:52.234 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read 
-r _ _ _ _ marker setup_driver 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> 
== \-\> ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 
00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:52.235 23:59:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:54.773 23:59:41 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:54.773 23:59:41 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:54.773 23:59:41 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:54.773 23:59:41 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:54.773 23:59:41 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:04:54.773 23:59:41 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:54.773 23:59:41 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:00.048 00:05:00.048 real 0m11.639s 00:05:00.048 user 0m3.015s 00:05:00.048 sys 0m5.566s 00:05:00.048 23:59:46 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:00.048 23:59:46 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:00.048 ************************************ 00:05:00.048 END TEST guess_driver 00:05:00.048 ************************************ 00:05:00.048 23:59:46 setup.sh.driver -- common/autotest_common.sh@1142 -- # 
return 0 00:05:00.048 00:05:00.048 real 0m17.039s 00:05:00.048 user 0m4.547s 00:05:00.048 sys 0m8.638s 00:05:00.048 23:59:46 setup.sh.driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:00.048 23:59:46 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:00.048 ************************************ 00:05:00.048 END TEST driver 00:05:00.048 ************************************ 00:05:00.048 23:59:46 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:00.048 23:59:46 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:00.048 23:59:46 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:00.048 23:59:46 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:00.048 23:59:46 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:00.048 ************************************ 00:05:00.049 START TEST devices 00:05:00.049 ************************************ 00:05:00.049 23:59:46 setup.sh.devices -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:00.049 * Looking for test storage... 
00:05:00.049 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:00.049 23:59:46 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:00.049 23:59:46 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:00.049 23:59:46 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:00.049 23:59:46 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:04.243 23:59:50 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:04.243 23:59:50 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:05:04.243 23:59:50 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:05:04.243 23:59:50 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:05:04.243 23:59:50 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:05:04.243 23:59:50 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:05:04.243 23:59:50 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:05:04.243 23:59:50 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:04.243 23:59:50 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:05:04.243 23:59:50 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:04.243 23:59:50 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:04.243 23:59:50 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:04.243 23:59:50 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:04.243 23:59:50 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:04.243 23:59:50 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:04.243 23:59:50 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 
00:05:04.243 23:59:50 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:04.243 23:59:50 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:05:04.243 23:59:50 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:05:04.243 23:59:50 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:04.243 23:59:50 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:05:04.243 23:59:50 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:04.243 No valid GPT data, bailing 00:05:04.243 23:59:50 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:04.243 23:59:50 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:05:04.243 23:59:50 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:05:04.243 23:59:50 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:04.243 23:59:50 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:04.243 23:59:50 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:04.243 23:59:50 setup.sh.devices -- setup/common.sh@80 -- # echo 7681501126656 00:05:04.243 23:59:50 setup.sh.devices -- setup/devices.sh@204 -- # (( 7681501126656 >= min_disk_size )) 00:05:04.243 23:59:50 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:04.243 23:59:50 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:05:04.243 23:59:50 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:04.243 23:59:50 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:04.243 23:59:50 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:04.243 23:59:50 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:04.243 23:59:50 setup.sh.devices -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:05:04.243 23:59:50 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:04.243 ************************************ 00:05:04.243 START TEST nvme_mount 00:05:04.243 ************************************ 00:05:04.243 23:59:50 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount 00:05:04.243 23:59:50 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:04.243 23:59:50 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:04.243 23:59:50 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:04.243 23:59:50 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:04.243 23:59:50 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:04.243 23:59:50 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:04.243 23:59:50 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:04.243 23:59:50 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:04.243 23:59:50 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:04.243 23:59:50 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:05:04.243 23:59:50 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:04.243 23:59:50 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:04.243 23:59:50 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:04.243 23:59:50 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:04.243 23:59:50 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:04.243 23:59:50 
setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:04.243 23:59:50 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:04.243 23:59:50 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:04.243 23:59:50 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:05.179 Creating new GPT entries in memory. 00:05:05.179 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:05.179 other utilities. 00:05:05.179 23:59:51 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:05.179 23:59:51 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:05.179 23:59:51 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:05.179 23:59:51 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:05.179 23:59:51 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:06.114 Creating new GPT entries in memory. 00:05:06.114 The operation has completed successfully. 
00:05:06.114 23:59:52 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:06.114 23:59:52 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:06.114 23:59:52 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 3430401 00:05:06.114 23:59:52 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:06.114 23:59:52 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:06.114 23:59:52 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:06.114 23:59:52 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:06.114 23:59:52 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:06.114 23:59:53 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:06.114 23:59:53 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:06.114 23:59:53 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:06.114 23:59:53 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:06.114 23:59:53 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:06.114 23:59:53 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:06.114 
23:59:53 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:06.114 23:59:53 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:06.114 23:59:53 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:06.114 23:59:53 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:06.373 23:59:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.374 23:59:53 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:06.374 23:59:53 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:06.374 23:59:53 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:06.374 23:59:53 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount 
-- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:09.664 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:09.664 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:10.020 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:10.020 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:05:10.020 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:10.020 
/dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:10.020 23:59:56 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount 
-- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:14.215 00:00:00 
setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:14.215 00:00:00 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:14.215 00:00:00 
setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:17.498 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.498 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:17.498 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:17.498 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.498 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.498 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.498 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:17.499 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.758 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:17.758 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:17.758 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:17.758 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:17.758 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:17.758 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:17.758 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:17.758 00:00:04 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all 
/dev/nvme0n1 00:05:17.758 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:17.758 00:05:17.758 real 0m13.578s 00:05:17.758 user 0m3.991s 00:05:17.758 sys 0m7.553s 00:05:17.758 00:00:04 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:17.758 00:00:04 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:17.758 ************************************ 00:05:17.758 END TEST nvme_mount 00:05:17.758 ************************************ 00:05:17.758 00:00:04 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:17.758 00:00:04 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:17.758 00:00:04 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:17.758 00:00:04 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:17.758 00:00:04 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:17.758 ************************************ 00:05:17.758 START TEST dm_mount 00:05:17.758 ************************************ 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:17.758 00:00:04 
setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:17.758 00:00:04 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:18.696 Creating new GPT entries in memory. 00:05:18.696 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:18.696 other utilities. 00:05:18.696 00:00:05 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:18.696 00:00:05 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:18.696 00:00:05 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 
2048 : part_end + 1 )) 00:05:18.696 00:00:05 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:18.696 00:00:05 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:20.074 Creating new GPT entries in memory. 00:05:20.074 The operation has completed successfully. 00:05:20.074 00:00:06 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:20.074 00:00:06 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:20.074 00:00:06 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:20.074 00:00:06 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:20.074 00:00:06 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:21.010 The operation has completed successfully. 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 3434781 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e 
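The partition bounds in the two `sgdisk --new` calls above (1:2048:2099199 and 2:2099200:4196351) follow from the arithmetic traced in setup/common.sh@57-60: a 1 GiB size converted to 512-byte sectors, a first partition starting at sector 2048, and each later partition starting one sector past the previous end. A standalone sketch of that loop (names mirror the trace, but this is an illustration, not the SPDK script; it only builds the argument strings and does not touch a disk):

```shell
# 1 GiB in 512-byte sectors, two partitions, as in the trace above.
size=$(( 1073741824 / 512 ))   # 2097152 sectors
part_no=2
part_start=0
part_end=0
args=()
for (( part = 1; part <= part_no; part++ )); do
    # First partition starts at sector 2048; later ones follow the previous end.
    (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
    (( part_end = part_start + size - 1 ))
    # In the real run this is passed to: flock /dev/nvme0n1 sgdisk /dev/nvme0n1 ...
    args+=( "--new=$part:$part_start:$part_end" )
done
printf '%s\n' "${args[@]}"
# --new=1:2048:2099199
# --new=2:2099200:4196351
```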
/dev/mapper/nvme_dm_test ]] 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- 
setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:21.010 00:00:07 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local 
found=0 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:25.199 00:00:11 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:28.496 00:00:15 
setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:28.496 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:28.496 00:05:28.496 real 0m10.706s 00:05:28.496 user 0m2.830s 00:05:28.496 sys 0m5.001s 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:28.496 00:00:15 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:28.496 ************************************ 00:05:28.496 END TEST dm_mount 00:05:28.496 ************************************ 00:05:28.496 00:00:15 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:28.496 00:00:15 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:28.496 00:00:15 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:28.496 00:00:15 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:28.496 00:00:15 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:28.496 00:00:15 setup.sh.devices -- setup/devices.sh@25 -- # wipefs 
--all /dev/nvme0n1p1 00:05:28.496 00:00:15 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:28.496 00:00:15 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:28.756 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:28.756 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:05:28.756 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:28.756 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:28.756 00:00:15 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:05:28.756 00:00:15 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:28.756 00:00:15 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:28.756 00:00:15 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:28.756 00:00:15 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:28.756 00:00:15 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:28.756 00:00:15 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:28.756 00:05:28.756 real 0m29.008s 00:05:28.756 user 0m8.434s 00:05:28.756 sys 0m15.607s 00:05:28.756 00:00:15 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:28.756 00:00:15 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:28.756 ************************************ 00:05:28.756 END TEST devices 00:05:28.756 ************************************ 00:05:28.756 00:00:15 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:28.756 00:05:28.756 real 1m45.602s 00:05:28.756 user 0m32.888s 00:05:28.756 sys 0m59.441s 00:05:28.756 00:00:15 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:28.756 00:00:15 setup.sh -- common/autotest_common.sh@10 -- # set +x 
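The hex bytes wipefs reports above are the on-disk signatures it erased: `45 46 49 20 50 41 52 54` is the ASCII string "EFI PART" (the primary GPT header at offset 0x200 and the backup header near the end of the disk), and `55 aa` at offset 0x1fe is the boot signature of the protective MBR. Decoding them without touching any device:

```shell
# The 8-byte GPT header signature wipefs erased, decoded to ASCII.
gpt_sig=$(printf '\x45\x46\x49\x20\x50\x41\x52\x54')
echo "$gpt_sig"   # EFI PART

# The 2-byte protective-MBR boot signature at offset 0x1fe.
pmbr=$(printf '%02x%02x' 0x55 0xaa)
echo "$pmbr"      # 55aa
```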
00:05:28.756 ************************************ 00:05:28.756 END TEST setup.sh 00:05:28.756 ************************************ 00:05:28.756 00:00:15 -- common/autotest_common.sh@1142 -- # return 0 00:05:28.756 00:00:15 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:05:32.944 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:32.944 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:32.944 Hugepages 00:05:32.944 node hugesize free / total 00:05:32.944 node0 1048576kB 0 / 0 00:05:32.944 node0 2048kB 1024 / 1024 00:05:32.944 node1 1048576kB 0 / 0 00:05:32.944 node1 2048kB 1024 / 1024 00:05:32.944 00:05:32.944 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:32.944 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:32.944 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:32.944 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:32.944 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:32.944 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:32.944 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:32.944 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:32.944 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:32.944 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:05:32.944 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:32.944 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:32.944 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:32.944 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:32.944 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:32.944 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:32.944 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:32.944 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:32.944 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:05:32.944 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - - 00:05:32.944 00:00:19 -- spdk/autotest.sh@130 -- # uname -s 00:05:32.944 00:00:19 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 
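The `setup.sh status` table that follows lists one device per line (type, BDF, vendor, device, NUMA node, driver, block devices). A sketch of pulling the NVMe controller's BDF out of such a line; the line text is copied from the log, and the awk field positions are an assumption about this whitespace-separated table:

```shell
# One row from the status table above; field 2 is the PCI address (BDF).
line='NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1'
bdf=$(printf '%s\n' "$line" | awk '$1 == "NVMe" {print $2}')
echo "$bdf"   # 0000:5e:00.0
```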
00:05:32.944 00:00:19 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:05:32.944 00:00:19 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:36.320 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:36.320 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:36.320 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:36.320 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:36.578 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:36.578 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:36.578 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:36.578 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:36.578 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:36.578 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:36.578 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:36.578 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:36.578 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:36.578 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:36.837 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:36.837 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:36.837 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:36.837 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:39.373 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:05:39.373 00:00:26 -- common/autotest_common.sh@1532 -- # sleep 1 00:05:40.307 00:00:27 -- common/autotest_common.sh@1533 -- # bdfs=() 00:05:40.307 00:00:27 -- common/autotest_common.sh@1533 -- # local bdfs 00:05:40.307 00:00:27 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:05:40.307 00:00:27 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:05:40.307 00:00:27 -- common/autotest_common.sh@1513 -- # bdfs=() 00:05:40.307 00:00:27 -- common/autotest_common.sh@1513 -- # local bdfs 00:05:40.307 00:00:27 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r 
'.config[].params.traddr')) 00:05:40.307 00:00:27 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:40.307 00:00:27 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:05:40.307 00:00:27 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:05:40.307 00:00:27 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:05:40.307 00:00:27 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:44.492 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:44.492 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:44.492 Waiting for block devices as requested 00:05:44.492 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:05:44.492 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:44.492 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:44.492 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:44.492 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:44.492 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:44.492 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:44.751 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:44.751 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:44.751 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:45.009 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:45.009 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:45.009 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:45.268 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:45.268 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:45.268 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:45.526 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:45.526 00:00:32 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:05:45.526 00:00:32 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:05:45.526 00:00:32 -- 
common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:05:45.526 00:00:32 -- common/autotest_common.sh@1502 -- # grep 0000:5e:00.0/nvme/nvme 00:05:45.526 00:00:32 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:45.526 00:00:32 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:05:45.526 00:00:32 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:45.526 00:00:32 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:05:45.526 00:00:32 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:05:45.526 00:00:32 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:05:45.526 00:00:32 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:05:45.526 00:00:32 -- common/autotest_common.sh@1545 -- # grep oacs 00:05:45.526 00:00:32 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:05:45.526 00:00:32 -- common/autotest_common.sh@1545 -- # oacs=' 0x3f' 00:05:45.526 00:00:32 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:05:45.526 00:00:32 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:05:45.526 00:00:32 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:05:45.526 00:00:32 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:05:45.526 00:00:32 -- common/autotest_common.sh@1554 -- # cut -d: -f2 00:05:45.526 00:00:32 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:05:45.526 00:00:32 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:05:45.526 00:00:32 -- common/autotest_common.sh@1557 -- # continue 00:05:45.526 00:00:32 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:05:45.526 00:00:32 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:45.526 00:00:32 -- common/autotest_common.sh@10 -- # set +x 00:05:45.526 00:00:32 -- spdk/autotest.sh@138 -- # timing_enter afterboot 
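The trace above pulls the OACS word and the unused capacity out of `nvme id-ctrl` output with `grep`/`cut`, then masks bit 3 to decide whether the controller supports namespace management before reverting. A minimal sketch of that parsing, run against a canned id-ctrl line rather than real hardware (the `/dev/nvme0` device and the exact nvme-cli field layout are assumptions):

```shell
#!/usr/bin/env bash
# Parse the OACS field the way the trace does: grep the line, take the value
# after the colon, then mask bit 3 (Namespace Management support).
# On hardware this would read: nvme id-ctrl /dev/nvme0
sample_id_ctrl='oacs      : 0x3f'     # canned line standing in for nvme-cli output

oacs=$(printf '%s\n' "$sample_id_ctrl" | grep oacs | cut -d: -f2)
oacs_ns_manage=$(( oacs & 0x8 ))      # nonzero -> controller supports ns-manage

if [[ $oacs_ns_manage -ne 0 ]]; then
    echo "namespace management supported (oacs=$oacs)"
fi
```

With `oacs=0x3f` the mask yields 8, matching the `oacs_ns_manage=8` value in the trace; the script then checks `unvmcap` the same way and skips the revert when it is already 0.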
00:05:45.526 00:00:32 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:45.526 00:00:32 -- common/autotest_common.sh@10 -- # set +x 00:05:45.526 00:00:32 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:49.712 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:49.712 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:49.712 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:49.712 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:49.712 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:49.712 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:49.712 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:49.712 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:49.712 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:49.712 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:49.712 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:49.712 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:49.712 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:49.712 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:49.712 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:49.712 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:49.712 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:49.712 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:52.246 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:05:52.246 00:00:38 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:05:52.246 00:00:38 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:52.246 00:00:38 -- common/autotest_common.sh@10 -- # set +x 00:05:52.246 00:00:38 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:05:52.246 00:00:38 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:05:52.246 00:00:38 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:05:52.246 00:00:38 -- common/autotest_common.sh@1577 -- # bdfs=() 00:05:52.246 00:00:38 -- 
common/autotest_common.sh@1577 -- # local bdfs 00:05:52.246 00:00:38 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:05:52.246 00:00:38 -- common/autotest_common.sh@1513 -- # bdfs=() 00:05:52.246 00:00:38 -- common/autotest_common.sh@1513 -- # local bdfs 00:05:52.246 00:00:38 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:52.246 00:00:38 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:52.246 00:00:38 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:05:52.246 00:00:38 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:05:52.246 00:00:38 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:05:52.246 00:00:38 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:05:52.246 00:00:38 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:05:52.246 00:00:38 -- common/autotest_common.sh@1580 -- # device=0x0b60 00:05:52.246 00:00:38 -- common/autotest_common.sh@1581 -- # [[ 0x0b60 == \0\x\0\a\5\4 ]] 00:05:52.246 00:00:38 -- common/autotest_common.sh@1586 -- # printf '%s\n' 00:05:52.246 00:00:38 -- common/autotest_common.sh@1592 -- # [[ -z '' ]] 00:05:52.246 00:00:38 -- common/autotest_common.sh@1593 -- # return 0 00:05:52.246 00:00:38 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:05:52.246 00:00:38 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:05:52.246 00:00:38 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:05:52.246 00:00:38 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:05:52.246 00:00:38 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:05:53.183 Restarting all devices. 
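The opal revert step above filters controllers by PCI device ID: it reads each bdf's `device` file from sysfs and keeps only `0x0a54` parts, so the `0x0b60` controller on this node is skipped. A rough sketch of that filter, using a throwaway directory in place of `/sys/bus/pci/devices` (an assumption; the real script reads sysfs directly):

```shell
#!/usr/bin/env bash
# Mimic get_nvme_bdfs_by_id: keep only controllers whose PCI device ID matches.
sysfs=$(mktemp -d)
mkdir -p "$sysfs/0000:5e:00.0"
echo 0x0b60 > "$sysfs/0000:5e:00.0/device"   # the device ID seen in the log

matches=()
for bdf in "$sysfs"/*; do
    device=$(cat "$bdf/device")
    # 0x0a54 is the device ID the opal revert step looks for
    [[ $device == 0x0a54 ]] && matches+=("$(basename "$bdf")")
done

echo "matched ${#matches[@]} controller(s)"
```

An empty match list is why the trace prints nothing for `@1586` and `opal_revert_cleanup` returns 0 without touching the drive.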
00:05:57.381 lstat() error: No such file or directory 00:05:57.381 QAT Error: No GENERAL section found 00:05:57.381 Failed to configure qat_dev0 00:05:57.381 lstat() error: No such file or directory 00:05:57.381 QAT Error: No GENERAL section found 00:05:57.381 Failed to configure qat_dev1 00:05:57.381 lstat() error: No such file or directory 00:05:57.381 QAT Error: No GENERAL section found 00:05:57.381 Failed to configure qat_dev2 00:05:57.381 enable sriov 00:05:57.381 Checking status of all devices. 00:05:57.381 There is 3 QAT acceleration device(s) in the system: 00:05:57.381 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down 00:05:57.381 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:3f:00.0, #accel: 5 #engines: 10 state: down 00:05:57.381 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:da:00.0, #accel: 5 #engines: 10 state: down 00:05:57.950 0000:3d:00.0 set to 16 VFs 00:05:59.377 0000:3f:00.0 set to 16 VFs 00:06:00.754 0000:da:00.0 set to 16 VFs 00:06:04.048 Properly configured the qat device with driver uio_pci_generic. 
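The "set to 16 VFs" lines above correspond to the standard SR-IOV enable sequence: write the VF count into each endpoint's `sriov_numvfs` attribute. A sketch of that step against a scratch directory standing in for sysfs (on hardware this targets `/sys/bus/pci/devices/<bdf>/sriov_numvfs` and requires root):

```shell
#!/usr/bin/env bash
# Enable SR-IOV VFs on a (mocked) QAT endpoint by writing sriov_numvfs.
sysfs=$(mktemp -d)
bdf=0000:3d:00.0                        # first c6xx endpoint from the log
mkdir -p "$sysfs/$bdf"

enable_vfs() {
    local dev_dir=$1 count=$2
    # The kernel requires writing 0 before changing a nonzero VF count
    echo 0        > "$dev_dir/sriov_numvfs"
    echo "$count" > "$dev_dir/sriov_numvfs"
}

enable_vfs "$sysfs/$bdf" 16
echo "$bdf set to $(cat "$sysfs/$bdf/sriov_numvfs") VFs"
```

The resulting VF functions (0000:3d:01.x and 0000:3d:02.x) are what DPDK later probes as the `8086:37c9` qat devices.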
00:06:04.048 00:00:50 -- spdk/autotest.sh@162 -- # timing_enter lib 00:06:04.048 00:00:50 -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:04.048 00:00:50 -- common/autotest_common.sh@10 -- # set +x 00:06:04.048 00:00:50 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:06:04.048 00:00:50 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:06:04.048 00:00:50 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:04.048 00:00:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:04.048 00:00:50 -- common/autotest_common.sh@10 -- # set +x 00:06:04.048 ************************************ 00:06:04.048 START TEST env 00:06:04.048 ************************************ 00:06:04.048 00:00:50 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:06:04.048 * Looking for test storage... 00:06:04.048 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:06:04.048 00:00:50 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:06:04.048 00:00:50 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:04.048 00:00:50 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:04.048 00:00:50 env -- common/autotest_common.sh@10 -- # set +x 00:06:04.048 ************************************ 00:06:04.048 START TEST env_memory 00:06:04.048 ************************************ 00:06:04.048 00:00:50 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:06:04.048 00:06:04.048 00:06:04.048 CUnit - A unit testing framework for C - Version 2.1-3 00:06:04.048 http://cunit.sourceforge.net/ 00:06:04.048 00:06:04.048 00:06:04.048 Suite: memory 00:06:04.048 Test: alloc and free memory map ...[2024-07-16 00:00:50.959822] 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:04.048 passed 00:06:04.048 Test: mem map translation ...[2024-07-16 00:00:50.991199] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:04.048 [2024-07-16 00:00:50.991224] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:04.048 [2024-07-16 00:00:50.991280] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:04.048 [2024-07-16 00:00:50.991294] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:04.308 passed 00:06:04.308 Test: mem map registration ...[2024-07-16 00:00:51.053848] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:06:04.308 [2024-07-16 00:00:51.053870] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:06:04.308 passed 00:06:04.308 Test: mem map adjacent registrations ...passed 00:06:04.308 00:06:04.308 Run Summary: Type Total Ran Passed Failed Inactive 00:06:04.308 suites 1 1 n/a 0 0 00:06:04.308 tests 4 4 4 0 0 00:06:04.308 asserts 152 152 152 0 n/a 00:06:04.308 00:06:04.308 Elapsed time = 0.209 seconds 00:06:04.308 00:06:04.308 real 0m0.223s 00:06:04.308 user 0m0.209s 00:06:04.308 sys 0m0.013s 00:06:04.308 00:00:51 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable 
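The `*ERROR*` lines above are the unit test deliberately probing `spdk_mem_map_set_translation` with bad inputs: a length that is not a 2 MiB multiple, an unaligned vaddr, and an address beyond the 48-bit user VA range. A rough restatement of those validity checks (illustrative only, not SPDK's actual code):

```shell
#!/usr/bin/env bash
# Re-state the mem-map parameter checks the errors above exercise.
SHIFT_2MB=21
MASK_2MB=$(( (1 << SHIFT_2MB) - 1 ))
VA_LIMIT=$(( 1 << 48 ))                       # usermode VA ceiling

valid_translation() {
    local vaddr=$1 len=$2
    (( vaddr & MASK_2MB ))       && return 1  # vaddr must be 2 MiB aligned
    (( len   & MASK_2MB ))       && return 1  # length must be a 2 MiB multiple
    (( vaddr + len > VA_LIMIT )) && return 1  # must stay in user VA space
    return 0
}

valid_translation 2097152 1234    && echo accepted || echo rejected   # len=1234 bad
valid_translation 2097152 2097152 && echo accepted || echo rejected   # both aligned
```

The `281474976710656` value in the trace is exactly 2^48, the first address the usermode check rejects.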
00:06:04.308 00:00:51 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:04.308 ************************************ 00:06:04.308 END TEST env_memory 00:06:04.308 ************************************ 00:06:04.308 00:00:51 env -- common/autotest_common.sh@1142 -- # return 0 00:06:04.308 00:00:51 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:04.308 00:00:51 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:04.308 00:00:51 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:04.308 00:00:51 env -- common/autotest_common.sh@10 -- # set +x 00:06:04.308 ************************************ 00:06:04.308 START TEST env_vtophys 00:06:04.308 ************************************ 00:06:04.308 00:00:51 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:04.308 EAL: lib.eal log level changed from notice to debug 00:06:04.308 EAL: Detected lcore 0 as core 0 on socket 0 00:06:04.308 EAL: Detected lcore 1 as core 1 on socket 0 00:06:04.308 EAL: Detected lcore 2 as core 2 on socket 0 00:06:04.308 EAL: Detected lcore 3 as core 3 on socket 0 00:06:04.308 EAL: Detected lcore 4 as core 4 on socket 0 00:06:04.308 EAL: Detected lcore 5 as core 8 on socket 0 00:06:04.308 EAL: Detected lcore 6 as core 9 on socket 0 00:06:04.308 EAL: Detected lcore 7 as core 10 on socket 0 00:06:04.308 EAL: Detected lcore 8 as core 11 on socket 0 00:06:04.308 EAL: Detected lcore 9 as core 16 on socket 0 00:06:04.308 EAL: Detected lcore 10 as core 17 on socket 0 00:06:04.308 EAL: Detected lcore 11 as core 18 on socket 0 00:06:04.308 EAL: Detected lcore 12 as core 19 on socket 0 00:06:04.308 EAL: Detected lcore 13 as core 20 on socket 0 00:06:04.308 EAL: Detected lcore 14 as core 24 on socket 0 00:06:04.308 EAL: Detected lcore 15 as core 25 on socket 0 00:06:04.308 EAL: Detected lcore 16 as core 26 on socket 0 
00:06:04.308 EAL: Detected lcore 17 as core 27 on socket 0 00:06:04.308 EAL: Detected lcore 18 as core 0 on socket 1 00:06:04.308 EAL: Detected lcore 19 as core 1 on socket 1 00:06:04.308 EAL: Detected lcore 20 as core 2 on socket 1 00:06:04.308 EAL: Detected lcore 21 as core 3 on socket 1 00:06:04.308 EAL: Detected lcore 22 as core 4 on socket 1 00:06:04.308 EAL: Detected lcore 23 as core 8 on socket 1 00:06:04.308 EAL: Detected lcore 24 as core 9 on socket 1 00:06:04.308 EAL: Detected lcore 25 as core 10 on socket 1 00:06:04.308 EAL: Detected lcore 26 as core 11 on socket 1 00:06:04.308 EAL: Detected lcore 27 as core 16 on socket 1 00:06:04.308 EAL: Detected lcore 28 as core 17 on socket 1 00:06:04.308 EAL: Detected lcore 29 as core 18 on socket 1 00:06:04.308 EAL: Detected lcore 30 as core 19 on socket 1 00:06:04.308 EAL: Detected lcore 31 as core 20 on socket 1 00:06:04.308 EAL: Detected lcore 32 as core 24 on socket 1 00:06:04.308 EAL: Detected lcore 33 as core 25 on socket 1 00:06:04.308 EAL: Detected lcore 34 as core 26 on socket 1 00:06:04.308 EAL: Detected lcore 35 as core 27 on socket 1 00:06:04.308 EAL: Detected lcore 36 as core 0 on socket 0 00:06:04.308 EAL: Detected lcore 37 as core 1 on socket 0 00:06:04.308 EAL: Detected lcore 38 as core 2 on socket 0 00:06:04.308 EAL: Detected lcore 39 as core 3 on socket 0 00:06:04.308 EAL: Detected lcore 40 as core 4 on socket 0 00:06:04.308 EAL: Detected lcore 41 as core 8 on socket 0 00:06:04.308 EAL: Detected lcore 42 as core 9 on socket 0 00:06:04.308 EAL: Detected lcore 43 as core 10 on socket 0 00:06:04.308 EAL: Detected lcore 44 as core 11 on socket 0 00:06:04.308 EAL: Detected lcore 45 as core 16 on socket 0 00:06:04.308 EAL: Detected lcore 46 as core 17 on socket 0 00:06:04.308 EAL: Detected lcore 47 as core 18 on socket 0 00:06:04.308 EAL: Detected lcore 48 as core 19 on socket 0 00:06:04.308 EAL: Detected lcore 49 as core 20 on socket 0 00:06:04.308 EAL: Detected lcore 50 as core 24 on socket 0 
00:06:04.308 EAL: Detected lcore 51 as core 25 on socket 0 00:06:04.308 EAL: Detected lcore 52 as core 26 on socket 0 00:06:04.308 EAL: Detected lcore 53 as core 27 on socket 0 00:06:04.308 EAL: Detected lcore 54 as core 0 on socket 1 00:06:04.308 EAL: Detected lcore 55 as core 1 on socket 1 00:06:04.308 EAL: Detected lcore 56 as core 2 on socket 1 00:06:04.308 EAL: Detected lcore 57 as core 3 on socket 1 00:06:04.308 EAL: Detected lcore 58 as core 4 on socket 1 00:06:04.308 EAL: Detected lcore 59 as core 8 on socket 1 00:06:04.308 EAL: Detected lcore 60 as core 9 on socket 1 00:06:04.308 EAL: Detected lcore 61 as core 10 on socket 1 00:06:04.308 EAL: Detected lcore 62 as core 11 on socket 1 00:06:04.308 EAL: Detected lcore 63 as core 16 on socket 1 00:06:04.308 EAL: Detected lcore 64 as core 17 on socket 1 00:06:04.308 EAL: Detected lcore 65 as core 18 on socket 1 00:06:04.308 EAL: Detected lcore 66 as core 19 on socket 1 00:06:04.308 EAL: Detected lcore 67 as core 20 on socket 1 00:06:04.308 EAL: Detected lcore 68 as core 24 on socket 1 00:06:04.308 EAL: Detected lcore 69 as core 25 on socket 1 00:06:04.308 EAL: Detected lcore 70 as core 26 on socket 1 00:06:04.308 EAL: Detected lcore 71 as core 27 on socket 1 00:06:04.308 EAL: Maximum logical cores by configuration: 128 00:06:04.308 EAL: Detected CPU lcores: 72 00:06:04.308 EAL: Detected NUMA nodes: 2 00:06:04.308 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:06:04.308 EAL: Detected shared linkage of DPDK 00:06:04.308 EAL: No shared files mode enabled, IPC will be disabled 00:06:04.569 EAL: No shared files mode enabled, IPC is disabled 00:06:04.569 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 
'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3f:01.1 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3f:01.2 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3f:02.2 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3f:02.3 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:06:04.569 EAL: 
PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:da:01.0 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:da:01.1 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:da:01.2 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:da:01.3 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:da:01.4 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:da:01.5 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:da:01.6 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:da:01.7 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:da:02.0 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:da:02.1 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:da:02.2 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:da:02.3 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:da:02.4 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:da:02.5 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:da:02.6 wants IOVA as 'PA' 00:06:04.569 EAL: PCI driver qat for device 0000:da:02.7 wants IOVA as 'PA' 00:06:04.569 EAL: Bus pci wants IOVA as 'PA' 00:06:04.569 EAL: Bus auxiliary wants IOVA as 'DC' 00:06:04.569 EAL: Bus vdev wants IOVA as 'DC' 00:06:04.569 EAL: Selected IOVA mode 'PA' 00:06:04.569 EAL: Probing VFIO support... 00:06:04.569 EAL: IOMMU type 1 (Type 1) is supported 00:06:04.569 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:04.569 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:04.569 EAL: VFIO support initialized 00:06:04.569 EAL: Ask a virtual area of 0x2e000 bytes 00:06:04.569 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:04.569 EAL: Setting up physically contiguous memory... 
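The IOVA mode selected above follows from the preferences the drivers announce: every qat VF and the pci bus ask for `'PA'`, while auxiliary and vdev are `'DC'` (don't care), so EAL settles on physical addressing. A simplified restatement of that decision rule (DPDK's real logic also weighs IOMMU availability and kernel support; this sketch only captures the "any PA requirement wins" part):

```shell
#!/usr/bin/env bash
# Pick an IOVA mode from a list of per-driver preferences: PA, VA, or DC.
select_iova_mode() {
    local want
    for want in "$@"; do
        [[ $want == PA ]] && { echo PA; return; }  # a hard PA requirement wins
    done
    echo VA                                        # otherwise VA is usable
}

# The qat VFs and the pci bus asked for PA; auxiliary/vdev were DC.
mode=$(select_iova_mode PA PA DC DC)
echo "Selected IOVA mode '$mode'"
```

Running in PA mode is also why the earlier setup step had to bind the devices to vfio-pci/uio and why hugepages (with known physical addresses) are required.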
00:06:04.569 EAL: Setting maximum number of open files to 524288 00:06:04.569 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:04.569 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:04.569 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:04.569 EAL: Ask a virtual area of 0x61000 bytes 00:06:04.569 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:04.569 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:04.569 EAL: Ask a virtual area of 0x400000000 bytes 00:06:04.569 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:04.569 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:04.569 EAL: Ask a virtual area of 0x61000 bytes 00:06:04.569 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:04.569 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:04.569 EAL: Ask a virtual area of 0x400000000 bytes 00:06:04.569 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:04.569 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:04.569 EAL: Ask a virtual area of 0x61000 bytes 00:06:04.569 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:04.569 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:04.569 EAL: Ask a virtual area of 0x400000000 bytes 00:06:04.569 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:04.569 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:04.569 EAL: Ask a virtual area of 0x61000 bytes 00:06:04.569 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:04.569 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:04.569 EAL: Ask a virtual area of 0x400000000 bytes 00:06:04.569 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:04.569 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:04.569 EAL: Creating 4 segment lists: n_segs:8192 
socket_id:1 hugepage_sz:2097152 00:06:04.569 EAL: Ask a virtual area of 0x61000 bytes 00:06:04.569 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:04.569 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:04.569 EAL: Ask a virtual area of 0x400000000 bytes 00:06:04.569 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:04.569 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:04.569 EAL: Ask a virtual area of 0x61000 bytes 00:06:04.569 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:04.569 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:04.569 EAL: Ask a virtual area of 0x400000000 bytes 00:06:04.570 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:04.570 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:04.570 EAL: Ask a virtual area of 0x61000 bytes 00:06:04.570 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:04.570 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:04.570 EAL: Ask a virtual area of 0x400000000 bytes 00:06:04.570 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:06:04.570 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:04.570 EAL: Ask a virtual area of 0x61000 bytes 00:06:04.570 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:04.570 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:04.570 EAL: Ask a virtual area of 0x400000000 bytes 00:06:04.570 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:06:04.570 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:04.570 EAL: Hugepages will be freed exactly as allocated. 
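The reservation sizes above are simple arithmetic over the memseg list parameters: 8192 segments of 2 MiB each explain the `0x400000000` (16 GiB) VA window per list, and 4 lists per socket give 64 GiB of addressable VA per NUMA node. The check below just reproduces that arithmetic:

```shell
#!/usr/bin/env bash
# Reproduce the memseg list VA sizing reported in the log.
n_segs=8192
hugepage_sz=2097152                  # 2 MiB pages
lists_per_socket=4

list_va=$(( n_segs * hugepage_sz ))
printf 'per-list VA reservation: 0x%x bytes\n' "$list_va"
printf 'per-socket VA: %d GiB\n' $(( list_va * lists_per_socket / 1024**3 ))
```

The `0x61000`-byte areas requested alongside each list hold the memseg bookkeeping (the "Memseg list allocated" lines), separate from the data reservation itself.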
00:06:04.570 EAL: No shared files mode enabled, IPC is disabled 00:06:04.570 EAL: No shared files mode enabled, IPC is disabled 00:06:04.570 EAL: TSC frequency is ~2300000 KHz 00:06:04.570 EAL: Main lcore 0 is ready (tid=7fce40be6b00;cpuset=[0]) 00:06:04.570 EAL: Trying to obtain current memory policy. 00:06:04.570 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:04.570 EAL: Restoring previous memory policy: 0 00:06:04.570 EAL: request: mp_malloc_sync 00:06:04.570 EAL: No shared files mode enabled, IPC is disabled 00:06:04.570 EAL: Heap on socket 0 was expanded by 2MB 00:06:04.570 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:06:04.570 EAL: probe driver: 8086:37c9 qat 00:06:04.570 EAL: PCI memory mapped at 0x202001000000 00:06:04.570 EAL: PCI memory mapped at 0x202001001000 00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:04.570 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:06:04.570 EAL: probe driver: 8086:37c9 qat 00:06:04.570 EAL: PCI memory mapped at 0x202001002000 00:06:04.570 EAL: PCI memory mapped at 0x202001003000 00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:04.570 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:06:04.570 EAL: probe driver: 8086:37c9 qat 00:06:04.570 EAL: PCI memory mapped at 0x202001004000 00:06:04.570 EAL: PCI memory mapped at 0x202001005000 00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:04.570 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:06:04.570 EAL: probe driver: 8086:37c9 qat 00:06:04.570 EAL: PCI memory mapped at 0x202001006000 00:06:04.570 EAL: PCI memory mapped at 0x202001007000 00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:04.570 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:06:04.570 EAL: probe driver: 8086:37c9 qat 00:06:04.570 EAL: PCI memory mapped at 0x202001008000 00:06:04.570 EAL: PCI memory mapped at 0x202001009000 00:06:04.570 EAL: 
Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:04.570 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:06:04.570 EAL: probe driver: 8086:37c9 qat 00:06:04.570 EAL: PCI memory mapped at 0x20200100a000 00:06:04.570 EAL: PCI memory mapped at 0x20200100b000 00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:04.570 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:06:04.570 EAL: probe driver: 8086:37c9 qat 00:06:04.570 EAL: PCI memory mapped at 0x20200100c000 00:06:04.570 EAL: PCI memory mapped at 0x20200100d000 00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:04.570 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:06:04.570 EAL: probe driver: 8086:37c9 qat 00:06:04.570 EAL: PCI memory mapped at 0x20200100e000 00:06:04.570 EAL: PCI memory mapped at 0x20200100f000 00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:04.570 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:06:04.570 EAL: probe driver: 8086:37c9 qat 00:06:04.570 EAL: PCI memory mapped at 0x202001010000 00:06:04.570 EAL: PCI memory mapped at 0x202001011000 00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:04.570 EAL: PCI device 0000:3d:02.1 on NUMA socket 0 00:06:04.570 EAL: probe driver: 8086:37c9 qat 00:06:04.570 EAL: PCI memory mapped at 0x202001012000 00:06:04.570 EAL: PCI memory mapped at 0x202001013000 00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:04.570 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:06:04.570 EAL: probe driver: 8086:37c9 qat 00:06:04.570 EAL: PCI memory mapped at 0x202001014000 00:06:04.570 EAL: PCI memory mapped at 0x202001015000 00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:04.570 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:06:04.570 EAL: probe driver: 8086:37c9 qat 00:06:04.570 EAL: PCI memory mapped at 
0x202001016000 00:06:04.570 EAL: PCI memory mapped at 0x202001017000 00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:04.570 EAL: PCI device 0000:3d:02.4 on NUMA socket 0 00:06:04.570 EAL: probe driver: 8086:37c9 qat 00:06:04.570 EAL: PCI memory mapped at 0x202001018000 00:06:04.570 EAL: PCI memory mapped at 0x202001019000 00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:04.570 EAL: PCI device 0000:3d:02.5 on NUMA socket 0 00:06:04.570 EAL: probe driver: 8086:37c9 qat 00:06:04.570 EAL: PCI memory mapped at 0x20200101a000 00:06:04.570 EAL: PCI memory mapped at 0x20200101b000 00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:04.570 EAL: PCI device 0000:3d:02.6 on NUMA socket 0 00:06:04.570 EAL: probe driver: 8086:37c9 qat 00:06:04.570 EAL: PCI memory mapped at 0x20200101c000 00:06:04.570 EAL: PCI memory mapped at 0x20200101d000 00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:04.570 EAL: PCI device 0000:3d:02.7 on NUMA socket 0 00:06:04.570 EAL: probe driver: 8086:37c9 qat 00:06:04.570 EAL: PCI memory mapped at 0x20200101e000 00:06:04.570 EAL: PCI memory mapped at 0x20200101f000 00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:04.570 EAL: PCI device 0000:3f:01.0 on NUMA socket 0 00:06:04.570 EAL: probe driver: 8086:37c9 qat 00:06:04.570 EAL: PCI memory mapped at 0x202001020000 00:06:04.570 EAL: PCI memory mapped at 0x202001021000 00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:04.570 EAL: PCI device 0000:3f:01.1 on NUMA socket 0 00:06:04.570 EAL: probe driver: 8086:37c9 qat 00:06:04.570 EAL: PCI memory mapped at 0x202001022000 00:06:04.570 EAL: PCI memory mapped at 0x202001023000 00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:04.570 EAL: PCI device 0000:3f:01.2 on NUMA socket 0 
00:06:04.570 EAL: probe driver: 8086:37c9 qat
00:06:04.570 EAL: PCI memory mapped at 0x202001024000
00:06:04.570 EAL: PCI memory mapped at 0x202001025000
00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0)
00:06:04.570 EAL: PCI device 0000:3f:01.3 on NUMA socket 0
00:06:04.570 EAL: probe driver: 8086:37c9 qat
00:06:04.570 EAL: PCI memory mapped at 0x202001026000
00:06:04.570 EAL: PCI memory mapped at 0x202001027000
00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0)
00:06:04.570 EAL: PCI device 0000:3f:01.4 on NUMA socket 0
00:06:04.570 EAL: probe driver: 8086:37c9 qat
00:06:04.570 EAL: PCI memory mapped at 0x202001028000
00:06:04.570 EAL: PCI memory mapped at 0x202001029000
00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0)
00:06:04.570 EAL: PCI device 0000:3f:01.5 on NUMA socket 0
00:06:04.570 EAL: probe driver: 8086:37c9 qat
00:06:04.570 EAL: PCI memory mapped at 0x20200102a000
00:06:04.570 EAL: PCI memory mapped at 0x20200102b000
00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0)
00:06:04.570 EAL: PCI device 0000:3f:01.6 on NUMA socket 0
00:06:04.570 EAL: probe driver: 8086:37c9 qat
00:06:04.570 EAL: PCI memory mapped at 0x20200102c000
00:06:04.570 EAL: PCI memory mapped at 0x20200102d000
00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0)
00:06:04.570 EAL: PCI device 0000:3f:01.7 on NUMA socket 0
00:06:04.570 EAL: probe driver: 8086:37c9 qat
00:06:04.570 EAL: PCI memory mapped at 0x20200102e000
00:06:04.570 EAL: PCI memory mapped at 0x20200102f000
00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0)
00:06:04.570 EAL: PCI device 0000:3f:02.0 on NUMA socket 0
00:06:04.570 EAL: probe driver: 8086:37c9 qat
00:06:04.570 EAL: PCI memory mapped at 0x202001030000
00:06:04.570 EAL: PCI memory mapped at 0x202001031000
00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0)
00:06:04.570 EAL: PCI device 0000:3f:02.1 on NUMA socket 0
00:06:04.570 EAL: probe driver: 8086:37c9 qat
00:06:04.570 EAL: PCI memory mapped at 0x202001032000
00:06:04.570 EAL: PCI memory mapped at 0x202001033000
00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0)
00:06:04.570 EAL: PCI device 0000:3f:02.2 on NUMA socket 0
00:06:04.570 EAL: probe driver: 8086:37c9 qat
00:06:04.570 EAL: PCI memory mapped at 0x202001034000
00:06:04.570 EAL: PCI memory mapped at 0x202001035000
00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0)
00:06:04.570 EAL: PCI device 0000:3f:02.3 on NUMA socket 0
00:06:04.570 EAL: probe driver: 8086:37c9 qat
00:06:04.570 EAL: PCI memory mapped at 0x202001036000
00:06:04.570 EAL: PCI memory mapped at 0x202001037000
00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0)
00:06:04.570 EAL: PCI device 0000:3f:02.4 on NUMA socket 0
00:06:04.570 EAL: probe driver: 8086:37c9 qat
00:06:04.570 EAL: PCI memory mapped at 0x202001038000
00:06:04.570 EAL: PCI memory mapped at 0x202001039000
00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0)
00:06:04.570 EAL: PCI device 0000:3f:02.5 on NUMA socket 0
00:06:04.570 EAL: probe driver: 8086:37c9 qat
00:06:04.570 EAL: PCI memory mapped at 0x20200103a000
00:06:04.570 EAL: PCI memory mapped at 0x20200103b000
00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0)
00:06:04.570 EAL: PCI device 0000:3f:02.6 on NUMA socket 0
00:06:04.570 EAL: probe driver: 8086:37c9 qat
00:06:04.570 EAL: PCI memory mapped at 0x20200103c000
00:06:04.570 EAL: PCI memory mapped at 0x20200103d000
00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0)
00:06:04.570 EAL: PCI device 0000:3f:02.7 on NUMA socket 0
00:06:04.570 EAL: probe driver: 8086:37c9 qat
00:06:04.570 EAL: PCI memory mapped at 0x20200103e000
00:06:04.570 EAL: PCI memory mapped at 0x20200103f000
00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0)
00:06:04.570 EAL: PCI device 0000:da:01.0 on NUMA socket 1
00:06:04.570 EAL: probe driver: 8086:37c9 qat
00:06:04.570 EAL: PCI memory mapped at 0x202001040000
00:06:04.570 EAL: PCI memory mapped at 0x202001041000
00:06:04.570 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1)
00:06:04.570 EAL: Trying to obtain current memory policy.
00:06:04.570 EAL: Setting policy MPOL_PREFERRED for socket 1
00:06:04.570 EAL: Restoring previous memory policy: 4
00:06:04.570 EAL: request: mp_malloc_sync
00:06:04.571 EAL: No shared files mode enabled, IPC is disabled
00:06:04.571 EAL: Heap on socket 1 was expanded by 2MB
00:06:04.571 EAL: PCI device 0000:da:01.1 on NUMA socket 1
00:06:04.571 EAL: probe driver: 8086:37c9 qat
00:06:04.571 EAL: PCI memory mapped at 0x202001042000
00:06:04.571 EAL: PCI memory mapped at 0x202001043000
00:06:04.571 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1)
00:06:04.571 EAL: PCI device 0000:da:01.2 on NUMA socket 1
00:06:04.571 EAL: probe driver: 8086:37c9 qat
00:06:04.571 EAL: PCI memory mapped at 0x202001044000
00:06:04.571 EAL: PCI memory mapped at 0x202001045000
00:06:04.571 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1)
00:06:04.571 EAL: PCI device 0000:da:01.3 on NUMA socket 1
00:06:04.571 EAL: probe driver: 8086:37c9 qat
00:06:04.571 EAL: PCI memory mapped at 0x202001046000
00:06:04.571 EAL: PCI memory mapped at 0x202001047000
00:06:04.571 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1)
00:06:04.571 EAL: PCI device 0000:da:01.4 on NUMA socket 1
00:06:04.571 EAL: probe driver: 8086:37c9 qat
00:06:04.571 EAL: PCI memory mapped at 0x202001048000
00:06:04.571 EAL: PCI memory mapped at 0x202001049000
00:06:04.571 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1)
00:06:04.571 EAL: PCI device 0000:da:01.5 on NUMA socket 1
00:06:04.571 EAL: probe driver: 8086:37c9 qat
00:06:04.571 EAL: PCI memory mapped at 0x20200104a000
00:06:04.571 EAL: PCI memory mapped at 0x20200104b000
00:06:04.571 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1)
00:06:04.571 EAL: PCI device 0000:da:01.6 on NUMA socket 1
00:06:04.571 EAL: probe driver: 8086:37c9 qat
00:06:04.571 EAL: PCI memory mapped at 0x20200104c000
00:06:04.571 EAL: PCI memory mapped at 0x20200104d000
00:06:04.571 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1)
00:06:04.571 EAL: PCI device 0000:da:01.7 on NUMA socket 1
00:06:04.571 EAL: probe driver: 8086:37c9 qat
00:06:04.571 EAL: PCI memory mapped at 0x20200104e000
00:06:04.571 EAL: PCI memory mapped at 0x20200104f000
00:06:04.571 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1)
00:06:04.571 EAL: PCI device 0000:da:02.0 on NUMA socket 1
00:06:04.571 EAL: probe driver: 8086:37c9 qat
00:06:04.571 EAL: PCI memory mapped at 0x202001050000
00:06:04.571 EAL: PCI memory mapped at 0x202001051000
00:06:04.571 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1)
00:06:04.571 EAL: PCI device 0000:da:02.1 on NUMA socket 1
00:06:04.571 EAL: probe driver: 8086:37c9 qat
00:06:04.571 EAL: PCI memory mapped at 0x202001052000
00:06:04.571 EAL: PCI memory mapped at 0x202001053000
00:06:04.571 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1)
00:06:04.571 EAL: PCI device 0000:da:02.2 on NUMA socket 1
00:06:04.571 EAL: probe driver: 8086:37c9 qat
00:06:04.571 EAL: PCI memory mapped at 0x202001054000
00:06:04.571 EAL: PCI memory mapped at 0x202001055000
00:06:04.571 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1)
00:06:04.571 EAL: PCI device 0000:da:02.3 on NUMA socket 1
00:06:04.571 EAL: probe driver: 8086:37c9 qat
00:06:04.571 EAL: PCI memory mapped at 0x202001056000
00:06:04.571 EAL: PCI memory mapped at 0x202001057000
00:06:04.571 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1)
00:06:04.571 EAL: PCI device 0000:da:02.4 on NUMA socket 1
00:06:04.571 EAL: probe driver: 8086:37c9 qat
00:06:04.571 EAL: PCI memory mapped at 0x202001058000
00:06:04.571 EAL: PCI memory mapped at 0x202001059000
00:06:04.571 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1)
00:06:04.571 EAL: PCI device 0000:da:02.5 on NUMA socket 1
00:06:04.571 EAL: probe driver: 8086:37c9 qat
00:06:04.571 EAL: PCI memory mapped at 0x20200105a000
00:06:04.571 EAL: PCI memory mapped at 0x20200105b000
00:06:04.571 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1)
00:06:04.571 EAL: PCI device 0000:da:02.6 on NUMA socket 1
00:06:04.571 EAL: probe driver: 8086:37c9 qat
00:06:04.571 EAL: PCI memory mapped at 0x20200105c000
00:06:04.571 EAL: PCI memory mapped at 0x20200105d000
00:06:04.571 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1)
00:06:04.571 EAL: PCI device 0000:da:02.7 on NUMA socket 1
00:06:04.571 EAL: probe driver: 8086:37c9 qat
00:06:04.571 EAL: PCI memory mapped at 0x20200105e000
00:06:04.571 EAL: PCI memory mapped at 0x20200105f000
00:06:04.571 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1)
00:06:04.571 EAL: No shared files mode enabled, IPC is disabled
00:06:04.571 EAL: No shared files mode enabled, IPC is disabled
00:06:04.571 EAL: No PCI address specified using 'addr=' in: bus=pci
00:06:04.571 EAL: Mem event callback 'spdk:(nil)' registered
00:06:04.571
00:06:04.571
00:06:04.571 CUnit - A unit testing framework for C - Version 2.1-3
00:06:04.571 http://cunit.sourceforge.net/
00:06:04.571
00:06:04.571
00:06:04.571 Suite: components_suite
00:06:04.571 Test: vtophys_malloc_test ...passed
00:06:04.571 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy.
00:06:04.571 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:04.571 EAL: Restoring previous memory policy: 4
00:06:04.571 EAL: Calling mem event callback 'spdk:(nil)'
00:06:04.571 EAL: request: mp_malloc_sync
00:06:04.571 EAL: No shared files mode enabled, IPC is disabled
00:06:04.571 EAL: Heap on socket 0 was expanded by 4MB
00:06:04.571 EAL: Calling mem event callback 'spdk:(nil)'
00:06:04.571 EAL: request: mp_malloc_sync
00:06:04.571 EAL: No shared files mode enabled, IPC is disabled
00:06:04.571 EAL: Heap on socket 0 was shrunk by 4MB
00:06:04.571 EAL: Trying to obtain current memory policy.
00:06:04.571 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:04.571 EAL: Restoring previous memory policy: 4
00:06:04.571 EAL: Calling mem event callback 'spdk:(nil)'
00:06:04.571 EAL: request: mp_malloc_sync
00:06:04.571 EAL: No shared files mode enabled, IPC is disabled
00:06:04.571 EAL: Heap on socket 0 was expanded by 6MB
00:06:04.571 EAL: Calling mem event callback 'spdk:(nil)'
00:06:04.571 EAL: request: mp_malloc_sync
00:06:04.571 EAL: No shared files mode enabled, IPC is disabled
00:06:04.571 EAL: Heap on socket 0 was shrunk by 6MB
00:06:04.571 EAL: Trying to obtain current memory policy.
00:06:04.571 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:04.571 EAL: Restoring previous memory policy: 4
00:06:04.571 EAL: Calling mem event callback 'spdk:(nil)'
00:06:04.571 EAL: request: mp_malloc_sync
00:06:04.571 EAL: No shared files mode enabled, IPC is disabled
00:06:04.571 EAL: Heap on socket 0 was expanded by 10MB
00:06:04.571 EAL: Calling mem event callback 'spdk:(nil)'
00:06:04.571 EAL: request: mp_malloc_sync
00:06:04.571 EAL: No shared files mode enabled, IPC is disabled
00:06:04.571 EAL: Heap on socket 0 was shrunk by 10MB
00:06:04.571 EAL: Trying to obtain current memory policy.
00:06:04.571 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:04.571 EAL: Restoring previous memory policy: 4
00:06:04.571 EAL: Calling mem event callback 'spdk:(nil)'
00:06:04.571 EAL: request: mp_malloc_sync
00:06:04.571 EAL: No shared files mode enabled, IPC is disabled
00:06:04.571 EAL: Heap on socket 0 was expanded by 18MB
00:06:04.571 EAL: Calling mem event callback 'spdk:(nil)'
00:06:04.571 EAL: request: mp_malloc_sync
00:06:04.571 EAL: No shared files mode enabled, IPC is disabled
00:06:04.571 EAL: Heap on socket 0 was shrunk by 18MB
00:06:04.571 EAL: Trying to obtain current memory policy.
00:06:04.571 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:04.571 EAL: Restoring previous memory policy: 4
00:06:04.571 EAL: Calling mem event callback 'spdk:(nil)'
00:06:04.571 EAL: request: mp_malloc_sync
00:06:04.571 EAL: No shared files mode enabled, IPC is disabled
00:06:04.571 EAL: Heap on socket 0 was expanded by 34MB
00:06:04.571 EAL: Calling mem event callback 'spdk:(nil)'
00:06:04.571 EAL: request: mp_malloc_sync
00:06:04.571 EAL: No shared files mode enabled, IPC is disabled
00:06:04.571 EAL: Heap on socket 0 was shrunk by 34MB
00:06:04.571 EAL: Trying to obtain current memory policy.
00:06:04.571 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:04.571 EAL: Restoring previous memory policy: 4
00:06:04.571 EAL: Calling mem event callback 'spdk:(nil)'
00:06:04.571 EAL: request: mp_malloc_sync
00:06:04.571 EAL: No shared files mode enabled, IPC is disabled
00:06:04.571 EAL: Heap on socket 0 was expanded by 66MB
00:06:04.571 EAL: Calling mem event callback 'spdk:(nil)'
00:06:04.571 EAL: request: mp_malloc_sync
00:06:04.571 EAL: No shared files mode enabled, IPC is disabled
00:06:04.571 EAL: Heap on socket 0 was shrunk by 66MB
00:06:04.571 EAL: Trying to obtain current memory policy.
00:06:04.571 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:04.571 EAL: Restoring previous memory policy: 4
00:06:04.571 EAL: Calling mem event callback 'spdk:(nil)'
00:06:04.571 EAL: request: mp_malloc_sync
00:06:04.571 EAL: No shared files mode enabled, IPC is disabled
00:06:04.571 EAL: Heap on socket 0 was expanded by 130MB
00:06:04.571 EAL: Calling mem event callback 'spdk:(nil)'
00:06:04.571 EAL: request: mp_malloc_sync
00:06:04.571 EAL: No shared files mode enabled, IPC is disabled
00:06:04.571 EAL: Heap on socket 0 was shrunk by 130MB
00:06:04.571 EAL: Trying to obtain current memory policy.
00:06:04.571 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:04.831 EAL: Restoring previous memory policy: 4
00:06:04.831 EAL: Calling mem event callback 'spdk:(nil)'
00:06:04.831 EAL: request: mp_malloc_sync
00:06:04.831 EAL: No shared files mode enabled, IPC is disabled
00:06:04.831 EAL: Heap on socket 0 was expanded by 258MB
00:06:04.831 EAL: Calling mem event callback 'spdk:(nil)'
00:06:04.831 EAL: request: mp_malloc_sync
00:06:04.831 EAL: No shared files mode enabled, IPC is disabled
00:06:04.831 EAL: Heap on socket 0 was shrunk by 258MB
00:06:04.831 EAL: Trying to obtain current memory policy.
00:06:04.831 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:05.090 EAL: Restoring previous memory policy: 4
00:06:05.090 EAL: Calling mem event callback 'spdk:(nil)'
00:06:05.090 EAL: request: mp_malloc_sync
00:06:05.090 EAL: No shared files mode enabled, IPC is disabled
00:06:05.090 EAL: Heap on socket 0 was expanded by 514MB
00:06:05.090 EAL: Calling mem event callback 'spdk:(nil)'
00:06:05.090 EAL: request: mp_malloc_sync
00:06:05.090 EAL: No shared files mode enabled, IPC is disabled
00:06:05.090 EAL: Heap on socket 0 was shrunk by 514MB
00:06:05.090 EAL: Trying to obtain current memory policy.
00:06:05.090 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:05.347 EAL: Restoring previous memory policy: 4
00:06:05.347 EAL: Calling mem event callback 'spdk:(nil)'
00:06:05.347 EAL: request: mp_malloc_sync
00:06:05.347 EAL: No shared files mode enabled, IPC is disabled
00:06:05.347 EAL: Heap on socket 0 was expanded by 1026MB
00:06:05.605 EAL: Calling mem event callback 'spdk:(nil)'
00:06:05.864 EAL: request: mp_malloc_sync
00:06:05.865 EAL: No shared files mode enabled, IPC is disabled
00:06:05.865 EAL: Heap on socket 0 was shrunk by 1026MB
00:06:05.865 passed
00:06:05.865
00:06:05.865 Run Summary: Type Total Ran Passed Failed Inactive
00:06:05.865 suites 1 1 n/a 0 0
00:06:05.865 tests 2 2 2 0 0
00:06:05.865 asserts 5673 5673 5673 0 n/a
00:06:05.865
00:06:05.865 Elapsed time = 1.180 seconds
00:06:05.865 EAL: No shared files mode enabled, IPC is disabled
00:06:05.865 EAL: No shared files mode enabled, IPC is disabled
00:06:05.865 EAL: No shared files mode enabled, IPC is disabled
00:06:05.865
00:06:05.865 real 0m1.382s
00:06:05.865 user 0m0.774s
00:06:05.865 sys 0m0.576s
00:06:05.865 00:00:52 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:05.865 00:00:52 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x
00:06:05.865 ************************************
00:06:05.865 END TEST env_vtophys
00:06:05.865 ************************************
00:06:05.865 00:00:52 env -- common/autotest_common.sh@1142 -- # return 0
00:06:05.865 00:00:52 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:06:05.865 00:00:52 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:05.865 00:00:52 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:05.865 00:00:52 env -- common/autotest_common.sh@10 -- # set +x
00:06:05.865 ************************************
00:06:05.865 START TEST env_pci
00:06:05.865 ************************************
00:06:05.865 00:00:52 env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:06:05.865
00:06:05.865
00:06:05.865 CUnit - A unit testing framework for C - Version 2.1-3
00:06:05.865 http://cunit.sourceforge.net/
00:06:05.865
00:06:05.865
00:06:05.865 Suite: pci
00:06:05.865 Test: pci_hook ...[2024-07-16 00:00:52.693690] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 3446411 has claimed it
00:06:05.865 EAL: Cannot find device (10000:00:01.0)
00:06:05.865 EAL: Failed to attach device on primary process
00:06:05.865 passed
00:06:05.865
00:06:05.865 Run Summary: Type Total Ran Passed Failed Inactive
00:06:05.865 suites 1 1 n/a 0 0
00:06:05.865 tests 1 1 1 0 0
00:06:05.865 asserts 25 25 25 0 n/a
00:06:05.865
00:06:05.865 Elapsed time = 0.035 seconds
00:06:05.865
00:06:05.865 real 0m0.059s
00:06:05.865 user 0m0.013s
00:06:05.865 sys 0m0.046s
00:06:05.865 00:00:52 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:05.865 00:00:52 env.env_pci -- common/autotest_common.sh@10 -- # set +x
00:06:05.865 ************************************
00:06:05.865 END TEST env_pci
00:06:05.865 ************************************
00:06:05.865 00:00:52 env -- common/autotest_common.sh@1142 -- # return 0
00:06:05.865 00:00:52 env -- env/env.sh@14 -- # argv='-c 0x1 '
00:06:05.865 00:00:52 env -- env/env.sh@15 -- # uname
00:06:05.865 00:00:52 env -- env/env.sh@15 -- # '[' Linux = Linux ']'
00:06:05.865 00:00:52 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000
00:06:05.865 00:00:52 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:06:05.865 00:00:52 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:06:05.865 00:00:52 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:05.865 00:00:52 env -- common/autotest_common.sh@10 -- # set +x
00:06:06.125 ************************************
00:06:06.125 START TEST env_dpdk_post_init
00:06:06.125 ************************************
00:06:06.125 00:00:52 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:06:06.125 EAL: Detected CPU lcores: 72
00:06:06.125 EAL: Detected NUMA nodes: 2
00:06:06.125 EAL: Detected shared linkage of DPDK
00:06:06.125 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:06:06.125 EAL: Selected IOVA mode 'PA'
00:06:06.125 EAL: VFIO support initialized
00:06:06.125 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0)
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.125 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0)
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.125 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0)
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.125 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0)
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.125 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0)
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.125 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0)
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.125 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0)
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.125 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0)
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.125 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0)
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.125 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0)
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym
00:06:06.125 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.125 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0)
00:06:06.125 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0
00:06:06.126 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1)
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0
00:06:06.126 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym
00:06:06.126 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0
00:06:06.127 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1)
00:06:06.127 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym
00:06:06.127 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0
00:06:06.127 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym
00:06:06.127 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0
00:06:06.127 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1)
00:06:06.127 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym
00:06:06.127 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0
00:06:06.127 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym
00:06:06.127 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0
00:06:06.127 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1)
00:06:06.127 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym
00:06:06.127 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0
00:06:06.127 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym
00:06:06.127 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0
00:06:06.127 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1)
00:06:06.127 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym
00:06:06.127 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0
00:06:06.127 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym
00:06:06.127
CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:06.127 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:06:06.127 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:06:06.127 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:06.127 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:06:06.127 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:06.127 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:06:06.127 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:06:06.127 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:06.127 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:06:06.127 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:06.127 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:06:06.127 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:06:06.127 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:06.127 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:06:06.127 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:06.127 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:06:06.127 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:06:06.127 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:06.127 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:06:06.127 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:06.127 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:06.127 EAL: Using IOMMU type 1 (Type 1) 00:06:06.127 EAL: Ignore 
mapping IO port bar(1) 00:06:06.127 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:06:06.127 EAL: Ignore mapping IO port bar(1) 00:06:06.127 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:06:06.386 EAL: Ignore mapping IO port bar(1) 00:06:06.386 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:06:06.386 EAL: Ignore mapping IO port bar(1) 00:06:06.386 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:06:06.386 EAL: Ignore mapping IO port bar(1) 00:06:06.386 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:06:06.386 EAL: Ignore mapping IO port bar(1) 00:06:06.386 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:06:06.386 EAL: Ignore mapping IO port bar(1) 00:06:06.386 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:06:06.386 EAL: Ignore mapping IO port bar(1) 00:06:06.386 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:06:06.644 EAL: Probe PCI driver: spdk_nvme (8086:0b60) device: 0000:5e:00.0 (socket 0) 00:06:06.644 EAL: Ignore mapping IO port bar(1) 00:06:06.644 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:06:06.644 EAL: Ignore mapping IO port bar(1) 00:06:06.644 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:06:06.644 EAL: Ignore mapping IO port bar(1) 00:06:06.644 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:06:06.644 EAL: Ignore mapping IO port bar(1) 00:06:06.644 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:06:06.644 EAL: Ignore mapping IO port bar(1) 00:06:06.644 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:06:06.644 EAL: Ignore mapping IO port bar(1) 00:06:06.644 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 
00:06:06.644 EAL: Ignore mapping IO port bar(1) 00:06:06.644 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:06:06.644 EAL: Ignore mapping IO port bar(1) 00:06:06.644 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:06:06.644 EAL: Ignore mapping IO port bar(1) 00:06:06.644 EAL: Ignore mapping IO port bar(5) 00:06:06.644 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:85:05.5 (socket 1) 00:06:06.644 EAL: Ignore mapping IO port bar(1) 00:06:06.644 EAL: Ignore mapping IO port bar(5) 00:06:06.644 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:d7:05.5 (socket 1) 00:06:09.955 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:06:09.955 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000 00:06:09.955 Starting DPDK initialization... 00:06:09.955 Starting SPDK post initialization... 00:06:09.955 SPDK NVMe probe 00:06:09.955 Attaching to 0000:5e:00.0 00:06:09.955 Attached to 0000:5e:00.0 00:06:09.955 Cleaning up... 
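(Editor's note, not part of the captured log.) The EAL probe messages above all follow DPDK's standard one-line format, `EAL: Probe PCI driver: <name> (<vendor:device>) device: <BDF> (socket <n>)`. As a minimal sketch for summarizing such a log offline — the function name and sample lines here are illustrative, taken from the output above — a per-driver/per-socket tally can be built with a single regular expression:

```python
import re
from collections import Counter

# Matches DPDK EAL probe lines like:
#   EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1)
PROBE_RE = re.compile(
    r"EAL: Probe PCI driver: (?P<drv>\S+) \((?P<id>[0-9a-f]{4}:[0-9a-f]{4})\) "
    r"device: (?P<bdf>[0-9a-f:.]+) \(socket (?P<socket>\d+)\)"
)

def tally_probes(log_text: str) -> Counter:
    """Count probed PCI devices per (driver, socket) pair."""
    counts = Counter()
    for m in PROBE_RE.finditer(log_text):
        counts[(m.group("drv"), int(m.group("socket")))] += 1
    return counts

# Three lines copied from the log output above.
sample = (
    "00:06:06.127 EAL: Probe PCI driver: spdk_ioat (8086:2021) "
    "device: 0000:00:04.0 (socket 0)\n"
    "00:06:06.644 EAL: Probe PCI driver: spdk_ioat (8086:2021) "
    "device: 0000:80:04.0 (socket 1)\n"
    "00:06:06.644 EAL: Probe PCI driver: spdk_nvme (8086:0b60) "
    "device: 0000:5e:00.0 (socket 0)\n"
)
print(tally_probes(sample))
```

Applied to the full log, this would show the QAT VFs split across sockets 0 and 1 as recorded above.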
00:06:09.955 00:06:09.955 real 0m3.549s 00:06:09.955 user 0m2.462s 00:06:09.955 sys 0m0.642s 00:06:09.955 00:00:56 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:09.955 00:00:56 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:09.955 ************************************ 00:06:09.955 END TEST env_dpdk_post_init 00:06:09.955 ************************************ 00:06:09.955 00:00:56 env -- common/autotest_common.sh@1142 -- # return 0 00:06:09.955 00:00:56 env -- env/env.sh@26 -- # uname 00:06:09.955 00:00:56 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:09.955 00:00:56 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:09.955 00:00:56 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:09.955 00:00:56 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:09.955 00:00:56 env -- common/autotest_common.sh@10 -- # set +x 00:06:09.955 ************************************ 00:06:09.955 START TEST env_mem_callbacks 00:06:09.955 ************************************ 00:06:09.955 00:00:56 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:09.955 EAL: Detected CPU lcores: 72 00:06:09.955 EAL: Detected NUMA nodes: 2 00:06:09.955 EAL: Detected shared linkage of DPDK 00:06:09.955 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:09.955 EAL: Selected IOVA mode 'PA' 00:06:09.955 EAL: VFIO support initialized 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 
0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:06:09.955 CRYPTODEV: Initialisation 
parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 
0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, 
max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:3f:01.3 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.955 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:06:09.955 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.955 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating 
cryptodev 0000:3f:01.7_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:06:09.956 
CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:06:09.956 
CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 
00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 
0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:06:09.956 CRYPTODEV: Initialisation 
parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:09.956 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:09.956 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:06:09.956 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:09.956 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:09.956 00:06:09.956 00:06:09.956 CUnit - A unit testing framework for C - Version 2.1-3 00:06:09.956 http://cunit.sourceforge.net/ 00:06:09.956 00:06:09.956 00:06:09.956 Suite: memory 00:06:09.956 Test: test ... 00:06:09.956 register 0x200000200000 2097152 00:06:09.956 register 0x201000a00000 2097152 00:06:09.956 malloc 3145728 00:06:09.956 register 0x200000400000 4194304 00:06:09.956 buf 0x200000500000 len 3145728 PASSED 00:06:09.956 malloc 64 00:06:09.956 buf 0x2000004fff40 len 64 PASSED 00:06:09.956 malloc 4194304 00:06:09.956 register 0x200000800000 6291456 00:06:09.956 buf 0x200000a00000 len 4194304 PASSED 00:06:09.956 free 0x200000500000 3145728 00:06:09.956 free 0x2000004fff40 64 00:06:09.956 unregister 0x200000400000 4194304 PASSED 00:06:09.956 free 0x200000a00000 4194304 00:06:09.956 unregister 0x200000800000 6291456 PASSED 00:06:09.956 malloc 8388608 00:06:09.956 register 0x200000400000 10485760 00:06:09.956 buf 0x200000600000 len 8388608 PASSED 00:06:09.956 free 0x200000600000 8388608 00:06:09.956 unregister 0x200000400000 10485760 PASSED 00:06:09.956 passed 00:06:09.956 00:06:09.956 Run Summary: Type Total Ran Passed Failed Inactive 00:06:09.956 suites 1 1 n/a 0 0 00:06:09.957 tests 1 1 1 0 0 
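(Editor's note, not part of the captured log.) The `env_mem_callbacks` CUnit trace above logs a `register`/`unregister` event each time a malloc grows or shrinks the DPDK heap by whole hugepage-backed regions. A hedged sketch (helper name is illustrative; the event sizes below are copied from the trace) that checks the trace balances — only the two initial 2 MiB heap regions stay registered at the end:

```python
import re

# Matches callback-trace events exactly as printed in the log above, e.g.:
#   register 0x200000400000 4194304
#   unregister 0x200000400000 4194304
EVENT_RE = re.compile(r"\b(register|unregister) (0x[0-9a-f]+) (\d+)")

def net_registered_bytes(trace: str) -> int:
    """Sum registered minus unregistered bytes across a callback trace."""
    total = 0
    for kind, _addr, size in EVENT_RE.findall(trace):
        total += int(size) if kind == "register" else -int(size)
    return total

# Events copied from the mem_callbacks test output above.
trace = """
register 0x200000200000 2097152
register 0x201000a00000 2097152
register 0x200000400000 4194304
unregister 0x200000400000 4194304
register 0x200000800000 6291456
unregister 0x200000800000 6291456
register 0x200000400000 10485760
unregister 0x200000400000 10485760
"""
print(net_registered_bytes(trace))
```

Every region registered by a malloc is unregistered on free, so the net is just the two initial 2 MiB regions (4194304 bytes) — consistent with the suite's PASSED result.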
00:06:09.957 asserts 16 16 16 0 n/a
00:06:09.957
00:06:09.957 Elapsed time = 0.007 seconds
00:06:09.957
00:06:09.957 real 0m0.111s
00:06:09.957 user 0m0.034s
00:06:09.957 sys 0m0.076s
00:06:09.957 00:00:56 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:09.957 00:00:56 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x
00:06:09.957 ************************************
00:06:09.957 END TEST env_mem_callbacks
00:06:09.957 ************************************
00:06:09.957 00:00:56 env -- common/autotest_common.sh@1142 -- # return 0
00:06:09.957
00:06:09.957 real 0m5.856s
00:06:09.957 user 0m3.679s
00:06:09.957 sys 0m1.741s
00:06:09.957 00:00:56 env -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:09.957 00:00:56 env -- common/autotest_common.sh@10 -- # set +x
00:06:09.957 ************************************
00:06:09.957 END TEST env
00:06:09.957 ************************************
00:06:09.957 00:00:56 -- common/autotest_common.sh@1142 -- # return 0
00:06:09.957 00:00:56 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh
00:06:09.957 00:00:56 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:09.957 00:00:56 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:09.957 00:00:56 -- common/autotest_common.sh@10 -- # set +x
00:06:09.957 ************************************
00:06:09.957 START TEST rpc
00:06:09.957 ************************************
00:06:09.957 00:00:56 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh
00:06:09.957 * Looking for test storage...
00:06:09.957 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc
00:06:09.957 00:00:56 rpc -- rpc/rpc.sh@65 -- # spdk_pid=3447063
00:06:09.957 00:00:56 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT
00:06:09.957 00:00:56 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev
00:06:09.957 00:00:56 rpc -- rpc/rpc.sh@67 -- # waitforlisten 3447063
00:06:09.957 00:00:56 rpc -- common/autotest_common.sh@829 -- # '[' -z 3447063 ']'
00:06:09.957 00:00:56 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:06:09.957 00:00:56 rpc -- common/autotest_common.sh@834 -- # local max_retries=100
00:06:09.957 00:00:56 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:06:09.957 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:06:09.957 00:00:56 rpc -- common/autotest_common.sh@838 -- # xtrace_disable
00:06:09.957 00:00:56 rpc -- common/autotest_common.sh@10 -- # set +x
00:06:09.957 [2024-07-16 00:00:56.881485] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization...
00:06:09.957 [2024-07-16 00:00:56.881557] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3447063 ]
00:06:10.216 [2024-07-16 00:00:57.004622] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:10.216 [2024-07-16 00:00:57.102220] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified.
00:06:10.216 [2024-07-16 00:00:57.102275] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 3447063' to capture a snapshot of events at runtime.
00:06:10.216 [2024-07-16 00:00:57.102289] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only
00:06:10.216 [2024-07-16 00:00:57.102302] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running.
00:06:10.216 [2024-07-16 00:00:57.102313] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid3447063 for offline analysis/debug.
00:06:10.216 [2024-07-16 00:00:57.102343] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:06:11.151 00:00:57 rpc -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:06:11.151 00:00:57 rpc -- common/autotest_common.sh@862 -- # return 0
00:06:11.151 00:00:57 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc
00:06:11.151 00:00:57 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc
00:06:11.151 00:00:57 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd
00:06:11.151 00:00:57 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity
00:06:11.151 00:00:57 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:11.151 00:00:57 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:11.151 00:00:57 rpc -- common/autotest_common.sh@10 -- # set +x
00:06:11.151 ************************************
00:06:11.151 START TEST rpc_integrity
00:06:11.151 ************************************
00:06:11.151 00:00:57 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity
00:06:11.151 00:00:57 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs
00:06:11.151 00:00:57 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:11.151 00:00:57 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:11.151 00:00:57 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:11.151 00:00:57 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]'
00:06:11.151 00:00:57 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length
00:06:11.151 00:00:57 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']'
00:06:11.151 00:00:57 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512
00:06:11.151 00:00:57 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:11.151 00:00:57 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:11.151 00:00:57 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:11.151 00:00:57 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0
00:06:11.151 00:00:57 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs
00:06:11.151 00:00:57 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:11.151 00:00:57 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:11.151 00:00:57 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:11.151 00:00:57 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[
00:06:11.151 {
00:06:11.151 "name": "Malloc0",
00:06:11.151 "aliases": [
00:06:11.151 "f914632e-7180-4488-befa-19562e407be1"
00:06:11.151 ],
00:06:11.151 "product_name": "Malloc disk",
00:06:11.151 "block_size": 512,
00:06:11.151 "num_blocks": 16384,
00:06:11.151 "uuid": "f914632e-7180-4488-befa-19562e407be1",
00:06:11.151 "assigned_rate_limits": {
00:06:11.151 "rw_ios_per_sec": 0,
00:06:11.151 "rw_mbytes_per_sec": 0,
00:06:11.151 "r_mbytes_per_sec": 0,
00:06:11.151 "w_mbytes_per_sec": 0
00:06:11.151 },
00:06:11.151 "claimed": false,
00:06:11.151 "zoned": false,
00:06:11.151 "supported_io_types": {
00:06:11.151 "read": true,
00:06:11.151 "write": true,
00:06:11.151 "unmap": true,
00:06:11.151 "flush": true,
00:06:11.151 "reset": true,
00:06:11.151 "nvme_admin": false,
00:06:11.151 "nvme_io": false,
00:06:11.151 "nvme_io_md": false,
00:06:11.151 "write_zeroes": true,
00:06:11.151 "zcopy": true,
00:06:11.151 "get_zone_info": false,
00:06:11.151 "zone_management": false,
00:06:11.151 "zone_append": false,
00:06:11.151 "compare": false,
00:06:11.151 "compare_and_write": false,
00:06:11.151 "abort": true,
00:06:11.151 "seek_hole": false,
00:06:11.151 "seek_data": false,
00:06:11.151 "copy": true,
00:06:11.151 "nvme_iov_md": false
00:06:11.151 },
00:06:11.151 "memory_domains": [
00:06:11.151 {
00:06:11.151 "dma_device_id": "system",
00:06:11.151 "dma_device_type": 1
00:06:11.151 },
00:06:11.151 {
00:06:11.151 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:06:11.151 "dma_device_type": 2
00:06:11.151 }
00:06:11.151 ],
00:06:11.151 "driver_specific": {}
00:06:11.151 }
00:06:11.151 ]'
00:06:11.151 00:00:57 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length
00:06:11.151 00:00:57 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']'
00:06:11.151 00:00:57 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0
00:06:11.151 00:00:57 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:11.151 00:00:57 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:11.151 [2024-07-16 00:00:57.982874] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0
00:06:11.151 [2024-07-16 00:00:57.982919] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:06:11.151 [2024-07-16 00:00:57.982946] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x93deb0
00:06:11.151 [2024-07-16 00:00:57.982959] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:06:11.151 [2024-07-16 00:00:57.984471] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:06:11.151 [2024-07-16 00:00:57.984500] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0
00:06:11.151 Passthru0
00:06:11.151 00:00:57 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:11.151 00:00:57 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs
00:06:11.151 00:00:57 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:11.151 00:00:57 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:11.151 00:00:58 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:11.151 00:00:58 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[
00:06:11.151 {
00:06:11.151 "name": "Malloc0",
00:06:11.151 "aliases": [
00:06:11.151 "f914632e-7180-4488-befa-19562e407be1"
00:06:11.151 ],
00:06:11.151 "product_name": "Malloc disk",
00:06:11.151 "block_size": 512,
00:06:11.151 "num_blocks": 16384,
00:06:11.151 "uuid": "f914632e-7180-4488-befa-19562e407be1",
00:06:11.151 "assigned_rate_limits": {
00:06:11.151 "rw_ios_per_sec": 0,
00:06:11.151 "rw_mbytes_per_sec": 0,
00:06:11.151 "r_mbytes_per_sec": 0,
00:06:11.151 "w_mbytes_per_sec": 0
00:06:11.151 },
00:06:11.151 "claimed": true,
00:06:11.151 "claim_type": "exclusive_write",
00:06:11.151 "zoned": false,
00:06:11.151 "supported_io_types": {
00:06:11.151 "read": true,
00:06:11.151 "write": true,
00:06:11.151 "unmap": true,
00:06:11.151 "flush": true,
00:06:11.151 "reset": true,
00:06:11.151 "nvme_admin": false,
00:06:11.151 "nvme_io": false,
00:06:11.151 "nvme_io_md": false,
00:06:11.151 "write_zeroes": true,
00:06:11.151 "zcopy": true,
00:06:11.151 "get_zone_info": false,
00:06:11.151 "zone_management": false,
00:06:11.151 "zone_append": false,
00:06:11.151 "compare": false,
00:06:11.151 "compare_and_write": false,
00:06:11.151 "abort": true,
00:06:11.151 "seek_hole": false,
00:06:11.151 "seek_data": false,
00:06:11.151 "copy": true,
00:06:11.151 "nvme_iov_md": false
00:06:11.151 },
00:06:11.151 "memory_domains": [
00:06:11.151 {
00:06:11.151 "dma_device_id": "system",
00:06:11.151 "dma_device_type": 1
00:06:11.151 },
00:06:11.151 {
00:06:11.151 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:06:11.151 "dma_device_type": 2
00:06:11.151 }
00:06:11.151 ],
00:06:11.151 "driver_specific": {}
00:06:11.151 },
00:06:11.151 {
00:06:11.151 "name": "Passthru0",
00:06:11.151 "aliases": [
00:06:11.151 "845a72fb-b795-5558-8b6a-e271c12a7dc8"
00:06:11.151 ],
00:06:11.151 "product_name": "passthru",
00:06:11.151 "block_size": 512,
00:06:11.151 "num_blocks": 16384,
00:06:11.151 "uuid": "845a72fb-b795-5558-8b6a-e271c12a7dc8",
00:06:11.151 "assigned_rate_limits": {
00:06:11.151 "rw_ios_per_sec": 0,
00:06:11.151 "rw_mbytes_per_sec": 0,
00:06:11.151 "r_mbytes_per_sec": 0,
00:06:11.151 "w_mbytes_per_sec": 0
00:06:11.151 },
00:06:11.151 "claimed": false,
00:06:11.151 "zoned": false,
00:06:11.151 "supported_io_types": {
00:06:11.151 "read": true,
00:06:11.151 "write": true,
00:06:11.151 "unmap": true,
00:06:11.151 "flush": true,
00:06:11.151 "reset": true,
00:06:11.151 "nvme_admin": false,
00:06:11.151 "nvme_io": false,
00:06:11.151 "nvme_io_md": false,
00:06:11.151 "write_zeroes": true,
00:06:11.151 "zcopy": true,
00:06:11.151 "get_zone_info": false,
00:06:11.151 "zone_management": false,
00:06:11.151 "zone_append": false,
00:06:11.151 "compare": false,
00:06:11.151 "compare_and_write": false,
00:06:11.151 "abort": true,
00:06:11.151 "seek_hole": false,
00:06:11.151 "seek_data": false,
00:06:11.151 "copy": true,
00:06:11.151 "nvme_iov_md": false
00:06:11.151 },
00:06:11.151 "memory_domains": [
00:06:11.151 {
00:06:11.151 "dma_device_id": "system",
00:06:11.151 "dma_device_type": 1
00:06:11.151 },
00:06:11.151 {
00:06:11.151 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:06:11.151 "dma_device_type": 2
00:06:11.151 }
00:06:11.151 ],
00:06:11.151 "driver_specific": {
00:06:11.151 "passthru": {
00:06:11.151 "name": "Passthru0",
00:06:11.151 "base_bdev_name": "Malloc0"
00:06:11.151 }
00:06:11.151 }
00:06:11.151 }
00:06:11.151 ]'
00:06:11.151 00:00:58 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length
00:06:11.151 00:00:58 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']'
00:06:11.151 00:00:58 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0
00:06:11.151 00:00:58 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:11.151 00:00:58 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:11.151 00:00:58 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:11.151 00:00:58 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0
00:06:11.151 00:00:58 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:11.151 00:00:58 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:11.151 00:00:58 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:11.151 00:00:58 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs
00:06:11.151 00:00:58 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:11.151 00:00:58 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:11.151 00:00:58 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:11.151 00:00:58 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]'
00:06:11.409 00:00:58 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length
00:06:11.409 00:00:58 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']'
00:06:11.409
00:06:11.409 real 0m0.299s
00:06:11.409 user 0m0.180s
00:06:11.409 sys 0m0.061s
00:06:11.409 00:00:58 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:11.409 00:00:58 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:11.409 ************************************
00:06:11.409 END TEST rpc_integrity
************************************
00:06:11.409 00:00:58 rpc -- common/autotest_common.sh@1142 -- # return 0
00:06:11.409 00:00:58 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins
00:06:11.409 00:00:58 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:11.409 00:00:58 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:11.409 00:00:58 rpc -- common/autotest_common.sh@10 -- # set +x
00:06:11.409 ************************************
00:06:11.409 START TEST rpc_plugins
00:06:11.409 ************************************
00:06:11.409 00:00:58 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins
00:06:11.409 00:00:58 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc
00:06:11.409 00:00:58 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:11.409 00:00:58 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x
00:06:11.409 00:00:58 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:11.409 00:00:58 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1
00:06:11.409 00:00:58 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs
00:06:11.409 00:00:58 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:11.409 00:00:58 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x
00:06:11.409 00:00:58 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:11.409 00:00:58 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[
00:06:11.409 {
00:06:11.409 "name": "Malloc1",
00:06:11.409 "aliases": [
00:06:11.409 "53bc777d-e247-49b2-ae47-4bd092002cd9"
00:06:11.409 ],
00:06:11.409 "product_name": "Malloc disk",
00:06:11.409 "block_size": 4096,
00:06:11.409 "num_blocks": 256,
00:06:11.409 "uuid": "53bc777d-e247-49b2-ae47-4bd092002cd9",
00:06:11.409 "assigned_rate_limits": {
00:06:11.409 "rw_ios_per_sec": 0,
00:06:11.409 "rw_mbytes_per_sec": 0,
00:06:11.409 "r_mbytes_per_sec": 0,
00:06:11.409 "w_mbytes_per_sec": 0
00:06:11.409 },
00:06:11.409 "claimed": false,
00:06:11.409 "zoned": false,
00:06:11.409 "supported_io_types": {
00:06:11.409 "read": true,
00:06:11.409 "write": true,
00:06:11.409 "unmap": true,
00:06:11.409 "flush": true,
00:06:11.409 "reset": true,
00:06:11.409 "nvme_admin": false,
00:06:11.409 "nvme_io": false,
00:06:11.409 "nvme_io_md": false,
00:06:11.409 "write_zeroes": true,
00:06:11.409 "zcopy": true,
00:06:11.409 "get_zone_info": false,
00:06:11.409 "zone_management": false,
00:06:11.409 "zone_append": false,
00:06:11.410 "compare": false,
00:06:11.410 "compare_and_write": false,
00:06:11.410 "abort": true,
00:06:11.410 "seek_hole": false,
00:06:11.410 "seek_data": false,
00:06:11.410 "copy": true,
00:06:11.410 "nvme_iov_md": false
00:06:11.410 },
00:06:11.410 "memory_domains": [
00:06:11.410 {
00:06:11.410 "dma_device_id": "system",
00:06:11.410 "dma_device_type": 1
00:06:11.410 },
00:06:11.410 {
00:06:11.410 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:06:11.410 "dma_device_type": 2
00:06:11.410 }
00:06:11.410 ],
00:06:11.410 "driver_specific": {}
00:06:11.410 }
00:06:11.410 ]'
00:06:11.410 00:00:58 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length
00:06:11.410 00:00:58 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']'
00:06:11.410 00:00:58 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1
00:06:11.410 00:00:58 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:11.410 00:00:58 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x
00:06:11.410 00:00:58 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:11.410 00:00:58 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs
00:06:11.410 00:00:58 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:11.410 00:00:58 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x
00:06:11.410 00:00:58 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:11.410 00:00:58 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]'
00:06:11.410 00:00:58 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length
00:06:11.667 00:00:58 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']'
00:06:11.667
00:06:11.667 real 0m0.156s
00:06:11.667 user 0m0.095s
00:06:11.667 sys 0m0.027s
00:06:11.667 00:00:58 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:11.667 00:00:58 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x
00:06:11.667 ************************************
00:06:11.667 END TEST rpc_plugins
00:06:11.667 ************************************
00:06:11.667 00:00:58 rpc -- common/autotest_common.sh@1142 -- # return 0
00:06:11.667 00:00:58 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test
00:06:11.667 00:00:58 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:11.667 00:00:58 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:11.667 00:00:58 rpc -- common/autotest_common.sh@10 -- # set +x
00:06:11.667 ************************************
00:06:11.667 START TEST rpc_trace_cmd_test
00:06:11.667 ************************************
00:06:11.667 00:00:58 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test
00:06:11.667 00:00:58 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info
00:06:11.667 00:00:58 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info
00:06:11.667 00:00:58 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:11.667 00:00:58 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x
00:06:11.667 00:00:58 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:11.667 00:00:58 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{
00:06:11.667 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid3447063",
00:06:11.667 "tpoint_group_mask": "0x8",
00:06:11.667 "iscsi_conn": {
00:06:11.667 "mask": "0x2",
00:06:11.667 "tpoint_mask": "0x0"
00:06:11.667 },
00:06:11.667 "scsi": {
00:06:11.667 "mask": "0x4",
00:06:11.667 "tpoint_mask": "0x0"
00:06:11.667 },
00:06:11.667 "bdev": {
00:06:11.667 "mask": "0x8",
00:06:11.667 "tpoint_mask": "0xffffffffffffffff"
00:06:11.667 },
00:06:11.667 "nvmf_rdma": {
00:06:11.667 "mask": "0x10",
00:06:11.667 "tpoint_mask": "0x0"
00:06:11.667 },
00:06:11.667 "nvmf_tcp": {
00:06:11.667 "mask": "0x20",
00:06:11.667 "tpoint_mask": "0x0"
00:06:11.667 },
00:06:11.667 "ftl": {
00:06:11.667 "mask": "0x40",
00:06:11.667 "tpoint_mask": "0x0"
00:06:11.667 },
00:06:11.667 "blobfs": {
00:06:11.667 "mask": "0x80",
00:06:11.667 "tpoint_mask": "0x0"
00:06:11.667 },
00:06:11.667 "dsa": {
00:06:11.667 "mask": "0x200",
00:06:11.667 "tpoint_mask": "0x0"
00:06:11.667 },
00:06:11.667 "thread": {
00:06:11.667 "mask": "0x400",
00:06:11.667 "tpoint_mask": "0x0"
00:06:11.667 },
00:06:11.668 "nvme_pcie": {
00:06:11.668 "mask": "0x800",
00:06:11.668 "tpoint_mask": "0x0"
00:06:11.668 },
00:06:11.668 "iaa": {
00:06:11.668 "mask": "0x1000",
00:06:11.668 "tpoint_mask": "0x0"
00:06:11.668 },
00:06:11.668 "nvme_tcp": {
00:06:11.668 "mask": "0x2000",
00:06:11.668 "tpoint_mask": "0x0"
00:06:11.668 },
00:06:11.668 "bdev_nvme": {
00:06:11.668 "mask": "0x4000",
00:06:11.668 "tpoint_mask": "0x0"
00:06:11.668 },
00:06:11.668 "sock": {
00:06:11.668 "mask": "0x8000",
00:06:11.668 "tpoint_mask": "0x0"
00:06:11.668 }
00:06:11.668 }'
00:06:11.668 00:00:58 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length
00:06:11.668 00:00:58 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']'
00:06:11.668 00:00:58 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")'
00:06:11.668 00:00:58 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']'
00:06:11.668 00:00:58 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")'
00:06:11.925 00:00:58 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']'
00:06:11.925 00:00:58 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")'
00:06:11.925 00:00:58 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']'
00:06:11.925 00:00:58 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask
00:06:11.925 00:00:58 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']'
00:06:11.925
00:06:11.925 real 0m0.249s
00:06:11.925 user 0m0.205s
00:06:11.925 sys 0m0.037s
00:06:11.925 00:00:58 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:11.925 00:00:58 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x
00:06:11.925 ************************************
00:06:11.925 END TEST rpc_trace_cmd_test
00:06:11.925 ************************************
00:06:11.925 00:00:58 rpc -- common/autotest_common.sh@1142 -- # return 0
00:06:11.925 00:00:58 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]]
00:06:11.925 00:00:58 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd
00:06:11.925 00:00:58 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity
00:06:11.925 00:00:58 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:11.925 00:00:58 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:11.925 00:00:58 rpc -- common/autotest_common.sh@10 -- # set +x
00:06:11.925 ************************************
00:06:11.925 START TEST rpc_daemon_integrity
00:06:11.925 ************************************
00:06:11.925 00:00:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity
00:06:11.925 00:00:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs
00:06:11.925 00:00:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:11.925 00:00:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:11.925 00:00:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:11.925 00:00:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]'
00:06:11.925 00:00:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length
00:06:11.925 00:00:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']'
00:06:11.925 00:00:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512
00:06:11.925 00:00:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:11.925 00:00:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:11.925 00:00:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:11.925 00:00:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2
00:06:11.925 00:00:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs
00:06:11.925 00:00:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:11.925 00:00:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:12.183 00:00:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:12.183 00:00:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[
00:06:12.183 {
00:06:12.183 "name": "Malloc2",
00:06:12.183 "aliases": [
00:06:12.183 "19ab02a6-4d81-4e3e-bc49-f53947d13de6"
00:06:12.183 ],
00:06:12.183 "product_name": "Malloc disk",
00:06:12.183 "block_size": 512,
00:06:12.183 "num_blocks": 16384,
00:06:12.183 "uuid": "19ab02a6-4d81-4e3e-bc49-f53947d13de6",
00:06:12.183 "assigned_rate_limits": {
00:06:12.183 "rw_ios_per_sec": 0,
00:06:12.183 "rw_mbytes_per_sec": 0,
00:06:12.183 "r_mbytes_per_sec": 0,
00:06:12.183 "w_mbytes_per_sec": 0
00:06:12.183 },
00:06:12.183 "claimed": false,
00:06:12.183 "zoned": false,
00:06:12.183 "supported_io_types": {
00:06:12.183 "read": true,
00:06:12.183 "write": true,
00:06:12.183 "unmap": true,
00:06:12.183 "flush": true,
00:06:12.183 "reset": true,
00:06:12.183 "nvme_admin": false,
00:06:12.183 "nvme_io": false,
00:06:12.183 "nvme_io_md": false,
00:06:12.183 "write_zeroes": true,
00:06:12.183 "zcopy": true,
00:06:12.183 "get_zone_info": false,
00:06:12.183 "zone_management": false,
00:06:12.183 "zone_append": false,
00:06:12.183 "compare": false,
00:06:12.183 "compare_and_write": false,
00:06:12.183 "abort": true,
00:06:12.183 "seek_hole": false,
00:06:12.183 "seek_data": false,
00:06:12.183 "copy": true,
00:06:12.183 "nvme_iov_md": false
00:06:12.183 },
00:06:12.183 "memory_domains": [
00:06:12.183 {
00:06:12.183 "dma_device_id": "system",
00:06:12.183 "dma_device_type": 1
00:06:12.183 },
00:06:12.183 {
00:06:12.183 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:06:12.183 "dma_device_type": 2
00:06:12.183 }
00:06:12.183 ],
00:06:12.183 "driver_specific": {}
00:06:12.183 }
00:06:12.183 ]'
00:06:12.183 00:00:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length
00:06:12.183 00:00:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']'
00:06:12.183 00:00:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0
00:06:12.183 00:00:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:12.183 00:00:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:12.183 [2024-07-16 00:00:58.933583] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2
00:06:12.183 [2024-07-16 00:00:58.933621] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:06:12.183 [2024-07-16 00:00:58.933639] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x93eb20
00:06:12.183 [2024-07-16 00:00:58.933652] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:06:12.183 [2024-07-16 00:00:58.935010] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:06:12.183 [2024-07-16 00:00:58.935037] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0
00:06:12.183 Passthru0
00:06:12.183 00:00:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:12.183 00:00:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs
00:06:12.183 00:00:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:12.183 00:00:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:12.183 00:00:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:12.183 00:00:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[
00:06:12.183 {
00:06:12.183 "name": "Malloc2",
00:06:12.183 "aliases": [
00:06:12.183 "19ab02a6-4d81-4e3e-bc49-f53947d13de6"
00:06:12.183 ],
00:06:12.183 "product_name": "Malloc disk",
00:06:12.183 "block_size": 512,
00:06:12.184 "num_blocks": 16384,
00:06:12.184 "uuid": "19ab02a6-4d81-4e3e-bc49-f53947d13de6",
00:06:12.184 "assigned_rate_limits": {
00:06:12.184 "rw_ios_per_sec": 0,
00:06:12.184 "rw_mbytes_per_sec": 0,
00:06:12.184 "r_mbytes_per_sec": 0,
00:06:12.184 "w_mbytes_per_sec": 0
00:06:12.184 },
00:06:12.184 "claimed": true,
00:06:12.184 "claim_type": "exclusive_write",
00:06:12.184 "zoned": false,
00:06:12.184 "supported_io_types": {
00:06:12.184 "read": true,
00:06:12.184 "write": true,
00:06:12.184 "unmap": true,
00:06:12.184 "flush": true,
00:06:12.184 "reset": true,
00:06:12.184 "nvme_admin": false,
00:06:12.184 "nvme_io": false,
00:06:12.184 "nvme_io_md": false,
00:06:12.184 "write_zeroes": true,
00:06:12.184 "zcopy": true,
00:06:12.184 "get_zone_info": false,
00:06:12.184 "zone_management": false,
00:06:12.184 "zone_append": false,
00:06:12.184 "compare": false,
00:06:12.184 "compare_and_write": false,
00:06:12.184 "abort": true,
00:06:12.184 "seek_hole": false,
00:06:12.184 "seek_data": false,
00:06:12.184 "copy": true,
00:06:12.184 "nvme_iov_md": false
00:06:12.184 },
00:06:12.184 "memory_domains": [
00:06:12.184 {
00:06:12.184 "dma_device_id": "system",
00:06:12.184 "dma_device_type": 1
00:06:12.184 },
00:06:12.184 {
00:06:12.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:06:12.184 "dma_device_type": 2
00:06:12.184 }
00:06:12.184 ],
00:06:12.184 "driver_specific": {}
00:06:12.184 },
00:06:12.184 {
00:06:12.184 "name": "Passthru0",
00:06:12.184 "aliases": [
00:06:12.184 "51d7bc94-686c-54c6-b6d1-952cdfb0e7d3"
00:06:12.184 ],
00:06:12.184 "product_name": "passthru",
00:06:12.184 "block_size": 512,
00:06:12.184 "num_blocks": 16384,
00:06:12.184 "uuid": "51d7bc94-686c-54c6-b6d1-952cdfb0e7d3",
00:06:12.184 "assigned_rate_limits": {
00:06:12.184 "rw_ios_per_sec": 0,
00:06:12.184 "rw_mbytes_per_sec": 0,
00:06:12.184 "r_mbytes_per_sec": 0,
00:06:12.184 "w_mbytes_per_sec": 0
00:06:12.184 },
00:06:12.184 "claimed": false,
00:06:12.184 "zoned": false,
00:06:12.184 "supported_io_types": {
00:06:12.184 "read": true,
00:06:12.184 "write": true,
00:06:12.184 "unmap": true,
00:06:12.184 "flush": true,
00:06:12.184 "reset": true,
00:06:12.184 "nvme_admin": false,
00:06:12.184 "nvme_io": false,
00:06:12.184 "nvme_io_md": false,
00:06:12.184 "write_zeroes": true,
00:06:12.184 "zcopy": true,
00:06:12.184 "get_zone_info": false,
00:06:12.184 "zone_management": false,
00:06:12.184 "zone_append": false,
00:06:12.184 "compare": false,
00:06:12.184 "compare_and_write": false,
00:06:12.184 "abort": true,
00:06:12.184 "seek_hole": false,
00:06:12.184 "seek_data": false,
00:06:12.184 "copy": true,
00:06:12.184 "nvme_iov_md": false
00:06:12.184 },
00:06:12.184 "memory_domains": [
00:06:12.184 {
00:06:12.184 "dma_device_id": "system",
00:06:12.184 "dma_device_type": 1
00:06:12.184 },
00:06:12.184 {
00:06:12.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:06:12.184 "dma_device_type": 2
00:06:12.184 }
00:06:12.184 ],
00:06:12.184 "driver_specific": {
00:06:12.184 "passthru": {
00:06:12.184 "name": "Passthru0",
00:06:12.184 "base_bdev_name": "Malloc2"
00:06:12.184 }
00:06:12.184 }
00:06:12.184 }
00:06:12.184 ]'
00:06:12.184 00:00:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length
00:06:12.184 00:00:59 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']'
00:06:12.184 00:00:59 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0
00:06:12.184 00:00:59 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:12.184 00:00:59 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:12.184 00:00:59 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:12.184 00:00:59 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2
00:06:12.184 00:00:59 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:12.184 00:00:59 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:12.184 00:00:59 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:12.184 00:00:59 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs
00:06:12.184 00:00:59 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:12.184 00:00:59 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:12.184 00:00:59 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:12.184 00:00:59 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]'
00:06:12.184 00:00:59 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length
00:06:12.184 00:00:59 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']'
00:06:12.184
00:06:12.184 real 0m0.301s
00:06:12.184 user 0m0.197s
00:06:12.184 sys 0m0.044s
00:06:12.184 00:00:59 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:12.184 00:00:59 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:12.184 ************************************
00:06:12.184 END TEST rpc_daemon_integrity
00:06:12.184 ************************************
00:06:12.184 00:00:59 rpc -- common/autotest_common.sh@1142 -- # return 0
00:06:12.184 00:00:59 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT
00:06:12.184 00:00:59 rpc -- rpc/rpc.sh@84 -- # killprocess 3447063
00:06:12.184 00:00:59 rpc -- common/autotest_common.sh@948 -- # '[' -z 3447063 ']'
00:06:12.184 00:00:59 rpc -- common/autotest_common.sh@952 -- # kill -0 3447063
00:06:12.443 00:00:59 rpc -- common/autotest_common.sh@953 -- # uname
00:06:12.443 00:00:59 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:06:12.443 00:00:59 rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3447063
00:06:12.443 00:00:59 rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:06:12.443 00:00:59 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:06:12.443 00:00:59 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3447063'
00:06:12.443 killing process with pid 3447063
00:06:12.443 00:00:59 rpc -- common/autotest_common.sh@967 -- # kill 3447063
00:06:12.443 00:00:59 rpc -- common/autotest_common.sh@972 -- # wait 3447063
00:06:12.701
00:06:12.701 real 0m2.866s
00:06:12.701 user 0m3.646s
00:06:12.701 sys 0m0.916s
00:06:12.701 00:00:59 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:12.701 00:00:59 rpc -- common/autotest_common.sh@10 -- # set +x
00:06:12.701 ************************************
00:06:12.701 END TEST rpc
00:06:12.701 ************************************
00:06:12.701 00:00:59 -- common/autotest_common.sh@1142 -- # return 0
00:06:12.701 00:00:59 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh
00:06:12.701 00:00:59 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:12.701 00:00:59 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:12.701 00:00:59 -- common/autotest_common.sh@10 -- # set +x
00:06:12.961 ************************************
00:06:12.961 START TEST skip_rpc
00:06:12.961 ************************************
00:06:12.961 00:00:59 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh
00:06:12.961 *
Looking for test storage... 00:06:12.961 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:12.961 00:00:59 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:12.961 00:00:59 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:12.961 00:00:59 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:12.961 00:00:59 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:12.961 00:00:59 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:12.961 00:00:59 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:12.961 ************************************ 00:06:12.961 START TEST skip_rpc 00:06:12.961 ************************************ 00:06:12.961 00:00:59 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:06:12.961 00:00:59 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=3447597 00:06:12.961 00:00:59 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:12.961 00:00:59 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:12.961 00:00:59 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:13.220 [2024-07-16 00:00:59.942214] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:06:13.220 [2024-07-16 00:00:59.942355] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3447597 ] 00:06:13.220 [2024-07-16 00:01:00.145836] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:13.480 [2024-07-16 00:01:00.249609] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- 
rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 3447597 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 3447597 ']' 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 3447597 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3447597 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3447597' 00:06:18.787 killing process with pid 3447597 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 3447597 00:06:18.787 00:01:04 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 3447597 00:06:18.787 00:06:18.787 real 0m5.449s 00:06:18.787 user 0m5.012s 00:06:18.787 sys 0m0.454s 00:06:18.787 00:01:05 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:18.787 00:01:05 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:18.787 ************************************ 00:06:18.787 END TEST skip_rpc 00:06:18.787 ************************************ 00:06:18.787 00:01:05 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:18.787 00:01:05 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:18.787 00:01:05 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:18.787 00:01:05 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:18.787 00:01:05 skip_rpc -- common/autotest_common.sh@10 
-- # set +x 00:06:18.787 ************************************ 00:06:18.787 START TEST skip_rpc_with_json 00:06:18.787 ************************************ 00:06:18.787 00:01:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:06:18.787 00:01:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:18.787 00:01:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=3448332 00:06:18.787 00:01:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:18.787 00:01:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:18.787 00:01:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 3448332 00:06:18.787 00:01:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 3448332 ']' 00:06:18.787 00:01:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:18.787 00:01:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:18.787 00:01:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:18.787 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:18.787 00:01:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:18.787 00:01:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:18.787 [2024-07-16 00:01:05.419535] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:06:18.787 [2024-07-16 00:01:05.419613] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3448332 ] 00:06:18.787 [2024-07-16 00:01:05.549299] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:18.787 [2024-07-16 00:01:05.650978] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.354 00:01:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:19.354 00:01:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:06:19.354 00:01:06 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:19.354 00:01:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:19.354 00:01:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:19.354 [2024-07-16 00:01:06.213321] nvmf_rpc.c:2569:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:19.354 request: 00:06:19.354 { 00:06:19.354 "trtype": "tcp", 00:06:19.354 "method": "nvmf_get_transports", 00:06:19.354 "req_id": 1 00:06:19.354 } 00:06:19.354 Got JSON-RPC error response 00:06:19.354 response: 00:06:19.354 { 00:06:19.354 "code": -19, 00:06:19.354 "message": "No such device" 00:06:19.354 } 00:06:19.354 00:01:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:19.354 00:01:06 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:19.354 00:01:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:19.354 00:01:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:19.354 [2024-07-16 00:01:06.225464] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:19.354 00:01:06 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:19.354 00:01:06 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:19.354 00:01:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:19.354 00:01:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:19.612 00:01:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:19.612 00:01:06 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:19.612 { 00:06:19.612 "subsystems": [ 00:06:19.612 { 00:06:19.612 "subsystem": "keyring", 00:06:19.612 "config": [] 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "subsystem": "iobuf", 00:06:19.612 "config": [ 00:06:19.612 { 00:06:19.612 "method": "iobuf_set_options", 00:06:19.612 "params": { 00:06:19.612 "small_pool_count": 8192, 00:06:19.612 "large_pool_count": 1024, 00:06:19.612 "small_bufsize": 8192, 00:06:19.612 "large_bufsize": 135168 00:06:19.612 } 00:06:19.612 } 00:06:19.612 ] 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "subsystem": "sock", 00:06:19.612 "config": [ 00:06:19.612 { 00:06:19.612 "method": "sock_set_default_impl", 00:06:19.612 "params": { 00:06:19.612 "impl_name": "posix" 00:06:19.612 } 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "method": "sock_impl_set_options", 00:06:19.612 "params": { 00:06:19.612 "impl_name": "ssl", 00:06:19.612 "recv_buf_size": 4096, 00:06:19.612 "send_buf_size": 4096, 00:06:19.612 "enable_recv_pipe": true, 00:06:19.612 "enable_quickack": false, 00:06:19.612 "enable_placement_id": 0, 00:06:19.612 "enable_zerocopy_send_server": true, 00:06:19.612 "enable_zerocopy_send_client": false, 00:06:19.612 "zerocopy_threshold": 0, 00:06:19.612 "tls_version": 0, 00:06:19.612 "enable_ktls": false 00:06:19.612 } 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "method": "sock_impl_set_options", 00:06:19.612 "params": { 
00:06:19.612 "impl_name": "posix", 00:06:19.612 "recv_buf_size": 2097152, 00:06:19.612 "send_buf_size": 2097152, 00:06:19.612 "enable_recv_pipe": true, 00:06:19.612 "enable_quickack": false, 00:06:19.612 "enable_placement_id": 0, 00:06:19.612 "enable_zerocopy_send_server": true, 00:06:19.612 "enable_zerocopy_send_client": false, 00:06:19.612 "zerocopy_threshold": 0, 00:06:19.612 "tls_version": 0, 00:06:19.612 "enable_ktls": false 00:06:19.612 } 00:06:19.612 } 00:06:19.612 ] 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "subsystem": "vmd", 00:06:19.612 "config": [] 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "subsystem": "accel", 00:06:19.612 "config": [ 00:06:19.612 { 00:06:19.612 "method": "accel_set_options", 00:06:19.612 "params": { 00:06:19.612 "small_cache_size": 128, 00:06:19.612 "large_cache_size": 16, 00:06:19.612 "task_count": 2048, 00:06:19.612 "sequence_count": 2048, 00:06:19.612 "buf_count": 2048 00:06:19.612 } 00:06:19.612 } 00:06:19.612 ] 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "subsystem": "bdev", 00:06:19.612 "config": [ 00:06:19.612 { 00:06:19.612 "method": "bdev_set_options", 00:06:19.612 "params": { 00:06:19.612 "bdev_io_pool_size": 65535, 00:06:19.612 "bdev_io_cache_size": 256, 00:06:19.612 "bdev_auto_examine": true, 00:06:19.612 "iobuf_small_cache_size": 128, 00:06:19.612 "iobuf_large_cache_size": 16 00:06:19.612 } 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "method": "bdev_raid_set_options", 00:06:19.612 "params": { 00:06:19.612 "process_window_size_kb": 1024 00:06:19.612 } 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "method": "bdev_iscsi_set_options", 00:06:19.612 "params": { 00:06:19.612 "timeout_sec": 30 00:06:19.612 } 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "method": "bdev_nvme_set_options", 00:06:19.612 "params": { 00:06:19.612 "action_on_timeout": "none", 00:06:19.612 "timeout_us": 0, 00:06:19.612 "timeout_admin_us": 0, 00:06:19.612 "keep_alive_timeout_ms": 10000, 00:06:19.612 "arbitration_burst": 0, 00:06:19.612 
"low_priority_weight": 0, 00:06:19.612 "medium_priority_weight": 0, 00:06:19.612 "high_priority_weight": 0, 00:06:19.612 "nvme_adminq_poll_period_us": 10000, 00:06:19.612 "nvme_ioq_poll_period_us": 0, 00:06:19.612 "io_queue_requests": 0, 00:06:19.612 "delay_cmd_submit": true, 00:06:19.612 "transport_retry_count": 4, 00:06:19.612 "bdev_retry_count": 3, 00:06:19.612 "transport_ack_timeout": 0, 00:06:19.612 "ctrlr_loss_timeout_sec": 0, 00:06:19.612 "reconnect_delay_sec": 0, 00:06:19.612 "fast_io_fail_timeout_sec": 0, 00:06:19.612 "disable_auto_failback": false, 00:06:19.612 "generate_uuids": false, 00:06:19.612 "transport_tos": 0, 00:06:19.612 "nvme_error_stat": false, 00:06:19.612 "rdma_srq_size": 0, 00:06:19.612 "io_path_stat": false, 00:06:19.612 "allow_accel_sequence": false, 00:06:19.612 "rdma_max_cq_size": 0, 00:06:19.612 "rdma_cm_event_timeout_ms": 0, 00:06:19.612 "dhchap_digests": [ 00:06:19.612 "sha256", 00:06:19.612 "sha384", 00:06:19.612 "sha512" 00:06:19.612 ], 00:06:19.612 "dhchap_dhgroups": [ 00:06:19.612 "null", 00:06:19.612 "ffdhe2048", 00:06:19.612 "ffdhe3072", 00:06:19.612 "ffdhe4096", 00:06:19.612 "ffdhe6144", 00:06:19.612 "ffdhe8192" 00:06:19.612 ] 00:06:19.612 } 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "method": "bdev_nvme_set_hotplug", 00:06:19.612 "params": { 00:06:19.612 "period_us": 100000, 00:06:19.612 "enable": false 00:06:19.612 } 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "method": "bdev_wait_for_examine" 00:06:19.612 } 00:06:19.612 ] 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "subsystem": "scsi", 00:06:19.612 "config": null 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "subsystem": "scheduler", 00:06:19.612 "config": [ 00:06:19.612 { 00:06:19.612 "method": "framework_set_scheduler", 00:06:19.612 "params": { 00:06:19.612 "name": "static" 00:06:19.612 } 00:06:19.612 } 00:06:19.612 ] 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "subsystem": "vhost_scsi", 00:06:19.612 "config": [] 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "subsystem": 
"vhost_blk", 00:06:19.612 "config": [] 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "subsystem": "ublk", 00:06:19.612 "config": [] 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "subsystem": "nbd", 00:06:19.612 "config": [] 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "subsystem": "nvmf", 00:06:19.612 "config": [ 00:06:19.612 { 00:06:19.612 "method": "nvmf_set_config", 00:06:19.612 "params": { 00:06:19.612 "discovery_filter": "match_any", 00:06:19.612 "admin_cmd_passthru": { 00:06:19.612 "identify_ctrlr": false 00:06:19.612 } 00:06:19.612 } 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "method": "nvmf_set_max_subsystems", 00:06:19.612 "params": { 00:06:19.612 "max_subsystems": 1024 00:06:19.612 } 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "method": "nvmf_set_crdt", 00:06:19.612 "params": { 00:06:19.612 "crdt1": 0, 00:06:19.612 "crdt2": 0, 00:06:19.612 "crdt3": 0 00:06:19.612 } 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "method": "nvmf_create_transport", 00:06:19.612 "params": { 00:06:19.612 "trtype": "TCP", 00:06:19.612 "max_queue_depth": 128, 00:06:19.612 "max_io_qpairs_per_ctrlr": 127, 00:06:19.612 "in_capsule_data_size": 4096, 00:06:19.612 "max_io_size": 131072, 00:06:19.612 "io_unit_size": 131072, 00:06:19.612 "max_aq_depth": 128, 00:06:19.612 "num_shared_buffers": 511, 00:06:19.612 "buf_cache_size": 4294967295, 00:06:19.612 "dif_insert_or_strip": false, 00:06:19.612 "zcopy": false, 00:06:19.612 "c2h_success": true, 00:06:19.612 "sock_priority": 0, 00:06:19.612 "abort_timeout_sec": 1, 00:06:19.612 "ack_timeout": 0, 00:06:19.612 "data_wr_pool_size": 0 00:06:19.612 } 00:06:19.612 } 00:06:19.612 ] 00:06:19.612 }, 00:06:19.612 { 00:06:19.612 "subsystem": "iscsi", 00:06:19.612 "config": [ 00:06:19.612 { 00:06:19.612 "method": "iscsi_set_options", 00:06:19.612 "params": { 00:06:19.612 "node_base": "iqn.2016-06.io.spdk", 00:06:19.612 "max_sessions": 128, 00:06:19.612 "max_connections_per_session": 2, 00:06:19.612 "max_queue_depth": 64, 00:06:19.612 "default_time2wait": 2, 
00:06:19.612 "default_time2retain": 20, 00:06:19.612 "first_burst_length": 8192, 00:06:19.612 "immediate_data": true, 00:06:19.612 "allow_duplicated_isid": false, 00:06:19.612 "error_recovery_level": 0, 00:06:19.612 "nop_timeout": 60, 00:06:19.612 "nop_in_interval": 30, 00:06:19.612 "disable_chap": false, 00:06:19.612 "require_chap": false, 00:06:19.612 "mutual_chap": false, 00:06:19.612 "chap_group": 0, 00:06:19.612 "max_large_datain_per_connection": 64, 00:06:19.612 "max_r2t_per_connection": 4, 00:06:19.612 "pdu_pool_size": 36864, 00:06:19.612 "immediate_data_pool_size": 16384, 00:06:19.612 "data_out_pool_size": 2048 00:06:19.612 } 00:06:19.612 } 00:06:19.612 ] 00:06:19.612 } 00:06:19.612 ] 00:06:19.612 } 00:06:19.612 00:01:06 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:19.612 00:01:06 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 3448332 00:06:19.612 00:01:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 3448332 ']' 00:06:19.612 00:01:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 3448332 00:06:19.612 00:01:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:06:19.612 00:01:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:19.612 00:01:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3448332 00:06:19.612 00:01:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:19.612 00:01:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:19.612 00:01:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3448332' 00:06:19.612 killing process with pid 3448332 00:06:19.612 00:01:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 3448332 00:06:19.612 00:01:06 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 3448332 00:06:20.179 00:01:06 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=3448514 00:06:20.179 00:01:06 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:20.179 00:01:06 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:25.451 00:01:11 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 3448514 00:06:25.451 00:01:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 3448514 ']' 00:06:25.451 00:01:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 3448514 00:06:25.451 00:01:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:06:25.451 00:01:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:25.451 00:01:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3448514 00:06:25.451 00:01:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:25.451 00:01:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:25.451 00:01:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3448514' 00:06:25.451 killing process with pid 3448514 00:06:25.451 00:01:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 3448514 00:06:25.451 00:01:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 3448514 00:06:25.451 00:01:12 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:25.451 00:01:12 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:25.451 00:06:25.451 real 0m6.916s 00:06:25.451 user 0m6.508s 00:06:25.451 sys 0m0.833s 00:06:25.451 00:01:12 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:25.451 00:01:12 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:25.451 ************************************ 00:06:25.451 END TEST skip_rpc_with_json 00:06:25.451 ************************************ 00:06:25.451 00:01:12 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:25.451 00:01:12 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:25.451 00:01:12 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:25.451 00:01:12 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:25.451 00:01:12 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:25.451 ************************************ 00:06:25.451 START TEST skip_rpc_with_delay 00:06:25.451 ************************************ 00:06:25.451 00:01:12 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:06:25.451 00:01:12 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:25.451 00:01:12 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:06:25.451 00:01:12 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:25.451 00:01:12 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:25.451 00:01:12 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:25.451 00:01:12 
skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:25.451 00:01:12 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:25.451 00:01:12 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:25.451 00:01:12 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:25.451 00:01:12 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:25.451 00:01:12 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:25.451 00:01:12 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:25.708 [2024-07-16 00:01:12.421700] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
00:06:25.708 [2024-07-16 00:01:12.421798] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:06:25.708 00:01:12 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:06:25.708 00:01:12 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:25.708 00:01:12 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:25.708 00:01:12 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:25.708 00:06:25.708 real 0m0.093s 00:06:25.708 user 0m0.054s 00:06:25.708 sys 0m0.038s 00:06:25.708 00:01:12 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:25.708 00:01:12 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:25.708 ************************************ 00:06:25.708 END TEST skip_rpc_with_delay 00:06:25.709 ************************************ 00:06:25.709 00:01:12 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:25.709 00:01:12 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:25.709 00:01:12 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:25.709 00:01:12 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:25.709 00:01:12 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:25.709 00:01:12 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:25.709 00:01:12 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:25.709 ************************************ 00:06:25.709 START TEST exit_on_failed_rpc_init 00:06:25.709 ************************************ 00:06:25.709 00:01:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:06:25.709 00:01:12 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=3449280 00:06:25.709 00:01:12 skip_rpc.exit_on_failed_rpc_init -- 
rpc/skip_rpc.sh@63 -- # waitforlisten 3449280 00:06:25.709 00:01:12 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:25.709 00:01:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 3449280 ']' 00:06:25.709 00:01:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:25.709 00:01:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:25.709 00:01:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:25.709 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:25.709 00:01:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:25.709 00:01:12 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:25.966 [2024-07-16 00:01:12.661465] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:06:25.966 [2024-07-16 00:01:12.661603] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3449280 ] 00:06:25.966 [2024-07-16 00:01:12.859219] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:26.224 [2024-07-16 00:01:12.960533] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.788 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:26.788 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:06:26.788 00:01:13 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:26.788 00:01:13 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:26.788 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:06:26.788 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:26.788 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:26.788 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:26.788 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:26.788 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:26.788 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:26.788 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:26.788 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:26.788 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:26.788 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:26.788 [2024-07-16 00:01:13.595854] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:06:26.788 [2024-07-16 00:01:13.595940] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3449455 ] 00:06:26.788 [2024-07-16 00:01:13.730794] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:27.047 [2024-07-16 00:01:13.843326] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:27.047 [2024-07-16 00:01:13.843421] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:06:27.047 [2024-07-16 00:01:13.843442] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:27.047 [2024-07-16 00:01:13.843458] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:27.047 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:06:27.047 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:27.047 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:06:27.047 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:06:27.047 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:06:27.047 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:27.047 00:01:13 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:27.047 00:01:13 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 3449280 00:06:27.047 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 3449280 ']' 00:06:27.047 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 3449280 00:06:27.047 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname 00:06:27.047 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:27.047 00:01:13 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3449280 00:06:27.303 00:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:27.303 00:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:27.303 00:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3449280' 
00:06:27.303 killing process with pid 3449280 00:06:27.303 00:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 3449280 00:06:27.303 00:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 3449280 00:06:27.561 00:06:27.561 real 0m1.862s 00:06:27.561 user 0m2.118s 00:06:27.561 sys 0m0.678s 00:06:27.561 00:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:27.561 00:01:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:27.561 ************************************ 00:06:27.561 END TEST exit_on_failed_rpc_init 00:06:27.561 ************************************ 00:06:27.561 00:01:14 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:27.561 00:01:14 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:27.562 00:06:27.562 real 0m14.780s 00:06:27.562 user 0m13.864s 00:06:27.562 sys 0m2.326s 00:06:27.562 00:01:14 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:27.562 00:01:14 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:27.562 ************************************ 00:06:27.562 END TEST skip_rpc 00:06:27.562 ************************************ 00:06:27.562 00:01:14 -- common/autotest_common.sh@1142 -- # return 0 00:06:27.562 00:01:14 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:27.562 00:01:14 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:27.562 00:01:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:27.562 00:01:14 -- common/autotest_common.sh@10 -- # set +x 00:06:27.820 ************************************ 00:06:27.820 START TEST rpc_client 00:06:27.820 ************************************ 00:06:27.820 00:01:14 rpc_client -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:27.820 * Looking for test storage... 00:06:27.821 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:06:27.821 00:01:14 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:27.821 OK 00:06:27.821 00:01:14 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:27.821 00:06:27.821 real 0m0.137s 00:06:27.821 user 0m0.069s 00:06:27.821 sys 0m0.079s 00:06:27.821 00:01:14 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:27.821 00:01:14 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:27.821 ************************************ 00:06:27.821 END TEST rpc_client 00:06:27.821 ************************************ 00:06:27.821 00:01:14 -- common/autotest_common.sh@1142 -- # return 0 00:06:27.821 00:01:14 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:27.821 00:01:14 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:27.821 00:01:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:27.821 00:01:14 -- common/autotest_common.sh@10 -- # set +x 00:06:27.821 ************************************ 00:06:27.821 START TEST json_config 00:06:27.821 ************************************ 00:06:27.821 00:01:14 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:28.080 00:01:14 
json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:28.080 00:01:14 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:28.080 00:01:14 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:28.080 00:01:14 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:28.080 00:01:14 json_config -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:28.080 00:01:14 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:28.080 00:01:14 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:28.080 00:01:14 json_config -- paths/export.sh@5 -- # export PATH 00:06:28.080 00:01:14 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@47 -- # : 0 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:28.080 
00:01:14 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:28.080 00:01:14 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@34 -- # 
configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:06:28.080 INFO: JSON configuration test init 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:06:28.080 00:01:14 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:28.080 00:01:14 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:06:28.080 00:01:14 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:28.080 00:01:14 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:28.080 00:01:14 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:06:28.080 00:01:14 json_config -- json_config/common.sh@9 -- # local app=target 00:06:28.080 00:01:14 json_config -- json_config/common.sh@10 -- # shift 00:06:28.080 00:01:14 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:28.080 00:01:14 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:28.080 00:01:14 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:28.080 00:01:14 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:28.080 00:01:14 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 
00:06:28.080 00:01:14 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=3449742 00:06:28.080 00:01:14 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:28.080 Waiting for target to run... 00:06:28.080 00:01:14 json_config -- json_config/common.sh@25 -- # waitforlisten 3449742 /var/tmp/spdk_tgt.sock 00:06:28.080 00:01:14 json_config -- common/autotest_common.sh@829 -- # '[' -z 3449742 ']' 00:06:28.080 00:01:14 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:06:28.080 00:01:14 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:28.080 00:01:14 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:28.080 00:01:14 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:28.080 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:28.080 00:01:14 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:28.080 00:01:14 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:28.080 [2024-07-16 00:01:14.943958] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:06:28.080 [2024-07-16 00:01:14.944038] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3449742 ] 00:06:28.647 [2024-07-16 00:01:15.547387] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:28.906 [2024-07-16 00:01:15.652708] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.164 00:01:15 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:29.164 00:01:15 json_config -- common/autotest_common.sh@862 -- # return 0 00:06:29.164 00:01:15 json_config -- json_config/common.sh@26 -- # echo '' 00:06:29.164 00:06:29.164 00:01:15 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:06:29.164 00:01:15 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:06:29.164 00:01:15 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:29.164 00:01:15 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:29.164 00:01:15 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:06:29.164 00:01:15 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:06:29.164 00:01:15 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:06:29.422 00:01:16 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:29.422 00:01:16 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:29.422 [2024-07-16 00:01:16.342894] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module 
dpdk_cryptodev 00:06:29.422 00:01:16 json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:29.422 00:01:16 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:29.679 [2024-07-16 00:01:16.591535] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:29.679 00:01:16 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:06:29.679 00:01:16 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:29.679 00:01:16 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:29.938 00:01:16 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:06:29.938 00:01:16 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:06:29.938 00:01:16 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:06:30.195 [2024-07-16 00:01:16.905053] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:32.732 00:01:19 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:06:32.732 00:01:19 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:06:32.732 00:01:19 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:32.732 00:01:19 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:32.732 00:01:19 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:06:32.732 00:01:19 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:06:32.732 00:01:19 json_config -- json_config/json_config.sh@46 -- # local enabled_types 
00:06:32.732 00:01:19 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:06:32.732 00:01:19 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:06:32.732 00:01:19 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:06:32.991 00:01:19 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:06:32.991 00:01:19 json_config -- json_config/json_config.sh@48 -- # local get_types 00:06:32.991 00:01:19 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:06:32.991 00:01:19 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:06:32.991 00:01:19 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:32.991 00:01:19 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:32.991 00:01:19 json_config -- json_config/json_config.sh@55 -- # return 0 00:06:32.991 00:01:19 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:06:32.991 00:01:19 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:06:32.991 00:01:19 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:06:32.991 00:01:19 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:32.991 00:01:19 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:32.991 00:01:19 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:06:32.991 00:01:19 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:06:32.991 00:01:19 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:06:32.991 00:01:19 json_config -- json_config/json_config.sh@111 -- # get_notifications 00:06:32.991 00:01:19 
json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:32.991 00:01:19 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:32.991 00:01:19 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:32.991 00:01:19 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:32.991 00:01:19 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:32.991 00:01:19 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:33.251 00:01:20 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:33.251 00:01:20 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:33.251 00:01:20 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:33.251 00:01:20 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:06:33.251 00:01:20 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:06:33.251 00:01:20 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:06:33.251 00:01:20 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:06:33.510 Nvme0n1p0 Nvme0n1p1 00:06:33.510 00:01:20 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:06:33.510 00:01:20 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:06:33.769 [2024-07-16 00:01:20.514665] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:33.769 [2024-07-16 00:01:20.514731] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find 
bdev with name: Malloc0 00:06:33.769 00:06:33.769 00:01:20 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:06:33.769 00:01:20 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:06:34.028 Malloc3 00:06:34.028 00:01:20 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:34.028 00:01:20 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:34.287 [2024-07-16 00:01:21.004045] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:34.287 [2024-07-16 00:01:21.004097] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:34.287 [2024-07-16 00:01:21.004123] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1309a00 00:06:34.287 [2024-07-16 00:01:21.004136] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:34.287 [2024-07-16 00:01:21.005754] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:34.287 [2024-07-16 00:01:21.005784] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:34.287 PTBdevFromMalloc3 00:06:34.287 00:01:21 json_config -- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:06:34.287 00:01:21 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:06:34.556 Null0 00:06:34.556 00:01:21 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:06:34.556 00:01:21 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:06:34.556 Malloc0 00:06:34.556 00:01:21 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:06:34.556 00:01:21 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:06:34.815 Malloc1 00:06:34.815 00:01:21 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:06:34.815 00:01:21 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:06:35.382 102400+0 records in 00:06:35.382 102400+0 records out 00:06:35.382 104857600 bytes (105 MB, 100 MiB) copied, 0.30975 s, 339 MB/s 00:06:35.382 00:01:22 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:06:35.382 00:01:22 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:06:35.382 aio_disk 00:06:35.382 00:01:22 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:06:35.382 00:01:22 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:35.382 00:01:22 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:40.701 7e0f0f59-ebda-4c2e-b6c2-796cac31cf8c 
00:06:40.701 00:01:27 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:06:40.701 00:01:27 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:06:40.701 00:01:27 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:06:40.701 00:01:27 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:06:40.701 00:01:27 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:06:40.701 00:01:27 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:40.701 00:01:27 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:40.959 00:01:27 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:40.959 00:01:27 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:41.217 00:01:27 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:06:41.217 00:01:27 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:41.217 00:01:27 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:41.474 MallocForCryptoBdev 00:06:41.474 00:01:28 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:06:41.474 00:01:28 json_config -- json_config/json_config.sh@159 -- # wc -l 00:06:41.474 00:01:28 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:06:41.474 00:01:28 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:06:41.474 00:01:28 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:41.474 00:01:28 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:42.040 [2024-07-16 00:01:28.725883] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:06:42.040 CryptoMallocBdev 00:06:42.040 00:01:28 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:06:42.040 00:01:28 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:06:42.040 00:01:28 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:22158e3b-b3d7-4288-9553-07fc29bbf045 bdev_register:82c56201-1e8a-422d-bde8-ffaeaddace4d bdev_register:1e8bc889-3300-440f-8e30-39420ad3e054 bdev_register:cae666b0-4365-4551-9b73-5b3b1282973d 
bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:42.040 00:01:28 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:06:42.040 00:01:28 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:06:42.040 00:01:28 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:06:42.040 00:01:28 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:22158e3b-b3d7-4288-9553-07fc29bbf045 bdev_register:82c56201-1e8a-422d-bde8-ffaeaddace4d bdev_register:1e8bc889-3300-440f-8e30-39420ad3e054 bdev_register:cae666b0-4365-4551-9b73-5b3b1282973d bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:42.040 00:01:28 json_config -- json_config/json_config.sh@71 -- # sort 00:06:42.040 00:01:28 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:06:42.040 00:01:28 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:06:42.040 00:01:28 json_config -- json_config/json_config.sh@72 -- # sort 00:06:42.040 00:01:28 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:42.040 00:01:28 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.040 00:01:28 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.040 00:01:28 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:42.040 00:01:28 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:42.040 00:01:28 json_config -- 
json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.300 00:01:29 json_config -- 
json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:22158e3b-b3d7-4288-9553-07fc29bbf045 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:82c56201-1e8a-422d-bde8-ffaeaddace4d 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # IFS=: 
00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:1e8bc889-3300-440f-8e30-39420ad3e054 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:cae666b0-4365-4551-9b73-5b3b1282973d 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:1e8bc889-3300-440f-8e30-39420ad3e054 bdev_register:22158e3b-b3d7-4288-9553-07fc29bbf045 bdev_register:82c56201-1e8a-422d-bde8-ffaeaddace4d bdev_register:aio_disk bdev_register:cae666b0-4365-4551-9b73-5b3b1282973d bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != 
\b\d\e\v\_\r\e\g\i\s\t\e\r\:\1\e\8\b\c\8\8\9\-\3\3\0\0\-\4\4\0\f\-\8\e\3\0\-\3\9\4\2\0\a\d\3\e\0\5\4\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\2\2\1\5\8\e\3\b\-\b\3\d\7\-\4\2\8\8\-\9\5\5\3\-\0\7\f\c\2\9\b\b\f\0\4\5\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\8\2\c\5\6\2\0\1\-\1\e\8\a\-\4\2\2\d\-\b\d\e\8\-\f\f\a\e\a\d\d\a\c\e\4\d\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\c\a\e\6\6\6\b\0\-\4\3\6\5\-\4\5\5\1\-\9\b\7\3\-\5\b\3\b\1\2\8\2\9\7\3\d\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@86 -- # cat 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:1e8bc889-3300-440f-8e30-39420ad3e054 bdev_register:22158e3b-b3d7-4288-9553-07fc29bbf045 bdev_register:82c56201-1e8a-422d-bde8-ffaeaddace4d bdev_register:aio_disk bdev_register:cae666b0-4365-4551-9b73-5b3b1282973d bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:06:42.300 Expected events matched: 00:06:42.300 bdev_register:1e8bc889-3300-440f-8e30-39420ad3e054 00:06:42.300 bdev_register:22158e3b-b3d7-4288-9553-07fc29bbf045 00:06:42.300 
bdev_register:82c56201-1e8a-422d-bde8-ffaeaddace4d 00:06:42.300 bdev_register:aio_disk 00:06:42.300 bdev_register:cae666b0-4365-4551-9b73-5b3b1282973d 00:06:42.300 bdev_register:CryptoMallocBdev 00:06:42.300 bdev_register:Malloc0 00:06:42.300 bdev_register:Malloc0p0 00:06:42.300 bdev_register:Malloc0p1 00:06:42.300 bdev_register:Malloc0p2 00:06:42.300 bdev_register:Malloc1 00:06:42.300 bdev_register:Malloc3 00:06:42.300 bdev_register:MallocForCryptoBdev 00:06:42.300 bdev_register:Null0 00:06:42.300 bdev_register:Nvme0n1 00:06:42.300 bdev_register:Nvme0n1p0 00:06:42.300 bdev_register:Nvme0n1p1 00:06:42.300 bdev_register:PTBdevFromMalloc3 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:06:42.300 00:01:29 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:42.300 00:01:29 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:06:42.300 00:01:29 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:42.300 00:01:29 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:06:42.300 00:01:29 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:42.301 00:01:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:42.559 MallocBdevForConfigChangeCheck 00:06:42.559 00:01:29 json_config -- 
json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:06:42.559 00:01:29 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:42.559 00:01:29 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:42.559 00:01:29 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:06:42.559 00:01:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:42.818 00:01:29 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:06:42.818 INFO: shutting down applications... 00:06:42.818 00:01:29 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:06:42.818 00:01:29 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:06:42.818 00:01:29 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:06:42.818 00:01:29 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:06:43.077 [2024-07-16 00:01:29.925583] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:06:46.361 Calling clear_iscsi_subsystem 00:06:46.361 Calling clear_nvmf_subsystem 00:06:46.361 Calling clear_nbd_subsystem 00:06:46.361 Calling clear_ublk_subsystem 00:06:46.361 Calling clear_vhost_blk_subsystem 00:06:46.361 Calling clear_vhost_scsi_subsystem 00:06:46.361 Calling clear_bdev_subsystem 00:06:46.361 00:01:32 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:06:46.361 00:01:32 json_config -- json_config/json_config.sh@343 -- # count=100 00:06:46.361 00:01:32 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:06:46.361 00:01:32 json_config -- json_config/json_config.sh@345 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:46.361 00:01:32 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:06:46.361 00:01:32 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:06:46.361 00:01:33 json_config -- json_config/json_config.sh@345 -- # break 00:06:46.361 00:01:33 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:06:46.361 00:01:33 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:06:46.361 00:01:33 json_config -- json_config/common.sh@31 -- # local app=target 00:06:46.361 00:01:33 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:46.361 00:01:33 json_config -- json_config/common.sh@35 -- # [[ -n 3449742 ]] 00:06:46.361 00:01:33 json_config -- json_config/common.sh@38 -- # kill -SIGINT 3449742 00:06:46.361 00:01:33 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:46.361 00:01:33 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:46.361 00:01:33 json_config -- json_config/common.sh@41 -- # kill -0 3449742 00:06:46.361 00:01:33 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:06:46.927 00:01:33 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:06:46.927 00:01:33 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:46.927 00:01:33 json_config -- json_config/common.sh@41 -- # kill -0 3449742 00:06:46.927 00:01:33 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:46.927 00:01:33 json_config -- json_config/common.sh@43 -- # break 00:06:46.927 00:01:33 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:46.927 00:01:33 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:46.927 SPDK target 
shutdown done 00:06:46.927 00:01:33 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:06:46.927 INFO: relaunching applications... 00:06:46.927 00:01:33 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:46.927 00:01:33 json_config -- json_config/common.sh@9 -- # local app=target 00:06:46.927 00:01:33 json_config -- json_config/common.sh@10 -- # shift 00:06:46.927 00:01:33 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:46.927 00:01:33 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:46.927 00:01:33 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:46.927 00:01:33 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:46.927 00:01:33 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:46.927 00:01:33 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=3452371 00:06:46.927 00:01:33 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:46.927 Waiting for target to run... 00:06:46.927 00:01:33 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:46.927 00:01:33 json_config -- json_config/common.sh@25 -- # waitforlisten 3452371 /var/tmp/spdk_tgt.sock 00:06:46.927 00:01:33 json_config -- common/autotest_common.sh@829 -- # '[' -z 3452371 ']' 00:06:46.927 00:01:33 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:46.927 00:01:33 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:46.927 00:01:33 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 
00:06:46.927 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:46.927 00:01:33 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:46.927 00:01:33 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:46.927 [2024-07-16 00:01:33.860019] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:06:46.927 [2024-07-16 00:01:33.860099] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3452371 ] 00:06:47.863 [2024-07-16 00:01:34.515296] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.863 [2024-07-16 00:01:34.615109] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.863 [2024-07-16 00:01:34.669337] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:06:47.863 [2024-07-16 00:01:34.677377] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:47.863 [2024-07-16 00:01:34.685394] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:47.863 [2024-07-16 00:01:34.766612] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:50.394 [2024-07-16 00:01:36.967715] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:50.394 [2024-07-16 00:01:36.967782] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:06:50.394 [2024-07-16 00:01:36.967797] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:50.394 [2024-07-16 00:01:36.975734] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: 
Nvme0n1 00:06:50.394 [2024-07-16 00:01:36.975763] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:50.394 [2024-07-16 00:01:36.983746] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:50.394 [2024-07-16 00:01:36.983770] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:50.394 [2024-07-16 00:01:36.991783] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:06:50.394 [2024-07-16 00:01:36.991809] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:06:50.394 [2024-07-16 00:01:36.991822] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:50.652 [2024-07-16 00:01:37.367917] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:50.652 [2024-07-16 00:01:37.367970] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:50.652 [2024-07-16 00:01:37.367989] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24beb40 00:06:50.652 [2024-07-16 00:01:37.368001] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:50.652 [2024-07-16 00:01:37.368296] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:50.652 [2024-07-16 00:01:37.368318] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:50.652 00:01:37 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:50.652 00:01:37 json_config -- common/autotest_common.sh@862 -- # return 0 00:06:50.652 00:01:37 json_config -- json_config/common.sh@26 -- # echo '' 00:06:50.652 00:06:50.652 00:01:37 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:06:50.652 00:01:37 json_config -- json_config/json_config.sh@377 -- # echo 'INFO: Checking if target 
configuration is the same...' 00:06:50.652 INFO: Checking if target configuration is the same... 00:06:50.652 00:01:37 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:50.652 00:01:37 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:06:50.652 00:01:37 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:50.652 + '[' 2 -ne 2 ']' 00:06:50.652 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:50.652 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:06:50.652 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:50.652 +++ basename /dev/fd/62 00:06:50.652 ++ mktemp /tmp/62.XXX 00:06:50.652 + tmp_file_1=/tmp/62.yhg 00:06:50.652 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:50.652 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:50.652 + tmp_file_2=/tmp/spdk_tgt_config.json.B5U 00:06:50.652 + ret=0 00:06:50.652 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:51.218 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:51.218 + diff -u /tmp/62.yhg /tmp/spdk_tgt_config.json.B5U 00:06:51.218 + echo 'INFO: JSON config files are the same' 00:06:51.218 INFO: JSON config files are the same 00:06:51.218 + rm /tmp/62.yhg /tmp/spdk_tgt_config.json.B5U 00:06:51.218 + exit 0 00:06:51.218 00:01:37 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:06:51.218 00:01:37 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 
00:06:51.218 INFO: changing configuration and checking if this can be detected... 00:06:51.218 00:01:37 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:51.219 00:01:37 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:51.477 00:01:38 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:51.477 00:01:38 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:06:51.477 00:01:38 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:51.477 + '[' 2 -ne 2 ']' 00:06:51.477 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:51.477 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:06:51.477 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:51.477 +++ basename /dev/fd/62 00:06:51.477 ++ mktemp /tmp/62.XXX 00:06:51.477 + tmp_file_1=/tmp/62.Q4E 00:06:51.477 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:51.477 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:51.477 + tmp_file_2=/tmp/spdk_tgt_config.json.O7n 00:06:51.477 + ret=0 00:06:51.477 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:51.735 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:51.735 + diff -u /tmp/62.Q4E /tmp/spdk_tgt_config.json.O7n 00:06:51.735 + ret=1 00:06:51.735 + echo '=== Start of file: /tmp/62.Q4E ===' 00:06:51.735 + cat /tmp/62.Q4E 00:06:51.735 + echo '=== End of file: /tmp/62.Q4E ===' 00:06:51.735 + echo '' 00:06:51.735 + echo '=== Start of file: /tmp/spdk_tgt_config.json.O7n ===' 00:06:51.735 + cat /tmp/spdk_tgt_config.json.O7n 00:06:51.735 + echo '=== End of file: /tmp/spdk_tgt_config.json.O7n ===' 00:06:51.735 + echo '' 00:06:51.735 + rm /tmp/62.Q4E /tmp/spdk_tgt_config.json.O7n 00:06:51.735 + exit 1 00:06:51.735 00:01:38 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:06:51.735 INFO: configuration change detected. 
00:06:51.735 00:01:38 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:06:51.735 00:01:38 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:06:51.735 00:01:38 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:51.735 00:01:38 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:51.735 00:01:38 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:06:51.735 00:01:38 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:06:51.735 00:01:38 json_config -- json_config/json_config.sh@317 -- # [[ -n 3452371 ]] 00:06:51.735 00:01:38 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:06:51.735 00:01:38 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:06:51.735 00:01:38 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:51.735 00:01:38 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:51.735 00:01:38 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:06:51.735 00:01:38 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:06:51.735 00:01:38 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:06:51.994 00:01:38 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:06:51.994 00:01:38 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:06:52.562 00:01:39 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:06:52.562 00:01:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete 
lvs_test/snapshot0 00:06:53.129 00:01:39 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:06:53.129 00:01:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:06:53.695 00:01:40 json_config -- json_config/json_config.sh@193 -- # uname -s 00:06:53.695 00:01:40 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:06:53.695 00:01:40 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:06:53.695 00:01:40 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:06:53.695 00:01:40 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:06:53.695 00:01:40 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:53.695 00:01:40 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:53.695 00:01:40 json_config -- json_config/json_config.sh@323 -- # killprocess 3452371 00:06:53.695 00:01:40 json_config -- common/autotest_common.sh@948 -- # '[' -z 3452371 ']' 00:06:53.695 00:01:40 json_config -- common/autotest_common.sh@952 -- # kill -0 3452371 00:06:53.695 00:01:40 json_config -- common/autotest_common.sh@953 -- # uname 00:06:53.695 00:01:40 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:53.695 00:01:40 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3452371 00:06:53.695 00:01:40 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:53.695 00:01:40 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:53.695 00:01:40 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3452371' 00:06:53.695 killing process with pid 3452371 00:06:53.695 00:01:40 json_config -- common/autotest_common.sh@967 -- # kill 3452371 00:06:53.695 00:01:40 json_config -- 
common/autotest_common.sh@972 -- # wait 3452371 00:06:56.977 00:01:43 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:56.977 00:01:43 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:06:56.977 00:01:43 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:56.977 00:01:43 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:56.977 00:01:43 json_config -- json_config/json_config.sh@328 -- # return 0 00:06:56.977 00:01:43 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:06:56.977 INFO: Success 00:06:56.977 00:06:56.977 real 0m29.119s 00:06:56.977 user 0m35.699s 00:06:56.977 sys 0m4.313s 00:06:56.977 00:01:43 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:56.977 00:01:43 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:56.977 ************************************ 00:06:56.977 END TEST json_config 00:06:56.977 ************************************ 00:06:56.977 00:01:43 -- common/autotest_common.sh@1142 -- # return 0 00:06:56.977 00:01:43 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:56.977 00:01:43 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:56.977 00:01:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:56.977 00:01:43 -- common/autotest_common.sh@10 -- # set +x 00:06:57.269 ************************************ 00:06:57.269 START TEST json_config_extra_key 00:06:57.269 ************************************ 00:06:57.269 00:01:43 json_config_extra_key -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:57.269 00:01:44 json_config_extra_key -- 
json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:57.269 00:01:44 
json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:57.269 00:01:44 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:57.269 00:01:44 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:57.269 00:01:44 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:57.269 00:01:44 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:57.269 00:01:44 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:57.269 00:01:44 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:06:57.269 00:01:44 json_config_extra_key -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:57.269 00:01:44 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:57.269 00:01:44 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:57.269 00:01:44 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:06:57.269 00:01:44 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:06:57.269 00:01:44 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:57.269 00:01:44 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:06:57.269 00:01:44 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 
1024') 00:06:57.269 00:01:44 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:06:57.269 00:01:44 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:57.269 00:01:44 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:06:57.269 00:01:44 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:57.269 00:01:44 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:06:57.269 INFO: launching applications... 00:06:57.269 00:01:44 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:06:57.269 00:01:44 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:06:57.269 00:01:44 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:06:57.269 00:01:44 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:57.269 00:01:44 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:57.269 00:01:44 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:06:57.269 00:01:44 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:57.269 00:01:44 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:57.269 00:01:44 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=3453886 00:06:57.269 00:01:44 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:57.269 Waiting for target to run... 
00:06:57.269 00:01:44 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 3453886 /var/tmp/spdk_tgt.sock 00:06:57.269 00:01:44 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 3453886 ']' 00:06:57.269 00:01:44 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:06:57.269 00:01:44 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:57.269 00:01:44 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:57.269 00:01:44 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:57.269 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:57.269 00:01:44 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:57.269 00:01:44 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:57.269 [2024-07-16 00:01:44.127954] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:06:57.269 [2024-07-16 00:01:44.128031] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3453886 ] 00:06:57.839 [2024-07-16 00:01:44.665572] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.839 [2024-07-16 00:01:44.773269] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.405 00:01:45 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:58.405 00:01:45 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0 00:06:58.405 00:01:45 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:06:58.405 00:06:58.405 00:01:45 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:06:58.405 INFO: shutting down applications... 00:06:58.405 00:01:45 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:06:58.405 00:01:45 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:06:58.405 00:01:45 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:58.405 00:01:45 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 3453886 ]] 00:06:58.405 00:01:45 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 3453886 00:06:58.405 00:01:45 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:58.405 00:01:45 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:58.405 00:01:45 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 3453886 00:06:58.405 00:01:45 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:06:58.662 00:01:45 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:06:58.662 00:01:45 json_config_extra_key -- 
json_config/common.sh@40 -- # (( i < 30 )) 00:06:58.662 00:01:45 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 3453886 00:06:58.662 00:01:45 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:58.662 00:01:45 json_config_extra_key -- json_config/common.sh@43 -- # break 00:06:58.662 00:01:45 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:58.662 00:01:45 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:58.662 SPDK target shutdown done 00:06:58.662 00:01:45 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:06:58.662 Success 00:06:58.662 00:06:58.662 real 0m1.621s 00:06:58.662 user 0m1.140s 00:06:58.662 sys 0m0.680s 00:06:58.662 00:01:45 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:58.662 00:01:45 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:58.662 ************************************ 00:06:58.662 END TEST json_config_extra_key 00:06:58.662 ************************************ 00:06:58.662 00:01:45 -- common/autotest_common.sh@1142 -- # return 0 00:06:58.662 00:01:45 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:58.662 00:01:45 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:58.662 00:01:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:58.662 00:01:45 -- common/autotest_common.sh@10 -- # set +x 00:06:58.920 ************************************ 00:06:58.920 START TEST alias_rpc 00:06:58.920 ************************************ 00:06:58.920 00:01:45 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:58.920 * Looking for test storage... 
00:06:58.920 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:06:58.920 00:01:45 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:58.920 00:01:45 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=3454115 00:06:58.920 00:01:45 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:58.920 00:01:45 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 3454115 00:06:58.920 00:01:45 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 3454115 ']' 00:06:58.920 00:01:45 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:58.920 00:01:45 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:58.920 00:01:45 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:58.920 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:58.920 00:01:45 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:58.920 00:01:45 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:58.920 [2024-07-16 00:01:45.831610] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:06:58.920 [2024-07-16 00:01:45.831689] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3454115 ] 00:06:59.178 [2024-07-16 00:01:45.963688] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.178 [2024-07-16 00:01:46.065192] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.112 00:01:46 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:00.112 00:01:46 alias_rpc -- common/autotest_common.sh@862 -- # return 0 00:07:00.112 00:01:46 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:07:00.112 00:01:47 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 3454115 00:07:00.112 00:01:47 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 3454115 ']' 00:07:00.112 00:01:47 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 3454115 00:07:00.112 00:01:47 alias_rpc -- common/autotest_common.sh@953 -- # uname 00:07:00.112 00:01:47 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:00.112 00:01:47 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3454115 00:07:00.370 00:01:47 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:00.370 00:01:47 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:00.370 00:01:47 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3454115' 00:07:00.370 killing process with pid 3454115 00:07:00.370 00:01:47 alias_rpc -- common/autotest_common.sh@967 -- # kill 3454115 00:07:00.370 00:01:47 alias_rpc -- common/autotest_common.sh@972 -- # wait 3454115 00:07:00.629 00:07:00.629 real 0m1.815s 00:07:00.629 user 0m1.990s 00:07:00.629 sys 0m0.573s 00:07:00.629 00:01:47 alias_rpc -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:07:00.629 00:01:47 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:00.629 ************************************ 00:07:00.629 END TEST alias_rpc 00:07:00.629 ************************************ 00:07:00.629 00:01:47 -- common/autotest_common.sh@1142 -- # return 0 00:07:00.629 00:01:47 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:07:00.629 00:01:47 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:00.629 00:01:47 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:00.629 00:01:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:00.629 00:01:47 -- common/autotest_common.sh@10 -- # set +x 00:07:00.629 ************************************ 00:07:00.629 START TEST spdkcli_tcp 00:07:00.629 ************************************ 00:07:00.629 00:01:47 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:00.888 * Looking for test storage... 
00:07:00.888 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:07:00.888 00:01:47 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:07:00.888 00:01:47 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:07:00.888 00:01:47 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:07:00.888 00:01:47 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:07:00.888 00:01:47 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:07:00.888 00:01:47 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:07:00.888 00:01:47 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:07:00.888 00:01:47 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:00.888 00:01:47 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:00.888 00:01:47 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=3454357 00:07:00.888 00:01:47 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 3454357 00:07:00.888 00:01:47 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 3454357 ']' 00:07:00.888 00:01:47 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:00.888 00:01:47 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:00.888 00:01:47 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:07:00.888 00:01:47 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:00.888 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:00.888 00:01:47 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:00.888 00:01:47 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:00.888 [2024-07-16 00:01:47.737806] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:07:00.888 [2024-07-16 00:01:47.737881] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3454357 ] 00:07:01.147 [2024-07-16 00:01:47.867532] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:01.147 [2024-07-16 00:01:47.972934] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:01.147 [2024-07-16 00:01:47.972937] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.084 00:01:48 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:02.084 00:01:48 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0 00:07:02.084 00:01:48 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=3454531 00:07:02.084 00:01:48 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:07:02.084 00:01:48 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:07:02.084 [ 00:07:02.084 "bdev_malloc_delete", 00:07:02.084 "bdev_malloc_create", 00:07:02.084 "bdev_null_resize", 00:07:02.084 "bdev_null_delete", 00:07:02.084 "bdev_null_create", 00:07:02.084 "bdev_nvme_cuse_unregister", 00:07:02.084 "bdev_nvme_cuse_register", 00:07:02.084 "bdev_opal_new_user", 00:07:02.084 "bdev_opal_set_lock_state", 00:07:02.084 "bdev_opal_delete", 00:07:02.084 "bdev_opal_get_info", 00:07:02.084 "bdev_opal_create", 00:07:02.084 "bdev_nvme_opal_revert", 00:07:02.084 "bdev_nvme_opal_init", 00:07:02.084 "bdev_nvme_send_cmd", 00:07:02.084 
"bdev_nvme_get_path_iostat", 00:07:02.084 "bdev_nvme_get_mdns_discovery_info", 00:07:02.084 "bdev_nvme_stop_mdns_discovery", 00:07:02.084 "bdev_nvme_start_mdns_discovery", 00:07:02.084 "bdev_nvme_set_multipath_policy", 00:07:02.084 "bdev_nvme_set_preferred_path", 00:07:02.084 "bdev_nvme_get_io_paths", 00:07:02.084 "bdev_nvme_remove_error_injection", 00:07:02.084 "bdev_nvme_add_error_injection", 00:07:02.084 "bdev_nvme_get_discovery_info", 00:07:02.084 "bdev_nvme_stop_discovery", 00:07:02.084 "bdev_nvme_start_discovery", 00:07:02.084 "bdev_nvme_get_controller_health_info", 00:07:02.084 "bdev_nvme_disable_controller", 00:07:02.084 "bdev_nvme_enable_controller", 00:07:02.084 "bdev_nvme_reset_controller", 00:07:02.084 "bdev_nvme_get_transport_statistics", 00:07:02.084 "bdev_nvme_apply_firmware", 00:07:02.084 "bdev_nvme_detach_controller", 00:07:02.084 "bdev_nvme_get_controllers", 00:07:02.084 "bdev_nvme_attach_controller", 00:07:02.084 "bdev_nvme_set_hotplug", 00:07:02.084 "bdev_nvme_set_options", 00:07:02.084 "bdev_passthru_delete", 00:07:02.084 "bdev_passthru_create", 00:07:02.084 "bdev_lvol_set_parent_bdev", 00:07:02.084 "bdev_lvol_set_parent", 00:07:02.084 "bdev_lvol_check_shallow_copy", 00:07:02.084 "bdev_lvol_start_shallow_copy", 00:07:02.084 "bdev_lvol_grow_lvstore", 00:07:02.084 "bdev_lvol_get_lvols", 00:07:02.084 "bdev_lvol_get_lvstores", 00:07:02.084 "bdev_lvol_delete", 00:07:02.084 "bdev_lvol_set_read_only", 00:07:02.084 "bdev_lvol_resize", 00:07:02.084 "bdev_lvol_decouple_parent", 00:07:02.084 "bdev_lvol_inflate", 00:07:02.084 "bdev_lvol_rename", 00:07:02.084 "bdev_lvol_clone_bdev", 00:07:02.084 "bdev_lvol_clone", 00:07:02.084 "bdev_lvol_snapshot", 00:07:02.084 "bdev_lvol_create", 00:07:02.084 "bdev_lvol_delete_lvstore", 00:07:02.084 "bdev_lvol_rename_lvstore", 00:07:02.084 "bdev_lvol_create_lvstore", 00:07:02.084 "bdev_raid_set_options", 00:07:02.084 "bdev_raid_remove_base_bdev", 00:07:02.084 "bdev_raid_add_base_bdev", 00:07:02.084 "bdev_raid_delete", 
00:07:02.084 "bdev_raid_create", 00:07:02.084 "bdev_raid_get_bdevs", 00:07:02.084 "bdev_error_inject_error", 00:07:02.084 "bdev_error_delete", 00:07:02.084 "bdev_error_create", 00:07:02.084 "bdev_split_delete", 00:07:02.084 "bdev_split_create", 00:07:02.084 "bdev_delay_delete", 00:07:02.084 "bdev_delay_create", 00:07:02.084 "bdev_delay_update_latency", 00:07:02.084 "bdev_zone_block_delete", 00:07:02.084 "bdev_zone_block_create", 00:07:02.084 "blobfs_create", 00:07:02.084 "blobfs_detect", 00:07:02.084 "blobfs_set_cache_size", 00:07:02.084 "bdev_crypto_delete", 00:07:02.084 "bdev_crypto_create", 00:07:02.084 "bdev_compress_delete", 00:07:02.084 "bdev_compress_create", 00:07:02.084 "bdev_compress_get_orphans", 00:07:02.084 "bdev_aio_delete", 00:07:02.084 "bdev_aio_rescan", 00:07:02.084 "bdev_aio_create", 00:07:02.084 "bdev_ftl_set_property", 00:07:02.084 "bdev_ftl_get_properties", 00:07:02.084 "bdev_ftl_get_stats", 00:07:02.084 "bdev_ftl_unmap", 00:07:02.084 "bdev_ftl_unload", 00:07:02.084 "bdev_ftl_delete", 00:07:02.084 "bdev_ftl_load", 00:07:02.084 "bdev_ftl_create", 00:07:02.084 "bdev_virtio_attach_controller", 00:07:02.084 "bdev_virtio_scsi_get_devices", 00:07:02.084 "bdev_virtio_detach_controller", 00:07:02.084 "bdev_virtio_blk_set_hotplug", 00:07:02.084 "bdev_iscsi_delete", 00:07:02.084 "bdev_iscsi_create", 00:07:02.084 "bdev_iscsi_set_options", 00:07:02.084 "accel_error_inject_error", 00:07:02.084 "ioat_scan_accel_module", 00:07:02.084 "dsa_scan_accel_module", 00:07:02.084 "iaa_scan_accel_module", 00:07:02.084 "dpdk_cryptodev_get_driver", 00:07:02.084 "dpdk_cryptodev_set_driver", 00:07:02.084 "dpdk_cryptodev_scan_accel_module", 00:07:02.084 "compressdev_scan_accel_module", 00:07:02.084 "keyring_file_remove_key", 00:07:02.084 "keyring_file_add_key", 00:07:02.084 "keyring_linux_set_options", 00:07:02.084 "iscsi_get_histogram", 00:07:02.084 "iscsi_enable_histogram", 00:07:02.084 "iscsi_set_options", 00:07:02.084 "iscsi_get_auth_groups", 00:07:02.084 
"iscsi_auth_group_remove_secret", 00:07:02.084 "iscsi_auth_group_add_secret", 00:07:02.084 "iscsi_delete_auth_group", 00:07:02.084 "iscsi_create_auth_group", 00:07:02.084 "iscsi_set_discovery_auth", 00:07:02.084 "iscsi_get_options", 00:07:02.084 "iscsi_target_node_request_logout", 00:07:02.084 "iscsi_target_node_set_redirect", 00:07:02.084 "iscsi_target_node_set_auth", 00:07:02.084 "iscsi_target_node_add_lun", 00:07:02.084 "iscsi_get_stats", 00:07:02.084 "iscsi_get_connections", 00:07:02.084 "iscsi_portal_group_set_auth", 00:07:02.084 "iscsi_start_portal_group", 00:07:02.084 "iscsi_delete_portal_group", 00:07:02.084 "iscsi_create_portal_group", 00:07:02.084 "iscsi_get_portal_groups", 00:07:02.084 "iscsi_delete_target_node", 00:07:02.084 "iscsi_target_node_remove_pg_ig_maps", 00:07:02.084 "iscsi_target_node_add_pg_ig_maps", 00:07:02.084 "iscsi_create_target_node", 00:07:02.084 "iscsi_get_target_nodes", 00:07:02.084 "iscsi_delete_initiator_group", 00:07:02.084 "iscsi_initiator_group_remove_initiators", 00:07:02.084 "iscsi_initiator_group_add_initiators", 00:07:02.084 "iscsi_create_initiator_group", 00:07:02.084 "iscsi_get_initiator_groups", 00:07:02.084 "nvmf_set_crdt", 00:07:02.084 "nvmf_set_config", 00:07:02.084 "nvmf_set_max_subsystems", 00:07:02.084 "nvmf_stop_mdns_prr", 00:07:02.084 "nvmf_publish_mdns_prr", 00:07:02.084 "nvmf_subsystem_get_listeners", 00:07:02.084 "nvmf_subsystem_get_qpairs", 00:07:02.084 "nvmf_subsystem_get_controllers", 00:07:02.084 "nvmf_get_stats", 00:07:02.084 "nvmf_get_transports", 00:07:02.084 "nvmf_create_transport", 00:07:02.084 "nvmf_get_targets", 00:07:02.084 "nvmf_delete_target", 00:07:02.084 "nvmf_create_target", 00:07:02.084 "nvmf_subsystem_allow_any_host", 00:07:02.084 "nvmf_subsystem_remove_host", 00:07:02.084 "nvmf_subsystem_add_host", 00:07:02.084 "nvmf_ns_remove_host", 00:07:02.084 "nvmf_ns_add_host", 00:07:02.084 "nvmf_subsystem_remove_ns", 00:07:02.084 "nvmf_subsystem_add_ns", 00:07:02.084 
"nvmf_subsystem_listener_set_ana_state", 00:07:02.084 "nvmf_discovery_get_referrals", 00:07:02.084 "nvmf_discovery_remove_referral", 00:07:02.084 "nvmf_discovery_add_referral", 00:07:02.084 "nvmf_subsystem_remove_listener", 00:07:02.084 "nvmf_subsystem_add_listener", 00:07:02.084 "nvmf_delete_subsystem", 00:07:02.084 "nvmf_create_subsystem", 00:07:02.084 "nvmf_get_subsystems", 00:07:02.084 "env_dpdk_get_mem_stats", 00:07:02.084 "nbd_get_disks", 00:07:02.084 "nbd_stop_disk", 00:07:02.084 "nbd_start_disk", 00:07:02.084 "ublk_recover_disk", 00:07:02.084 "ublk_get_disks", 00:07:02.084 "ublk_stop_disk", 00:07:02.084 "ublk_start_disk", 00:07:02.084 "ublk_destroy_target", 00:07:02.084 "ublk_create_target", 00:07:02.084 "virtio_blk_create_transport", 00:07:02.084 "virtio_blk_get_transports", 00:07:02.084 "vhost_controller_set_coalescing", 00:07:02.084 "vhost_get_controllers", 00:07:02.084 "vhost_delete_controller", 00:07:02.084 "vhost_create_blk_controller", 00:07:02.084 "vhost_scsi_controller_remove_target", 00:07:02.084 "vhost_scsi_controller_add_target", 00:07:02.084 "vhost_start_scsi_controller", 00:07:02.084 "vhost_create_scsi_controller", 00:07:02.084 "thread_set_cpumask", 00:07:02.084 "framework_get_governor", 00:07:02.085 "framework_get_scheduler", 00:07:02.085 "framework_set_scheduler", 00:07:02.085 "framework_get_reactors", 00:07:02.085 "thread_get_io_channels", 00:07:02.085 "thread_get_pollers", 00:07:02.085 "thread_get_stats", 00:07:02.085 "framework_monitor_context_switch", 00:07:02.085 "spdk_kill_instance", 00:07:02.085 "log_enable_timestamps", 00:07:02.085 "log_get_flags", 00:07:02.085 "log_clear_flag", 00:07:02.085 "log_set_flag", 00:07:02.085 "log_get_level", 00:07:02.085 "log_set_level", 00:07:02.085 "log_get_print_level", 00:07:02.085 "log_set_print_level", 00:07:02.085 "framework_enable_cpumask_locks", 00:07:02.085 "framework_disable_cpumask_locks", 00:07:02.085 "framework_wait_init", 00:07:02.085 "framework_start_init", 00:07:02.085 "scsi_get_devices", 
00:07:02.085 "bdev_get_histogram", 00:07:02.085 "bdev_enable_histogram", 00:07:02.085 "bdev_set_qos_limit", 00:07:02.085 "bdev_set_qd_sampling_period", 00:07:02.085 "bdev_get_bdevs", 00:07:02.085 "bdev_reset_iostat", 00:07:02.085 "bdev_get_iostat", 00:07:02.085 "bdev_examine", 00:07:02.085 "bdev_wait_for_examine", 00:07:02.085 "bdev_set_options", 00:07:02.085 "notify_get_notifications", 00:07:02.085 "notify_get_types", 00:07:02.085 "accel_get_stats", 00:07:02.085 "accel_set_options", 00:07:02.085 "accel_set_driver", 00:07:02.085 "accel_crypto_key_destroy", 00:07:02.085 "accel_crypto_keys_get", 00:07:02.085 "accel_crypto_key_create", 00:07:02.085 "accel_assign_opc", 00:07:02.085 "accel_get_module_info", 00:07:02.085 "accel_get_opc_assignments", 00:07:02.085 "vmd_rescan", 00:07:02.085 "vmd_remove_device", 00:07:02.085 "vmd_enable", 00:07:02.085 "sock_get_default_impl", 00:07:02.085 "sock_set_default_impl", 00:07:02.085 "sock_impl_set_options", 00:07:02.085 "sock_impl_get_options", 00:07:02.085 "iobuf_get_stats", 00:07:02.085 "iobuf_set_options", 00:07:02.085 "framework_get_pci_devices", 00:07:02.085 "framework_get_config", 00:07:02.085 "framework_get_subsystems", 00:07:02.085 "trace_get_info", 00:07:02.085 "trace_get_tpoint_group_mask", 00:07:02.085 "trace_disable_tpoint_group", 00:07:02.085 "trace_enable_tpoint_group", 00:07:02.085 "trace_clear_tpoint_mask", 00:07:02.085 "trace_set_tpoint_mask", 00:07:02.085 "keyring_get_keys", 00:07:02.085 "spdk_get_version", 00:07:02.085 "rpc_get_methods" 00:07:02.085 ] 00:07:02.085 00:01:48 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:07:02.085 00:01:48 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:02.085 00:01:48 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:02.085 00:01:48 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:07:02.085 00:01:48 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 3454357 00:07:02.085 00:01:48 spdkcli_tcp -- 
common/autotest_common.sh@948 -- # '[' -z 3454357 ']' 00:07:02.085 00:01:48 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 3454357 00:07:02.085 00:01:48 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname 00:07:02.085 00:01:48 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:02.085 00:01:48 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3454357 00:07:02.085 00:01:49 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:02.085 00:01:49 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:02.085 00:01:49 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3454357' 00:07:02.085 killing process with pid 3454357 00:07:02.085 00:01:49 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 3454357 00:07:02.085 00:01:49 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 3454357 00:07:02.653 00:07:02.653 real 0m1.854s 00:07:02.653 user 0m3.341s 00:07:02.653 sys 0m0.641s 00:07:02.653 00:01:49 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:02.653 00:01:49 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:02.653 ************************************ 00:07:02.653 END TEST spdkcli_tcp 00:07:02.653 ************************************ 00:07:02.653 00:01:49 -- common/autotest_common.sh@1142 -- # return 0 00:07:02.653 00:01:49 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:02.653 00:01:49 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:02.653 00:01:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:02.653 00:01:49 -- common/autotest_common.sh@10 -- # set +x 00:07:02.653 ************************************ 00:07:02.653 START TEST dpdk_mem_utility 00:07:02.653 ************************************ 00:07:02.653 00:01:49 dpdk_mem_utility -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:02.653 * Looking for test storage... 00:07:02.653 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:07:02.653 00:01:49 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:02.653 00:01:49 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=3454766 00:07:02.653 00:01:49 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 3454766 00:07:02.653 00:01:49 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:02.653 00:01:49 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 3454766 ']' 00:07:02.653 00:01:49 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:02.653 00:01:49 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:02.653 00:01:49 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:02.653 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:02.653 00:01:49 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:02.653 00:01:49 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:02.912 [2024-07-16 00:01:49.665469] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:07:02.912 [2024-07-16 00:01:49.665541] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3454766 ] 00:07:02.912 [2024-07-16 00:01:49.795966] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.170 [2024-07-16 00:01:49.898838] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.736 00:01:50 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:03.736 00:01:50 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0 00:07:03.736 00:01:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:07:03.736 00:01:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:07:03.736 00:01:50 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:03.736 00:01:50 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:03.736 { 00:07:03.736 "filename": "/tmp/spdk_mem_dump.txt" 00:07:03.736 } 00:07:03.737 00:01:50 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:03.737 00:01:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:03.998 DPDK memory size 816.000000 MiB in 2 heap(s) 00:07:03.998 2 heaps totaling size 816.000000 MiB 00:07:03.998 size: 814.000000 MiB heap id: 0 00:07:03.998 size: 2.000000 MiB heap id: 1 00:07:03.998 end heaps---------- 00:07:03.998 8 mempools totaling size 598.116089 MiB 00:07:03.998 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:07:03.998 size: 158.602051 MiB name: PDU_data_out_Pool 00:07:03.998 size: 84.521057 MiB name: bdev_io_3454766 00:07:03.998 size: 51.011292 MiB name: evtpool_3454766 00:07:03.998 size: 
50.003479 MiB name: msgpool_3454766 00:07:03.998 size: 21.763794 MiB name: PDU_Pool 00:07:03.998 size: 19.513306 MiB name: SCSI_TASK_Pool 00:07:03.998 size: 0.026123 MiB name: Session_Pool 00:07:03.998 end mempools------- 00:07:03.998 201 memzones totaling size 4.176453 MiB 00:07:03.998 size: 1.000366 MiB name: RG_ring_0_3454766 00:07:03.998 size: 1.000366 MiB name: RG_ring_1_3454766 00:07:03.998 size: 1.000366 MiB name: RG_ring_4_3454766 00:07:03.998 size: 1.000366 MiB name: RG_ring_5_3454766 00:07:03.998 size: 0.125366 MiB name: RG_ring_2_3454766 00:07:03.999 size: 0.015991 MiB name: RG_ring_3_3454766 00:07:03.999 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:07:03.999 size: 0.000305 MiB name: 0000:3d:01.0_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3d:01.1_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3d:01.2_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3d:01.3_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3d:01.4_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3d:01.5_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3d:01.6_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3d:01.7_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3d:02.0_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3d:02.1_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3d:02.2_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3d:02.3_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3d:02.4_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3d:02.5_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3d:02.6_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3d:02.7_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3f:01.0_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3f:01.1_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3f:01.2_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3f:01.3_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3f:01.4_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3f:01.5_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3f:01.6_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3f:01.7_qat 
00:07:03.999 size: 0.000305 MiB name: 0000:3f:02.0_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3f:02.1_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3f:02.2_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3f:02.3_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3f:02.4_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3f:02.5_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3f:02.6_qat 00:07:03.999 size: 0.000305 MiB name: 0000:3f:02.7_qat 00:07:03.999 size: 0.000305 MiB name: 0000:da:01.0_qat 00:07:03.999 size: 0.000305 MiB name: 0000:da:01.1_qat 00:07:03.999 size: 0.000305 MiB name: 0000:da:01.2_qat 00:07:03.999 size: 0.000305 MiB name: 0000:da:01.3_qat 00:07:03.999 size: 0.000305 MiB name: 0000:da:01.4_qat 00:07:03.999 size: 0.000305 MiB name: 0000:da:01.5_qat 00:07:03.999 size: 0.000305 MiB name: 0000:da:01.6_qat 00:07:03.999 size: 0.000305 MiB name: 0000:da:01.7_qat 00:07:03.999 size: 0.000305 MiB name: 0000:da:02.0_qat 00:07:03.999 size: 0.000305 MiB name: 0000:da:02.1_qat 00:07:03.999 size: 0.000305 MiB name: 0000:da:02.2_qat 00:07:03.999 size: 0.000305 MiB name: 0000:da:02.3_qat 00:07:03.999 size: 0.000305 MiB name: 0000:da:02.4_qat 00:07:03.999 size: 0.000305 MiB name: 0000:da:02.5_qat 00:07:03.999 size: 0.000305 MiB name: 0000:da:02.6_qat 00:07:03.999 size: 0.000305 MiB name: 0000:da:02.7_qat 00:07:03.999 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_0 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_1 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_0 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_2 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_3 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_1 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_4 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_5 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_2 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_6 00:07:03.999 size: 
0.000122 MiB name: rte_cryptodev_data_7 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_3 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_8 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_9 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_4 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_10 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_11 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_5 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_12 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_13 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_6 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_14 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_15 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_7 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_16 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_17 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_8 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_18 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_19 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_9 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_20 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_21 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_10 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_22 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_23 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_11 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_24 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_25 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_12 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_26 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_27 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_13 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_28 00:07:03.999 size: 
0.000122 MiB name: rte_cryptodev_data_29 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_14 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_30 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_31 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_15 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_32 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_33 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_16 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_34 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_35 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_17 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_36 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_37 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_18 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_38 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_39 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_19 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_40 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_41 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_20 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_42 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_43 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_21 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_44 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_45 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_22 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_46 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_47 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_23 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_48 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_49 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_24 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_50 00:07:03.999 
size: 0.000122 MiB name: rte_cryptodev_data_51 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_25 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_52 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_53 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_26 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_54 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_55 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_27 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_56 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_57 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_28 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_58 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_59 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_29 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_60 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_61 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_30 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_62 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_63 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_31 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_64 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_65 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_32 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_66 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_67 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_33 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_68 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_69 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_34 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_70 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_71 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_35 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_72 
00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_73 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_36 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_74 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_75 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_37 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_76 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_77 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_38 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_78 00:07:03.999 size: 0.000122 MiB name: rte_cryptodev_data_79 00:07:03.999 size: 0.000122 MiB name: rte_compressdev_data_39 00:07:04.000 size: 0.000122 MiB name: rte_cryptodev_data_80 00:07:04.000 size: 0.000122 MiB name: rte_cryptodev_data_81 00:07:04.000 size: 0.000122 MiB name: rte_compressdev_data_40 00:07:04.000 size: 0.000122 MiB name: rte_cryptodev_data_82 00:07:04.000 size: 0.000122 MiB name: rte_cryptodev_data_83 00:07:04.000 size: 0.000122 MiB name: rte_compressdev_data_41 00:07:04.000 size: 0.000122 MiB name: rte_cryptodev_data_84 00:07:04.000 size: 0.000122 MiB name: rte_cryptodev_data_85 00:07:04.000 size: 0.000122 MiB name: rte_compressdev_data_42 00:07:04.000 size: 0.000122 MiB name: rte_cryptodev_data_86 00:07:04.000 size: 0.000122 MiB name: rte_cryptodev_data_87 00:07:04.000 size: 0.000122 MiB name: rte_compressdev_data_43 00:07:04.000 size: 0.000122 MiB name: rte_cryptodev_data_88 00:07:04.000 size: 0.000122 MiB name: rte_cryptodev_data_89 00:07:04.000 size: 0.000122 MiB name: rte_compressdev_data_44 00:07:04.000 size: 0.000122 MiB name: rte_cryptodev_data_90 00:07:04.000 size: 0.000122 MiB name: rte_cryptodev_data_91 00:07:04.000 size: 0.000122 MiB name: rte_compressdev_data_45 00:07:04.000 size: 0.000122 MiB name: rte_cryptodev_data_92 00:07:04.000 size: 0.000122 MiB name: rte_cryptodev_data_93 00:07:04.000 size: 0.000122 MiB name: rte_compressdev_data_46 00:07:04.000 size: 0.000122 MiB name: 
rte_cryptodev_data_94 00:07:04.000 size: 0.000122 MiB name: rte_cryptodev_data_95 00:07:04.000 size: 0.000122 MiB name: rte_compressdev_data_47 00:07:04.000 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:07:04.000 end memzones------- 00:07:04.000 00:01:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:07:04.000 heap id: 0 total size: 814.000000 MiB number of busy elements: 524 number of free elements: 14 00:07:04.000 list of free elements. size: 11.813904 MiB 00:07:04.000 element at address: 0x200000400000 with size: 1.999512 MiB 00:07:04.000 element at address: 0x200018e00000 with size: 0.999878 MiB 00:07:04.000 element at address: 0x200019000000 with size: 0.999878 MiB 00:07:04.000 element at address: 0x200003e00000 with size: 0.996460 MiB 00:07:04.000 element at address: 0x200031c00000 with size: 0.994446 MiB 00:07:04.000 element at address: 0x200013800000 with size: 0.978882 MiB 00:07:04.000 element at address: 0x200007000000 with size: 0.959839 MiB 00:07:04.000 element at address: 0x200019200000 with size: 0.937256 MiB 00:07:04.000 element at address: 0x20001aa00000 with size: 0.583252 MiB 00:07:04.000 element at address: 0x200003a00000 with size: 0.498535 MiB 00:07:04.000 element at address: 0x20000b200000 with size: 0.491272 MiB 00:07:04.000 element at address: 0x200000800000 with size: 0.486328 MiB 00:07:04.000 element at address: 0x200019400000 with size: 0.485840 MiB 00:07:04.000 element at address: 0x200027e00000 with size: 0.402527 MiB 00:07:04.000 list of standard malloc elements. 
size: 199.877808 MiB 00:07:04.000 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:07:04.000 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:07:04.000 element at address: 0x200018efff80 with size: 1.000122 MiB 00:07:04.000 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:07:04.000 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:07:04.000 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:07:04.000 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:07:04.000 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:07:04.000 element at address: 0x200000330b40 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003340c0 with size: 0.004395 MiB 00:07:04.000 element at address: 0x200000337640 with size: 0.004395 MiB 00:07:04.000 element at address: 0x20000033abc0 with size: 0.004395 MiB 00:07:04.000 element at address: 0x20000033e140 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003416c0 with size: 0.004395 MiB 00:07:04.000 element at address: 0x200000344c40 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003481c0 with size: 0.004395 MiB 00:07:04.000 element at address: 0x20000034b740 with size: 0.004395 MiB 00:07:04.000 element at address: 0x20000034ecc0 with size: 0.004395 MiB 00:07:04.000 element at address: 0x200000352240 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003557c0 with size: 0.004395 MiB 00:07:04.000 element at address: 0x200000358d40 with size: 0.004395 MiB 00:07:04.000 element at address: 0x20000035c2c0 with size: 0.004395 MiB 00:07:04.000 element at address: 0x20000035f840 with size: 0.004395 MiB 00:07:04.000 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:07:04.000 element at address: 0x200000366880 with size: 0.004395 MiB 00:07:04.000 element at address: 0x20000036a340 with size: 0.004395 MiB 00:07:04.000 element at address: 0x20000036de00 with size: 0.004395 MiB 00:07:04.000 element at 
address: 0x2000003718c0 with size: 0.004395 MiB 00:07:04.000 element at address: 0x200000375380 with size: 0.004395 MiB 00:07:04.000 element at address: 0x200000378e40 with size: 0.004395 MiB 00:07:04.000 element at address: 0x20000037c900 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:07:04.000 element at address: 0x200000383e80 with size: 0.004395 MiB 00:07:04.000 element at address: 0x200000387940 with size: 0.004395 MiB 00:07:04.000 element at address: 0x20000038b400 with size: 0.004395 MiB 00:07:04.000 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:07:04.000 element at address: 0x200000392980 with size: 0.004395 MiB 00:07:04.000 element at address: 0x200000396440 with size: 0.004395 MiB 00:07:04.000 element at address: 0x200000399f00 with size: 0.004395 MiB 00:07:04.000 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003bea80 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:07:04.000 element at address: 0x2000003d4b00 with size: 0.004395 MiB 
00:07:04.000 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:07:04.000 element at address: 0x20000032ea40 with size: 0.004028 MiB 00:07:04.000 element at address: 0x20000032fac0 with size: 0.004028 MiB 00:07:04.000 element at address: 0x200000331fc0 with size: 0.004028 MiB 00:07:04.000 element at address: 0x200000333040 with size: 0.004028 MiB 00:07:04.000 element at address: 0x200000335540 with size: 0.004028 MiB 00:07:04.000 element at address: 0x2000003365c0 with size: 0.004028 MiB 00:07:04.000 element at address: 0x200000338ac0 with size: 0.004028 MiB 00:07:04.000 element at address: 0x200000339b40 with size: 0.004028 MiB 00:07:04.000 element at address: 0x20000033c040 with size: 0.004028 MiB 00:07:04.000 element at address: 0x20000033d0c0 with size: 0.004028 MiB 00:07:04.000 element at address: 0x20000033f5c0 with size: 0.004028 MiB 00:07:04.000 element at address: 0x200000340640 with size: 0.004028 MiB 00:07:04.000 element at address: 0x200000342b40 with size: 0.004028 MiB 00:07:04.000 element at address: 0x200000343bc0 with size: 0.004028 MiB 00:07:04.000 element at address: 0x2000003460c0 with size: 0.004028 MiB 00:07:04.000 element at address: 0x200000347140 with size: 0.004028 MiB 00:07:04.000 element at address: 0x200000349640 with size: 0.004028 MiB 00:07:04.000 element at address: 0x20000034a6c0 with size: 0.004028 MiB 00:07:04.000 element at address: 0x20000034cbc0 with size: 0.004028 MiB 00:07:04.000 element at address: 0x20000034dc40 with size: 0.004028 MiB 00:07:04.000 element at address: 0x200000350140 with size: 0.004028 MiB 00:07:04.000 element at address: 0x2000003511c0 with size: 0.004028 MiB 00:07:04.000 element at address: 0x2000003536c0 with size: 0.004028 MiB 00:07:04.000 element at address: 0x200000354740 with size: 0.004028 MiB 00:07:04.000 element at address: 0x200000356c40 with size: 0.004028 MiB 00:07:04.000 element at address: 0x200000357cc0 with size: 0.004028 MiB 00:07:04.000 element at address: 0x20000035a1c0 with 
size: 0.004028 MiB 00:07:04.000 element at address: 0x20000035b240 with size: 0.004028 MiB 00:07:04.000 element at address: 0x20000035d740 with size: 0.004028 MiB 00:07:04.000 element at address: 0x20000035e7c0 with size: 0.004028 MiB 00:07:04.000 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:07:04.000 element at address: 0x200000361d40 with size: 0.004028 MiB 00:07:04.000 element at address: 0x200000364780 with size: 0.004028 MiB 00:07:04.000 element at address: 0x200000365800 with size: 0.004028 MiB 00:07:04.000 element at address: 0x200000368240 with size: 0.004028 MiB 00:07:04.000 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:07:04.000 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:07:04.001 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:07:04.001 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:07:04.001 element at address: 0x200000370840 with size: 0.004028 MiB 00:07:04.001 element at address: 0x200000373280 with size: 0.004028 MiB 00:07:04.001 element at address: 0x200000374300 with size: 0.004028 MiB 00:07:04.001 element at address: 0x200000376d40 with size: 0.004028 MiB 00:07:04.001 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:07:04.001 element at address: 0x20000037a800 with size: 0.004028 MiB 00:07:04.001 element at address: 0x20000037b880 with size: 0.004028 MiB 00:07:04.001 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:07:04.001 element at address: 0x20000037f340 with size: 0.004028 MiB 00:07:04.001 element at address: 0x200000381d80 with size: 0.004028 MiB 00:07:04.001 element at address: 0x200000382e00 with size: 0.004028 MiB 00:07:04.001 element at address: 0x200000385840 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:07:04.001 element at address: 0x200000389300 with size: 0.004028 MiB 00:07:04.001 element at address: 0x20000038a380 with size: 0.004028 MiB 00:07:04.001 element at address: 
0x20000038cdc0 with size: 0.004028 MiB 00:07:04.001 element at address: 0x20000038de40 with size: 0.004028 MiB 00:07:04.001 element at address: 0x200000390880 with size: 0.004028 MiB 00:07:04.001 element at address: 0x200000391900 with size: 0.004028 MiB 00:07:04.001 element at address: 0x200000394340 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:07:04.001 element at address: 0x200000397e00 with size: 0.004028 MiB 00:07:04.001 element at address: 0x200000398e80 with size: 0.004028 MiB 00:07:04.001 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:07:04.001 element at address: 0x20000039c940 with size: 0.004028 MiB 00:07:04.001 element at address: 0x20000039f380 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:07:04.001 
element at address: 0x2000003c0440 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003cffc0 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:07:04.001 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:07:04.001 element at address: 0x200000204c80 with size: 0.000305 MiB 00:07:04.001 element at address: 0x200000200000 with size: 0.000183 MiB 00:07:04.001 element at address: 0x2000002000c0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000200180 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000200240 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000200300 with size: 0.000183 MiB 00:07:04.001 element at address: 0x2000002003c0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000200480 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000200540 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000200600 with size: 0.000183 MiB 00:07:04.001 element at address: 0x2000002006c0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000200780 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000200840 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000200900 with size: 0.000183 
MiB 00:07:04.001 element at address: 0x2000002009c0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000200a80 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000200b40 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000200c00 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000200cc0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000200d80 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000200e40 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000200f00 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000200fc0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000201080 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000201140 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000201200 with size: 0.000183 MiB 00:07:04.001 element at address: 0x2000002012c0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000201380 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000201440 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000201500 with size: 0.000183 MiB 00:07:04.001 element at address: 0x2000002015c0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000201680 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000201740 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000201800 with size: 0.000183 MiB 00:07:04.001 element at address: 0x2000002018c0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000201980 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000201a40 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000201b00 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000201bc0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000201c80 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000201d40 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000201e00 
with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000201ec0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000201f80 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000202040 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000202100 with size: 0.000183 MiB 00:07:04.001 element at address: 0x2000002021c0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000202280 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000202340 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000202400 with size: 0.000183 MiB 00:07:04.001 element at address: 0x2000002024c0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000202580 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000202640 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000202700 with size: 0.000183 MiB 00:07:04.001 element at address: 0x2000002027c0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000202880 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000202940 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000202a00 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000202ac0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000202b80 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000202c40 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000202d00 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000202dc0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000202e80 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000202f40 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000203000 with size: 0.000183 MiB 00:07:04.001 element at address: 0x2000002030c0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000203180 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000203240 with size: 0.000183 MiB 00:07:04.001 element at 
address: 0x200000203300 with size: 0.000183 MiB 00:07:04.001 element at address: 0x2000002033c0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000203480 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000203540 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000203600 with size: 0.000183 MiB 00:07:04.001 element at address: 0x2000002036c0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000203780 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000203840 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000203900 with size: 0.000183 MiB 00:07:04.001 element at address: 0x2000002039c0 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000203a80 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000203b40 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000203c00 with size: 0.000183 MiB 00:07:04.001 element at address: 0x200000203cc0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000203d80 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000203e40 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000203f00 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000203fc0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000204080 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000204140 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000204200 with size: 0.000183 MiB 00:07:04.002 element at address: 0x2000002042c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000204380 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000204440 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000204500 with size: 0.000183 MiB 00:07:04.002 element at address: 0x2000002045c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000204680 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000204740 with size: 0.000183 MiB 
00:07:04.002 element at address: 0x200000204800 with size: 0.000183 MiB 00:07:04.002 element at address: 0x2000002048c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000204980 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000204a40 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000204b00 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000204bc0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000204dc0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000204e80 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000204f40 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000205000 with size: 0.000183 MiB 00:07:04.002 element at address: 0x2000002050c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000205180 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000205240 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000205300 with size: 0.000183 MiB 00:07:04.002 element at address: 0x2000002053c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000205480 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000205540 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000205600 with size: 0.000183 MiB 00:07:04.002 element at address: 0x2000002056c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000205780 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000205840 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000205900 with size: 0.000183 MiB 00:07:04.002 element at address: 0x2000002059c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000205a80 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000205b40 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000205c00 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000205cc0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000205d80 with 
size: 0.000183 MiB 00:07:04.002 element at address: 0x200000205e40 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000205f00 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000205fc0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000206080 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000206140 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000206200 with size: 0.000183 MiB 00:07:04.002 element at address: 0x2000002062c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x2000002064c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000020a780 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022aa40 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022ab00 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022abc0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022ac80 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022ad40 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022ae00 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022aec0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022af80 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022b040 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022b100 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022b1c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022b280 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022b340 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022b400 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022b4c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022b580 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022b640 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022b700 with size: 0.000183 MiB 00:07:04.002 element at address: 
0x20000022b7c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022b9c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022ba80 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022bb40 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022bc00 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022bcc0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022bd80 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022be40 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022bf00 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022bfc0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022c080 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022c140 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022c200 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022c2c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022c380 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022c440 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000022c500 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000032e700 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000032e7c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000331d40 with size: 0.000183 MiB 00:07:04.002 element at address: 0x2000003352c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000338840 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000033bdc0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000033f340 with size: 0.000183 MiB 00:07:04.002 element at address: 0x2000003428c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000345e40 with size: 0.000183 MiB 00:07:04.002 element at address: 0x2000003493c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000034c940 with size: 0.000183 MiB 00:07:04.002 
element at address: 0x20000034fec0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000353440 with size: 0.000183 MiB 00:07:04.002 element at address: 0x2000003569c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000359f40 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000035d4c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000360a40 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000363fc0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000364180 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000364240 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000364400 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000367a80 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000367c40 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000367d00 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000367ec0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000036b540 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000036b700 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000036b7c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000036b980 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000036f000 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000036f1c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000036f280 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000036f440 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000372ac0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000372c80 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000372d40 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000372f00 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000376580 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000376740 with size: 0.000183 
MiB 00:07:04.002 element at address: 0x200000376800 with size: 0.000183 MiB 00:07:04.002 element at address: 0x2000003769c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000037a040 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000037a200 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000037a2c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000037a480 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000037db00 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000037dcc0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000037dd80 with size: 0.000183 MiB 00:07:04.002 element at address: 0x20000037df40 with size: 0.000183 MiB 00:07:04.002 element at address: 0x2000003815c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000381780 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000381840 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000381a00 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000385080 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000385240 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000385300 with size: 0.000183 MiB 00:07:04.002 element at address: 0x2000003854c0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000388b40 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000388d00 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000388dc0 with size: 0.000183 MiB 00:07:04.002 element at address: 0x200000388f80 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000038c600 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000038c7c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000038c880 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000038ca40 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003900c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200000390280 
with size: 0.000183 MiB 00:07:04.003 element at address: 0x200000390340 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200000390500 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200000393b80 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200000393d40 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200000393e00 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200000393fc0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200000397640 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200000397800 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003978c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200000397a80 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000039b100 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000039b2c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000039b380 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000039b540 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000039ebc0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000039ed80 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000039ee40 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000039f000 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003a2680 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003a2840 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003a2900 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003a2ac0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003a6140 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003a6300 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003a63c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003a6580 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003a9c00 with size: 0.000183 MiB 00:07:04.003 element at 
address: 0x2000003a9dc0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003a9e80 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003aa040 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003ad6c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003ad880 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003ad940 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003adb00 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003b1180 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003b1340 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003b1400 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003b4c40 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003b4e00 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003b8700 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003b88c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003b8980 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003b8b40 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003bc1c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003bc380 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003bc440 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003bc600 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003bfc80 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003bff00 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003c00c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003c3740 with size: 0.000183 MiB 
00:07:04.003 element at address: 0x2000003c3900 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003c7200 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003c73c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003c7480 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003c7640 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003cacc0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003cae80 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003caf40 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003cb100 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003ce780 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003ce940 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003d2240 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003d2400 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003d24c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003d2680 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003d5dc0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003d64c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003d6580 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000003d6880 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000087c800 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000087c8c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000087c980 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000087cb00 with 
size: 0.000183 MiB 00:07:04.003 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:07:04.003 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:07:04.003 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e670c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e67180 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6dd80 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:07:04.003 element at address: 
0x200027e6ec40 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:07:04.003 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:07:04.004 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:07:04.004 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:07:04.004 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:07:04.004 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:07:04.004 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:07:04.004 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:07:04.004 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:07:04.004 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:07:04.004 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:07:04.004 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:07:04.004 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:07:04.004 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:07:04.004 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:07:04.004 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:07:04.004 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:07:04.004 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:07:04.004 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:07:04.004 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:07:04.004 list of memzone associated elements. 
size: 602.308289 MiB 00:07:04.004 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:07:04.004 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:07:04.004 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:07:04.004 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:07:04.004 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:07:04.004 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_3454766_0 00:07:04.004 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:07:04.004 associated memzone info: size: 48.002930 MiB name: MP_evtpool_3454766_0 00:07:04.004 element at address: 0x200003fff380 with size: 48.003052 MiB 00:07:04.004 associated memzone info: size: 48.002930 MiB name: MP_msgpool_3454766_0 00:07:04.004 element at address: 0x2000195be940 with size: 20.255554 MiB 00:07:04.004 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:07:04.004 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:07:04.004 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:07:04.004 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:07:04.004 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_3454766 00:07:04.004 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:07:04.004 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_3454766 00:07:04.004 element at address: 0x20000022c5c0 with size: 1.008118 MiB 00:07:04.004 associated memzone info: size: 1.007996 MiB name: MP_evtpool_3454766 00:07:04.004 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:07:04.004 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:07:04.004 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:07:04.004 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:07:04.004 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:07:04.004 
associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:07:04.004 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:07:04.004 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:07:04.004 element at address: 0x200003eff180 with size: 1.000488 MiB 00:07:04.004 associated memzone info: size: 1.000366 MiB name: RG_ring_0_3454766 00:07:04.004 element at address: 0x200003affc00 with size: 1.000488 MiB 00:07:04.004 associated memzone info: size: 1.000366 MiB name: RG_ring_1_3454766 00:07:04.004 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:07:04.004 associated memzone info: size: 1.000366 MiB name: RG_ring_4_3454766 00:07:04.004 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:07:04.004 associated memzone info: size: 1.000366 MiB name: RG_ring_5_3454766 00:07:04.004 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:07:04.004 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_3454766 00:07:04.004 element at address: 0x20000b27dc40 with size: 0.500488 MiB 00:07:04.004 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:07:04.004 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:07:04.004 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:07:04.004 element at address: 0x20001947c600 with size: 0.250488 MiB 00:07:04.004 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:07:04.004 element at address: 0x20000020a840 with size: 0.125488 MiB 00:07:04.004 associated memzone info: size: 0.125366 MiB name: RG_ring_2_3454766 00:07:04.004 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:07:04.004 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:07:04.004 element at address: 0x200027e67240 with size: 0.023743 MiB 00:07:04.004 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:07:04.004 element at address: 0x200000206580 with size: 0.016113 
MiB 00:07:04.004 associated memzone info: size: 0.015991 MiB name: RG_ring_3_3454766 00:07:04.004 element at address: 0x200027e6d380 with size: 0.002441 MiB 00:07:04.004 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:07:04.004 element at address: 0x2000003d5f80 with size: 0.001282 MiB 00:07:04.004 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:07:04.004 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.0_qat 00:07:04.004 element at address: 0x2000003d2840 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.1_qat 00:07:04.004 element at address: 0x2000003ced80 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.2_qat 00:07:04.004 element at address: 0x2000003cb2c0 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.3_qat 00:07:04.004 element at address: 0x2000003c7800 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.4_qat 00:07:04.004 element at address: 0x2000003c3d40 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.5_qat 00:07:04.004 element at address: 0x2000003c0280 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.6_qat 00:07:04.004 element at address: 0x2000003bc7c0 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.7_qat 00:07:04.004 element at address: 0x2000003b8d00 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.0_qat 00:07:04.004 element at address: 0x2000003b5240 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.1_qat 00:07:04.004 element at address: 0x2000003b1780 with size: 0.000427 MiB 00:07:04.004 
associated memzone info: size: 0.000305 MiB name: 0000:3d:02.2_qat 00:07:04.004 element at address: 0x2000003adcc0 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.3_qat 00:07:04.004 element at address: 0x2000003aa200 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.4_qat 00:07:04.004 element at address: 0x2000003a6740 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.5_qat 00:07:04.004 element at address: 0x2000003a2c80 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.6_qat 00:07:04.004 element at address: 0x20000039f1c0 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.7_qat 00:07:04.004 element at address: 0x20000039b700 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.0_qat 00:07:04.004 element at address: 0x200000397c40 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.1_qat 00:07:04.004 element at address: 0x200000394180 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.2_qat 00:07:04.004 element at address: 0x2000003906c0 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.3_qat 00:07:04.004 element at address: 0x20000038cc00 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.4_qat 00:07:04.004 element at address: 0x200000389140 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.5_qat 00:07:04.004 element at address: 0x200000385680 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.6_qat 00:07:04.004 element at address: 0x200000381bc0 with size: 0.000427 MiB 00:07:04.004 associated memzone 
info: size: 0.000305 MiB name: 0000:3f:01.7_qat 00:07:04.004 element at address: 0x20000037e100 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.0_qat 00:07:04.004 element at address: 0x20000037a640 with size: 0.000427 MiB 00:07:04.004 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.1_qat 00:07:04.005 element at address: 0x200000376b80 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.2_qat 00:07:04.005 element at address: 0x2000003730c0 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.3_qat 00:07:04.005 element at address: 0x20000036f600 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.4_qat 00:07:04.005 element at address: 0x20000036bb40 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.5_qat 00:07:04.005 element at address: 0x200000368080 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.6_qat 00:07:04.005 element at address: 0x2000003645c0 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.7_qat 00:07:04.005 element at address: 0x200000360b00 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:da:01.0_qat 00:07:04.005 element at address: 0x20000035d580 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:da:01.1_qat 00:07:04.005 element at address: 0x20000035a000 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:da:01.2_qat 00:07:04.005 element at address: 0x200000356a80 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:da:01.3_qat 00:07:04.005 element at address: 0x200000353500 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 
MiB name: 0000:da:01.4_qat 00:07:04.005 element at address: 0x20000034ff80 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:da:01.5_qat 00:07:04.005 element at address: 0x20000034ca00 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:da:01.6_qat 00:07:04.005 element at address: 0x200000349480 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:da:01.7_qat 00:07:04.005 element at address: 0x200000345f00 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:da:02.0_qat 00:07:04.005 element at address: 0x200000342980 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:da:02.1_qat 00:07:04.005 element at address: 0x20000033f400 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:da:02.2_qat 00:07:04.005 element at address: 0x20000033be80 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:da:02.3_qat 00:07:04.005 element at address: 0x200000338900 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:da:02.4_qat 00:07:04.005 element at address: 0x200000335380 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:da:02.5_qat 00:07:04.005 element at address: 0x200000331e00 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:da:02.6_qat 00:07:04.005 element at address: 0x20000032e880 with size: 0.000427 MiB 00:07:04.005 associated memzone info: size: 0.000305 MiB name: 0000:da:02.7_qat 00:07:04.005 element at address: 0x2000003d6740 with size: 0.000305 MiB 00:07:04.005 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:07:04.005 element at address: 0x20000022b880 with size: 0.000305 MiB 00:07:04.005 associated memzone info: size: 0.000183 MiB name: 
MP_msgpool_3454766 00:07:04.005 element at address: 0x200000206380 with size: 0.000305 MiB 00:07:04.005 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_3454766 00:07:04.005 element at address: 0x200027e6de40 with size: 0.000305 MiB 00:07:04.005 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:07:04.005 element at address: 0x2000003d6940 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0 00:07:04.005 element at address: 0x2000003d6640 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1 00:07:04.005 element at address: 0x2000003d5e80 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0 00:07:04.005 element at address: 0x2000003d2740 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2 00:07:04.005 element at address: 0x2000003d2580 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3 00:07:04.005 element at address: 0x2000003d2300 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1 00:07:04.005 element at address: 0x2000003cec80 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4 00:07:04.005 element at address: 0x2000003ceac0 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5 00:07:04.005 element at address: 0x2000003ce840 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2 00:07:04.005 element at address: 0x2000003cb1c0 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6 00:07:04.005 element at address: 0x2000003cb000 with size: 0.000244 MiB 00:07:04.005 associated memzone info: 
size: 0.000122 MiB name: rte_cryptodev_data_7 00:07:04.005 element at address: 0x2000003cad80 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3 00:07:04.005 element at address: 0x2000003c7700 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8 00:07:04.005 element at address: 0x2000003c7540 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9 00:07:04.005 element at address: 0x2000003c72c0 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4 00:07:04.005 element at address: 0x2000003c3c40 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10 00:07:04.005 element at address: 0x2000003c3a80 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11 00:07:04.005 element at address: 0x2000003c3800 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5 00:07:04.005 element at address: 0x2000003c0180 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12 00:07:04.005 element at address: 0x2000003bffc0 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13 00:07:04.005 element at address: 0x2000003bfd40 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6 00:07:04.005 element at address: 0x2000003bc6c0 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14 00:07:04.005 element at address: 0x2000003bc500 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15 00:07:04.005 element at address: 0x2000003bc280 with size: 0.000244 
MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7 00:07:04.005 element at address: 0x2000003b8c00 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16 00:07:04.005 element at address: 0x2000003b8a40 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17 00:07:04.005 element at address: 0x2000003b87c0 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8 00:07:04.005 element at address: 0x2000003b5140 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18 00:07:04.005 element at address: 0x2000003b4f80 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19 00:07:04.005 element at address: 0x2000003b4d00 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9 00:07:04.005 element at address: 0x2000003b1680 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20 00:07:04.005 element at address: 0x2000003b14c0 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21 00:07:04.005 element at address: 0x2000003b1240 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10 00:07:04.005 element at address: 0x2000003adbc0 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22 00:07:04.005 element at address: 0x2000003ada00 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23 00:07:04.005 element at address: 0x2000003ad780 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11 00:07:04.005 
element at address: 0x2000003aa100 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24 00:07:04.005 element at address: 0x2000003a9f40 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25 00:07:04.005 element at address: 0x2000003a9cc0 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12 00:07:04.005 element at address: 0x2000003a6640 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26 00:07:04.005 element at address: 0x2000003a6480 with size: 0.000244 MiB 00:07:04.005 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27 00:07:04.005 element at address: 0x2000003a6200 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13 00:07:04.006 element at address: 0x2000003a2b80 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28 00:07:04.006 element at address: 0x2000003a29c0 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29 00:07:04.006 element at address: 0x2000003a2740 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14 00:07:04.006 element at address: 0x20000039f0c0 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30 00:07:04.006 element at address: 0x20000039ef00 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31 00:07:04.006 element at address: 0x20000039ec80 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15 00:07:04.006 element at address: 0x20000039b600 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 
0.000122 MiB name: rte_cryptodev_data_32 00:07:04.006 element at address: 0x20000039b440 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33 00:07:04.006 element at address: 0x20000039b1c0 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16 00:07:04.006 element at address: 0x200000397b40 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34 00:07:04.006 element at address: 0x200000397980 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35 00:07:04.006 element at address: 0x200000397700 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17 00:07:04.006 element at address: 0x200000394080 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36 00:07:04.006 element at address: 0x200000393ec0 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37 00:07:04.006 element at address: 0x200000393c40 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18 00:07:04.006 element at address: 0x2000003905c0 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38 00:07:04.006 element at address: 0x200000390400 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39 00:07:04.006 element at address: 0x200000390180 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19 00:07:04.006 element at address: 0x20000038cb00 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40 00:07:04.006 element at address: 0x20000038c940 with size: 
0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41 00:07:04.006 element at address: 0x20000038c6c0 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20 00:07:04.006 element at address: 0x200000389040 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42 00:07:04.006 element at address: 0x200000388e80 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43 00:07:04.006 element at address: 0x200000388c00 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21 00:07:04.006 element at address: 0x200000385580 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44 00:07:04.006 element at address: 0x2000003853c0 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45 00:07:04.006 element at address: 0x200000385140 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22 00:07:04.006 element at address: 0x200000381ac0 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46 00:07:04.006 element at address: 0x200000381900 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47 00:07:04.006 element at address: 0x200000381680 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23 00:07:04.006 element at address: 0x20000037e000 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48 00:07:04.006 element at address: 0x20000037de40 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49 
00:07:04.006 element at address: 0x20000037dbc0 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24 00:07:04.006 element at address: 0x20000037a540 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50 00:07:04.006 element at address: 0x20000037a380 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51 00:07:04.006 element at address: 0x20000037a100 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25 00:07:04.006 element at address: 0x200000376a80 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52 00:07:04.006 element at address: 0x2000003768c0 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53 00:07:04.006 element at address: 0x200000376640 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26 00:07:04.006 element at address: 0x200000372fc0 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54 00:07:04.006 element at address: 0x200000372e00 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55 00:07:04.006 element at address: 0x200000372b80 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27 00:07:04.006 element at address: 0x20000036f500 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56 00:07:04.006 element at address: 0x20000036f340 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57 00:07:04.006 element at address: 0x20000036f0c0 with size: 0.000244 MiB 00:07:04.006 associated memzone 
info: size: 0.000122 MiB name: rte_compressdev_data_28 00:07:04.006 element at address: 0x20000036ba40 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58 00:07:04.006 element at address: 0x20000036b880 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59 00:07:04.006 element at address: 0x20000036b600 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29 00:07:04.006 element at address: 0x200000367f80 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60 00:07:04.006 element at address: 0x200000367dc0 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61 00:07:04.006 element at address: 0x200000367b40 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30 00:07:04.006 element at address: 0x2000003644c0 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62 00:07:04.006 element at address: 0x200000364300 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63 00:07:04.006 element at address: 0x200000364080 with size: 0.000244 MiB 00:07:04.006 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31 00:07:04.006 element at address: 0x2000003d5d00 with size: 0.000183 MiB 00:07:04.006 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:07:04.006 00:01:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:07:04.006 00:01:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 3454766 00:07:04.006 00:01:50 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 3454766 ']' 00:07:04.006 00:01:50 dpdk_mem_utility -- 
common/autotest_common.sh@952 -- # kill -0 3454766 00:07:04.006 00:01:50 dpdk_mem_utility -- common/autotest_common.sh@953 -- # uname 00:07:04.006 00:01:50 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:04.006 00:01:50 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3454766 00:07:04.006 00:01:50 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:04.006 00:01:50 dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:04.006 00:01:50 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3454766' 00:07:04.006 killing process with pid 3454766 00:07:04.006 00:01:50 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 3454766 00:07:04.006 00:01:50 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 3454766 00:07:04.574 00:07:04.574 real 0m1.731s 00:07:04.574 user 0m1.888s 00:07:04.574 sys 0m0.546s 00:07:04.574 00:01:51 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:04.574 00:01:51 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:04.574 ************************************ 00:07:04.574 END TEST dpdk_mem_utility 00:07:04.574 ************************************ 00:07:04.574 00:01:51 -- common/autotest_common.sh@1142 -- # return 0 00:07:04.574 00:01:51 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:07:04.574 00:01:51 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:04.574 00:01:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:04.574 00:01:51 -- common/autotest_common.sh@10 -- # set +x 00:07:04.574 ************************************ 00:07:04.574 START TEST event 00:07:04.574 ************************************ 00:07:04.574 00:01:51 event -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:07:04.574 * 
Looking for test storage... 00:07:04.574 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:07:04.574 00:01:51 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:04.574 00:01:51 event -- bdev/nbd_common.sh@6 -- # set -e 00:07:04.574 00:01:51 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:04.574 00:01:51 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:04.574 00:01:51 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:04.574 00:01:51 event -- common/autotest_common.sh@10 -- # set +x 00:07:04.574 ************************************ 00:07:04.574 START TEST event_perf 00:07:04.574 ************************************ 00:07:04.574 00:01:51 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:04.574 Running I/O for 1 seconds...[2024-07-16 00:01:51.478380] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:07:04.574 [2024-07-16 00:01:51.478444] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3455007 ] 00:07:04.832 [2024-07-16 00:01:51.606410] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:04.832 [2024-07-16 00:01:51.710159] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:04.832 [2024-07-16 00:01:51.710262] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:04.832 [2024-07-16 00:01:51.710365] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:04.832 [2024-07-16 00:01:51.710366] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.208 Running I/O for 1 seconds... 00:07:06.208 lcore 0: 102781 00:07:06.208 lcore 1: 102783 00:07:06.208 lcore 2: 102786 00:07:06.208 lcore 3: 102783 00:07:06.208 done. 
00:07:06.208 00:07:06.208 real 0m1.354s 00:07:06.208 user 0m4.194s 00:07:06.208 sys 0m0.147s 00:07:06.208 00:01:52 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:06.208 00:01:52 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:07:06.208 ************************************ 00:07:06.208 END TEST event_perf 00:07:06.208 ************************************ 00:07:06.208 00:01:52 event -- common/autotest_common.sh@1142 -- # return 0 00:07:06.208 00:01:52 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:06.208 00:01:52 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:07:06.208 00:01:52 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:06.208 00:01:52 event -- common/autotest_common.sh@10 -- # set +x 00:07:06.208 ************************************ 00:07:06.208 START TEST event_reactor 00:07:06.208 ************************************ 00:07:06.208 00:01:52 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:06.208 [2024-07-16 00:01:52.918289] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:07:06.208 [2024-07-16 00:01:52.918368] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3455204 ] 00:07:06.208 [2024-07-16 00:01:53.057967] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.465 [2024-07-16 00:01:53.162786] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.397 test_start 00:07:07.397 oneshot 00:07:07.397 tick 100 00:07:07.397 tick 100 00:07:07.397 tick 250 00:07:07.397 tick 100 00:07:07.397 tick 100 00:07:07.397 tick 100 00:07:07.397 tick 250 00:07:07.397 tick 500 00:07:07.397 tick 100 00:07:07.397 tick 100 00:07:07.397 tick 250 00:07:07.397 tick 100 00:07:07.397 tick 100 00:07:07.397 test_end 00:07:07.397 00:07:07.397 real 0m1.365s 00:07:07.397 user 0m1.207s 00:07:07.397 sys 0m0.152s 00:07:07.397 00:01:54 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:07.397 00:01:54 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:07:07.397 ************************************ 00:07:07.397 END TEST event_reactor 00:07:07.397 ************************************ 00:07:07.397 00:01:54 event -- common/autotest_common.sh@1142 -- # return 0 00:07:07.397 00:01:54 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:07.397 00:01:54 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:07:07.397 00:01:54 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:07.397 00:01:54 event -- common/autotest_common.sh@10 -- # set +x 00:07:07.397 ************************************ 00:07:07.397 START TEST event_reactor_perf 00:07:07.397 ************************************ 00:07:07.397 00:01:54 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:07.655 [2024-07-16 00:01:54.363069] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:07:07.655 [2024-07-16 00:01:54.363130] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3455404 ] 00:07:07.655 [2024-07-16 00:01:54.491878] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.655 [2024-07-16 00:01:54.591086] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.030 test_start 00:07:09.030 test_end 00:07:09.030 Performance: 327195 events per second 00:07:09.030 00:07:09.030 real 0m1.349s 00:07:09.031 user 0m1.208s 00:07:09.031 sys 0m0.135s 00:07:09.031 00:01:55 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:09.031 00:01:55 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:07:09.031 ************************************ 00:07:09.031 END TEST event_reactor_perf 00:07:09.031 ************************************ 00:07:09.031 00:01:55 event -- common/autotest_common.sh@1142 -- # return 0 00:07:09.031 00:01:55 event -- event/event.sh@49 -- # uname -s 00:07:09.031 00:01:55 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:07:09.031 00:01:55 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:09.031 00:01:55 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:09.031 00:01:55 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:09.031 00:01:55 event -- common/autotest_common.sh@10 -- # set +x 00:07:09.031 ************************************ 00:07:09.031 START TEST event_scheduler 00:07:09.031 ************************************ 
00:07:09.031 00:01:55 event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:09.031 * Looking for test storage... 00:07:09.031 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:07:09.031 00:01:55 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:07:09.031 00:01:55 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=3455672 00:07:09.031 00:01:55 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:07:09.031 00:01:55 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:07:09.031 00:01:55 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 3455672 00:07:09.031 00:01:55 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 3455672 ']' 00:07:09.031 00:01:55 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:09.031 00:01:55 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:09.031 00:01:55 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:09.031 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:09.031 00:01:55 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:09.031 00:01:55 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:09.031 [2024-07-16 00:01:55.941947] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:07:09.031 [2024-07-16 00:01:55.942022] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3455672 ] 00:07:09.290 [2024-07-16 00:01:56.136554] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:09.549 [2024-07-16 00:01:56.315677] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.549 [2024-07-16 00:01:56.315766] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:09.549 [2024-07-16 00:01:56.315867] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:09.549 [2024-07-16 00:01:56.315876] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:10.117 00:01:56 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:10.117 00:01:56 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0 00:07:10.117 00:01:56 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:07:10.117 00:01:56 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:10.117 00:01:56 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:10.117 [2024-07-16 00:01:56.895232] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:07:10.117 [2024-07-16 00:01:56.895289] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor 00:07:10.117 [2024-07-16 00:01:56.895324] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:07:10.117 [2024-07-16 00:01:56.895350] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:07:10.117 [2024-07-16 00:01:56.895375] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:07:10.117 00:01:56 
event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:10.117 00:01:56 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init
00:07:10.117 00:01:56 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:10.117 00:01:56 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:07:10.117 [2024-07-16 00:01:57.027889] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started.
00:07:10.117 00:01:57 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:10.117 00:01:57 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread
00:07:10.117 00:01:57 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:07:10.117 00:01:57 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:10.117 00:01:57 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:07:10.376 ************************************
00:07:10.376 START TEST scheduler_create_thread
00:07:10.376 ************************************
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:10.376 2
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:10.376 3
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:10.376 4
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:10.376 5
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:10.376 6
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:10.376 7
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:10.376 8
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:10.376 9
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:10.376 10
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:10.376 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:10.942 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:10.942 00:01:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11
00:07:10.942 00:01:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50
00:07:10.942 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:10.942 00:01:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:11.876 00:01:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:11.876 00:01:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100
00:07:11.876 00:01:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:11.876 00:01:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:12.845 00:01:59 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:12.845 00:01:59 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12
00:07:12.845 00:01:59 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12
00:07:12.845 00:01:59 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:12.845 00:01:59 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:13.411 00:02:00 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:13.411
00:07:13.411 real 0m3.231s
00:07:13.411 user 0m0.024s
00:07:13.411 sys 0m0.008s
00:07:13.411 00:02:00 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:13.411 00:02:00 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:13.411 ************************************
00:07:13.411 END TEST scheduler_create_thread
00:07:13.411 ************************************
00:07:13.411 00:02:00 event.event_scheduler -- common/autotest_common.sh@1142 -- # return 0
00:07:13.411 00:02:00 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT
00:07:13.411 00:02:00 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 3455672
00:07:13.411 00:02:00 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 3455672 ']'
00:07:13.411 00:02:00 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 3455672
00:07:13.411 00:02:00 event.event_scheduler -- common/autotest_common.sh@953 -- # uname
00:07:13.411 00:02:00 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:07:13.411 00:02:00 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3455672
00:07:13.668 00:02:00 event.event_scheduler -- common/autotest_common.sh@954 -- # process_name=reactor_2
00:07:13.668 00:02:00 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']'
00:07:13.668 00:02:00 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3455672'
00:07:13.668 killing process with pid 3455672
00:07:13.668 00:02:00 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 3455672
00:07:13.668 00:02:00 event.event_scheduler -- common/autotest_common.sh@972 -- # wait 3455672
00:07:13.927 [2024-07-16 00:02:00.683238] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped.
00:07:14.186
00:07:14.186 real 0m5.301s
00:07:14.186 user 0m10.155s
00:07:14.186 sys 0m0.617s
00:07:14.186 00:02:01 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:14.186 00:02:01 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:07:14.186 ************************************
00:07:14.186 END TEST event_scheduler
00:07:14.186 ************************************
00:07:14.186 00:02:01 event -- common/autotest_common.sh@1142 -- # return 0
00:07:14.186 00:02:01 event -- event/event.sh@51 -- # modprobe -n nbd
00:07:14.186 00:02:01 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test
00:07:14.186 00:02:01 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:07:14.186 00:02:01 event -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:14.186 00:02:01 event -- common/autotest_common.sh@10 -- # set +x
00:07:14.446 ************************************
00:07:14.446 START TEST app_repeat
00:07:14.446 ************************************
00:07:14.446 00:02:01 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test
00:07:14.446 00:02:01 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:07:14.446 00:02:01 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:07:14.446 00:02:01 event.app_repeat -- event/event.sh@13 -- # local nbd_list
00:07:14.446 00:02:01 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1')
00:07:14.446 00:02:01 event.app_repeat -- event/event.sh@14 -- # local bdev_list
00:07:14.446 00:02:01 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4
00:07:14.446 00:02:01 event.app_repeat -- event/event.sh@17 -- # modprobe nbd
00:07:14.446 00:02:01 event.app_repeat -- event/event.sh@19 -- # repeat_pid=3456382
00:07:14.446 00:02:01 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4
00:07:14.446 00:02:01 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT
00:07:14.446 00:02:01 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 3456382'
00:07:14.446 Process app_repeat pid: 3456382
00:07:14.446 00:02:01 event.app_repeat -- event/event.sh@23 -- # for i in {0..2}
00:07:14.446 00:02:01 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0'
00:07:14.446 spdk_app_start Round 0
00:07:14.446 00:02:01 event.app_repeat -- event/event.sh@25 -- # waitforlisten 3456382 /var/tmp/spdk-nbd.sock
00:07:14.446 00:02:01 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 3456382 ']'
00:07:14.446 00:02:01 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:07:14.446 00:02:01 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100
00:07:14.446 00:02:01 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
00:07:14.446 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:07:14.446 00:02:01 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable
00:07:14.446 00:02:01 event.app_repeat -- common/autotest_common.sh@10 -- # set +x
00:07:14.446 [2024-07-16 00:02:01.213886] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization...
00:07:14.446 [2024-07-16 00:02:01.213961] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3456382 ]
00:07:14.446 [2024-07-16 00:02:01.330364] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:07:14.704 [2024-07-16 00:02:01.437157] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:07:14.704 [2024-07-16 00:02:01.437161] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:15.271 00:02:02 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:07:15.271 00:02:02 event.app_repeat -- common/autotest_common.sh@862 -- # return 0
00:07:15.271 00:02:02 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:07:15.271 Malloc0
00:07:15.529 00:02:02 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:07:15.529 Malloc1
00:07:15.529 00:02:02 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:07:15.529 00:02:02 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:07:15.529 00:02:02 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1')
00:07:15.529 00:02:02 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list
00:07:15.529 00:02:02 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:07:15.529 00:02:02 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list
00:07:15.529 00:02:02 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:07:15.529 00:02:02 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:07:15.529 00:02:02 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1')
00:07:15.529 00:02:02 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list
00:07:15.529 00:02:02 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:07:15.529 00:02:02 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list
00:07:15.529 00:02:02 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i
00:07:15.529 00:02:02 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:07:15.529 00:02:02 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:07:15.529 00:02:02 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
00:07:15.788 /dev/nbd0
00:07:15.788 00:02:02 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:07:15.788 00:02:02 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:07:15.788 00:02:02 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:07:15.788 00:02:02 event.app_repeat -- common/autotest_common.sh@867 -- # local i
00:07:15.788 00:02:02 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:07:15.788 00:02:02 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:07:15.788 00:02:02 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:07:15.788 00:02:02 event.app_repeat -- common/autotest_common.sh@871 -- # break
00:07:15.788 00:02:02 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:07:15.788 00:02:02 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:07:15.788 00:02:02 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:07:15.788 1+0 records in
00:07:15.788 1+0 records out
00:07:15.788 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237829 s, 17.2 MB/s
00:07:15.788 00:02:02 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:07:15.788 00:02:02 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096
00:07:15.788 00:02:02 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:07:15.788 00:02:02 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:07:15.788 00:02:02 event.app_repeat -- common/autotest_common.sh@887 -- # return 0
00:07:15.788 00:02:02 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:07:15.788 00:02:02 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:07:15.788 00:02:02 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1
00:07:16.047 /dev/nbd1
00:07:16.047 00:02:02 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:07:16.047 00:02:02 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:07:16.047 00:02:02 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:07:16.047 00:02:02 event.app_repeat -- common/autotest_common.sh@867 -- # local i
00:07:16.047 00:02:02 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:07:16.047 00:02:02 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:07:16.047 00:02:02 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:07:16.047 00:02:02 event.app_repeat -- common/autotest_common.sh@871 -- # break
00:07:16.047 00:02:02 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:07:16.047 00:02:02 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:07:16.047 00:02:02 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:07:16.047 1+0 records in
00:07:16.047 1+0 records out
00:07:16.047 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284334 s, 14.4 MB/s
00:07:16.047 00:02:02 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:07:16.047 00:02:02 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096
00:07:16.047 00:02:02 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:07:16.047 00:02:02 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:07:16.047 00:02:02 event.app_repeat -- common/autotest_common.sh@887 -- # return 0
00:07:16.047 00:02:02 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:07:16.047 00:02:02 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:07:16.047 00:02:02 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:07:16.047 00:02:02 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:07:16.047 00:02:02 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:07:16.305 00:02:03 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:07:16.305 {
00:07:16.305 "nbd_device": "/dev/nbd0",
00:07:16.305 "bdev_name": "Malloc0"
00:07:16.305 },
00:07:16.305 {
00:07:16.305 "nbd_device": "/dev/nbd1",
00:07:16.305 "bdev_name": "Malloc1"
00:07:16.305 }
00:07:16.305 ]'
00:07:16.305 00:02:03 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[
00:07:16.305 {
00:07:16.305 "nbd_device": "/dev/nbd0",
00:07:16.305 "bdev_name": "Malloc0"
00:07:16.305 },
00:07:16.305 {
00:07:16.305 "nbd_device": "/dev/nbd1",
00:07:16.305 "bdev_name": "Malloc1"
00:07:16.305 }
00:07:16.305 ]'
00:07:16.305 00:02:03 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:07:16.305 00:02:03 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:07:16.305 /dev/nbd1'
00:07:16.305 00:02:03 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:07:16.305 /dev/nbd1'
00:07:16.305 00:02:03 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:07:16.305 00:02:03 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2
00:07:16.305 00:02:03 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2
00:07:16.305 00:02:03 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2
00:07:16.305 00:02:03 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']'
00:07:16.305 00:02:03 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write
00:07:16.305 00:02:03 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:07:16.305 00:02:03 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list
00:07:16.305 00:02:03 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write
00:07:16.305 00:02:03 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:07:16.305 00:02:03 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:07:16.305 00:02:03 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256
00:07:16.305 256+0 records in
00:07:16.305 256+0 records out
00:07:16.305 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106956 s, 98.0 MB/s
00:07:16.305 00:02:03 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:07:16.305 00:02:03 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:07:16.563 256+0 records in
00:07:16.563 256+0 records out
00:07:16.563 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0299833 s, 35.0 MB/s
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:07:16.563 256+0 records in
00:07:16.563 256+0 records out
00:07:16.563 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0207476 s, 50.5 MB/s
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']'
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']'
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:07:16.563 00:02:03 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:07:16.821 00:02:03 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:07:16.821 00:02:03 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:07:16.821 00:02:03 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:07:16.821 00:02:03 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:07:16.821 00:02:03 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:07:16.821 00:02:03 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:07:16.821 00:02:03 event.app_repeat -- bdev/nbd_common.sh@41 -- # break
00:07:16.821 00:02:03 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0
00:07:16.821 00:02:03 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:07:16.821 00:02:03 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:07:17.079 00:02:03 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:07:17.080 00:02:03 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:07:17.080 00:02:03 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:07:17.080 00:02:03 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:07:17.080 00:02:03 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:07:17.080 00:02:03 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:07:17.080 00:02:03 event.app_repeat -- bdev/nbd_common.sh@41 -- # break
00:07:17.080 00:02:03 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0
00:07:17.080 00:02:03 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:07:17.080 00:02:03 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:07:17.080 00:02:03 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:07:17.338 00:02:04 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:07:17.338 00:02:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]'
00:07:17.338 00:02:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:07:17.338 00:02:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:07:17.338 00:02:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo ''
00:07:17.338 00:02:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:07:17.338 00:02:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # true
00:07:17.338 00:02:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0
00:07:17.338 00:02:04 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0
00:07:17.338 00:02:04 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0
00:07:17.338 00:02:04 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:07:17.338 00:02:04 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0
00:07:17.338 00:02:04 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
00:07:17.597 00:02:04 event.app_repeat -- event/event.sh@35 -- # sleep 3
00:07:17.855 [2024-07-16 00:02:04.706450] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:07:17.855 [2024-07-16 00:02:04.804852] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:07:17.855 [2024-07-16 00:02:04.804856] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:18.114 [2024-07-16 00:02:04.850232] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered.
00:07:18.114 [2024-07-16 00:02:04.850285] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered.
00:07:20.641 00:02:07 event.app_repeat -- event/event.sh@23 -- # for i in {0..2}
00:07:20.641 00:02:07 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1'
00:07:20.641 spdk_app_start Round 1
00:07:20.641 00:02:07 event.app_repeat -- event/event.sh@25 -- # waitforlisten 3456382 /var/tmp/spdk-nbd.sock
00:07:20.641 00:02:07 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 3456382 ']'
00:07:20.641 00:02:07 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:07:20.641 00:02:07 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100
00:07:20.641 00:02:07 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
00:07:20.641 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:07:20.641 00:02:07 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable
00:07:20.641 00:02:07 event.app_repeat -- common/autotest_common.sh@10 -- # set +x
00:07:20.898 00:02:07 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:07:20.898 00:02:07 event.app_repeat -- common/autotest_common.sh@862 -- # return 0
00:07:20.898 00:02:07 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:07:20.898 Malloc0
00:07:21.157 00:02:07 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:07:21.157 Malloc1
00:07:21.157 00:02:08 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:07:21.157 00:02:08 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:07:21.157 00:02:08 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1')
00:07:21.157 00:02:08 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list
00:07:21.157 00:02:08 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:07:21.157 00:02:08 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list
00:07:21.157 00:02:08 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:07:21.157 00:02:08 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:07:21.157 00:02:08 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1')
00:07:21.157 00:02:08 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list
00:07:21.157 00:02:08 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:07:21.157 00:02:08 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list
00:07:21.157 00:02:08 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i
00:07:21.157 00:02:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:07:21.157 00:02:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:07:21.157 00:02:08 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
00:07:21.414 /dev/nbd0
00:07:21.414 00:02:08 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:07:21.414 00:02:08 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:07:21.414 00:02:08 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:07:21.414 00:02:08 event.app_repeat -- common/autotest_common.sh@867 -- # local i
00:07:21.414 00:02:08 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:07:21.414 00:02:08 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:07:21.414 00:02:08 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:07:21.414 00:02:08 event.app_repeat -- common/autotest_common.sh@871 -- # break
00:07:21.414 00:02:08 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:07:21.414 00:02:08 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:07:21.414 00:02:08 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:07:21.414 1+0 records in
00:07:21.414 1+0 records out
00:07:21.414 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000234897 s, 17.4 MB/s
00:07:21.414 00:02:08 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:07:21.414 00:02:08 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096
00:07:21.414 00:02:08 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:07:21.694 00:02:08 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:07:21.694 00:02:08 event.app_repeat -- common/autotest_common.sh@887 -- # return 0
00:07:21.694 00:02:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:07:21.694 00:02:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:07:21.694 00:02:08 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1
00:07:21.694 /dev/nbd1
00:07:21.694 00:02:08 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:07:21.694 00:02:08 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:07:21.694 00:02:08 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:07:21.694 00:02:08 event.app_repeat -- common/autotest_common.sh@867 -- # local i
00:07:21.694 00:02:08 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:07:21.694 00:02:08 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:07:21.694 00:02:08 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:07:21.694 00:02:08 event.app_repeat -- common/autotest_common.sh@871 -- # break
00:07:21.694 00:02:08 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:07:21.694 00:02:08 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:07:21.694 00:02:08 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:07:21.694 1+0 records in
00:07:21.694 1+0 records out
00:07:21.694 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000255324 s, 16.0 MB/s
00:07:21.694 00:02:08 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:07:21.952 00:02:08 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096
00:07:21.952 00:02:08 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:07:21.952 00:02:08 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:07:21.952 00:02:08 event.app_repeat -- common/autotest_common.sh@887 -- # return 0
00:07:21.952 00:02:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:07:21.952 00:02:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:07:21.952 00:02:08 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:07:21.952 00:02:08 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:07:21.952 00:02:08 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:07:22.210 00:02:08 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:07:22.210 {
00:07:22.210 "nbd_device": "/dev/nbd0",
00:07:22.210 "bdev_name": "Malloc0"
00:07:22.210 },
00:07:22.210 {
00:07:22.210 "nbd_device": "/dev/nbd1",
00:07:22.211 "bdev_name": "Malloc1"
00:07:22.211 }
00:07:22.211 ]'
00:07:22.211 00:02:08 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[
00:07:22.211 {
00:07:22.211 "nbd_device": "/dev/nbd0",
00:07:22.211 "bdev_name": "Malloc0"
00:07:22.211 },
00:07:22.211 {
00:07:22.211 "nbd_device": "/dev/nbd1",
00:07:22.211 "bdev_name": "Malloc1"
00:07:22.211 }
00:07:22.211 ]'
00:07:22.211 00:02:08 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:07:22.211 00:02:08 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:07:22.211 /dev/nbd1'
00:07:22.211 00:02:08 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:07:22.211 /dev/nbd1'
00:02:08 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:07:22.211 00:02:08 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2
00:07:22.211 00:02:08 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2
00:07:22.211 00:02:08 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2
00:07:22.211 00:02:08 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']'
00:07:22.211 00:02:08 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write
00:07:22.211 00:02:08 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:07:22.211 00:02:08 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list
00:07:22.211 00:02:08 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write
00:07:22.211 00:02:08 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:07:22.211 00:02:08 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:07:22.211 00:02:08 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256
00:07:22.211 256+0 records in
00:07:22.211 256+0 records out
00:07:22.211 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00859249 s, 122 MB/s
00:07:22.211 00:02:08 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:07:22.211 00:02:08 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:07:22.211 256+0 records in
00:07:22.211 256+0 records out
00:07:22.211 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0194188 s, 54.0 MB/s
00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:22.211 256+0 records in 00:07:22.211 256+0 records out 00:07:22.211 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0201211 s, 52.1 MB/s 00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:22.211 00:02:09 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:22.469 00:02:09 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:22.469 00:02:09 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:22.469 00:02:09 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:22.469 00:02:09 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:22.469 00:02:09 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:22.469 00:02:09 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:22.469 00:02:09 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:22.469 00:02:09 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:22.469 00:02:09 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:22.469 00:02:09 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:22.728 00:02:09 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:22.728 00:02:09 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:22.728 00:02:09 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:22.728 00:02:09 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:22.728 00:02:09 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:22.728 00:02:09 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:22.728 00:02:09 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:07:22.728 00:02:09 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:22.728 00:02:09 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:22.728 00:02:09 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:22.728 00:02:09 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:22.986 00:02:09 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:22.986 00:02:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:22.986 00:02:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:22.986 00:02:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:22.986 00:02:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:22.986 00:02:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:23.245 00:02:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:23.245 00:02:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:23.245 00:02:09 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:23.245 00:02:09 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:23.245 00:02:09 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:23.245 00:02:09 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:23.245 00:02:09 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:23.245 00:02:10 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:23.503 [2024-07-16 00:02:10.392212] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:23.762 [2024-07-16 00:02:10.491811] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:23.762 [2024-07-16 00:02:10.491816] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:07:23.762 [2024-07-16 00:02:10.545425] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:23.762 [2024-07-16 00:02:10.545479] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:26.291 00:02:13 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:26.291 00:02:13 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:26.291 spdk_app_start Round 2 00:07:26.291 00:02:13 event.app_repeat -- event/event.sh@25 -- # waitforlisten 3456382 /var/tmp/spdk-nbd.sock 00:07:26.291 00:02:13 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 3456382 ']' 00:07:26.291 00:02:13 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:26.291 00:02:13 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:26.291 00:02:13 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:26.291 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:26.291 00:02:13 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:26.292 00:02:13 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:26.549 00:02:13 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:26.549 00:02:13 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:26.549 00:02:13 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:26.807 Malloc0 00:07:26.807 00:02:13 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:26.807 Malloc1 00:07:27.066 00:02:13 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:27.066 00:02:13 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:27.066 00:02:13 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:27.066 00:02:13 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:27.066 00:02:13 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:27.066 00:02:13 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:27.066 00:02:13 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:27.066 00:02:13 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:27.066 00:02:13 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:27.066 00:02:13 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:27.066 00:02:13 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:27.066 00:02:13 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:07:27.066 00:02:13 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:27.066 00:02:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:27.066 00:02:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:27.066 00:02:13 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:27.066 /dev/nbd0 00:07:27.066 00:02:13 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:27.066 00:02:13 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:27.066 00:02:13 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:27.066 00:02:13 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:27.066 00:02:13 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:27.066 00:02:13 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:27.066 00:02:13 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:27.066 00:02:13 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:27.066 00:02:13 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:27.066 00:02:13 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:27.066 00:02:13 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:27.066 1+0 records in 00:07:27.066 1+0 records out 00:07:27.066 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240702 s, 17.0 MB/s 00:07:27.066 00:02:13 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:27.066 00:02:13 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:27.066 00:02:13 event.app_repeat -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:27.067 00:02:13 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:27.067 00:02:13 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:27.067 00:02:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:27.067 00:02:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:27.067 00:02:13 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:27.325 /dev/nbd1 00:07:27.325 00:02:14 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:27.325 00:02:14 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:27.325 00:02:14 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:27.325 00:02:14 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:27.325 00:02:14 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:27.325 00:02:14 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:27.325 00:02:14 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:27.325 00:02:14 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:27.325 00:02:14 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:27.325 00:02:14 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:27.325 00:02:14 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:27.325 1+0 records in 00:07:27.325 1+0 records out 00:07:27.325 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000260913 s, 15.7 MB/s 00:07:27.584 00:02:14 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:27.584 00:02:14 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:27.584 00:02:14 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:27.584 00:02:14 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:27.584 00:02:14 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:27.584 00:02:14 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:27.584 00:02:14 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:27.584 00:02:14 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:27.584 00:02:14 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:27.584 00:02:14 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:27.844 { 00:07:27.844 "nbd_device": "/dev/nbd0", 00:07:27.844 "bdev_name": "Malloc0" 00:07:27.844 }, 00:07:27.844 { 00:07:27.844 "nbd_device": "/dev/nbd1", 00:07:27.844 "bdev_name": "Malloc1" 00:07:27.844 } 00:07:27.844 ]' 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:27.844 { 00:07:27.844 "nbd_device": "/dev/nbd0", 00:07:27.844 "bdev_name": "Malloc0" 00:07:27.844 }, 00:07:27.844 { 00:07:27.844 "nbd_device": "/dev/nbd1", 00:07:27.844 "bdev_name": "Malloc1" 00:07:27.844 } 00:07:27.844 ]' 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:27.844 /dev/nbd1' 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:27.844 /dev/nbd1' 00:07:27.844 
00:02:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:27.844 256+0 records in 00:07:27.844 256+0 records out 00:07:27.844 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0109921 s, 95.4 MB/s 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:27.844 256+0 records in 00:07:27.844 256+0 records out 00:07:27.844 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0195913 s, 53.5 MB/s 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:27.844 00:02:14 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:27.844 256+0 records in 00:07:27.844 256+0 records out 00:07:27.844 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0305162 s, 34.4 MB/s 00:07:27.845 00:02:14 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:27.845 00:02:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:27.845 00:02:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:27.845 00:02:14 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:27.845 00:02:14 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:27.845 00:02:14 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:27.845 00:02:14 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:27.845 00:02:14 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:27.845 00:02:14 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:27.845 00:02:14 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:27.845 00:02:14 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:27.845 00:02:14 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:27.845 00:02:14 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:27.845 00:02:14 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:27.845 00:02:14 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:07:27.845 00:02:14 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:27.845 00:02:14 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:27.845 00:02:14 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:27.845 00:02:14 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:28.116 00:02:14 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:28.116 00:02:14 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:28.116 00:02:14 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:28.116 00:02:14 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:28.116 00:02:14 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:28.116 00:02:14 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:28.116 00:02:14 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:28.116 00:02:14 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:28.116 00:02:14 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:28.116 00:02:14 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:28.426 00:02:15 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:28.426 00:02:15 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:28.426 00:02:15 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:28.426 00:02:15 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:28.426 00:02:15 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:28.426 00:02:15 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:28.426 00:02:15 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:07:28.426 00:02:15 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:28.426 00:02:15 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:28.426 00:02:15 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:28.426 00:02:15 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:28.685 00:02:15 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:28.685 00:02:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:28.685 00:02:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:28.685 00:02:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:28.685 00:02:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:28.685 00:02:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:28.685 00:02:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:28.685 00:02:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:28.685 00:02:15 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:28.685 00:02:15 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:28.685 00:02:15 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:28.685 00:02:15 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:28.685 00:02:15 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:28.943 00:02:15 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:29.202 [2024-07-16 00:02:16.108362] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:29.461 [2024-07-16 00:02:16.207815] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:29.461 [2024-07-16 00:02:16.207820] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:07:29.461 [2024-07-16 00:02:16.260277] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:29.461 [2024-07-16 00:02:16.260326] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:31.995 00:02:18 event.app_repeat -- event/event.sh@38 -- # waitforlisten 3456382 /var/tmp/spdk-nbd.sock 00:07:31.995 00:02:18 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 3456382 ']' 00:07:31.995 00:02:18 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:31.995 00:02:18 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:31.995 00:02:18 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:31.995 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:31.995 00:02:18 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:31.995 00:02:18 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:32.253 00:02:19 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:32.253 00:02:19 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:32.253 00:02:19 event.app_repeat -- event/event.sh@39 -- # killprocess 3456382 00:07:32.253 00:02:19 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 3456382 ']' 00:07:32.253 00:02:19 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 3456382 00:07:32.253 00:02:19 event.app_repeat -- common/autotest_common.sh@953 -- # uname 00:07:32.253 00:02:19 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:32.253 00:02:19 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3456382 00:07:32.253 00:02:19 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:32.253 00:02:19 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:32.253 00:02:19 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3456382' 00:07:32.253 killing process with pid 3456382 00:07:32.253 00:02:19 event.app_repeat -- common/autotest_common.sh@967 -- # kill 3456382 00:07:32.253 00:02:19 event.app_repeat -- common/autotest_common.sh@972 -- # wait 3456382 00:07:32.512 spdk_app_start is called in Round 0. 00:07:32.512 Shutdown signal received, stop current app iteration 00:07:32.512 Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 reinitialization... 00:07:32.512 spdk_app_start is called in Round 1. 00:07:32.512 Shutdown signal received, stop current app iteration 00:07:32.512 Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 reinitialization... 00:07:32.512 spdk_app_start is called in Round 2. 
00:07:32.512 Shutdown signal received, stop current app iteration 00:07:32.512 Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 reinitialization... 00:07:32.512 spdk_app_start is called in Round 3. 00:07:32.512 Shutdown signal received, stop current app iteration 00:07:32.512 00:02:19 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:32.512 00:02:19 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:32.512 00:07:32.512 real 0m18.238s 00:07:32.512 user 0m39.202s 00:07:32.512 sys 0m3.698s 00:07:32.512 00:02:19 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:32.512 00:02:19 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:32.512 ************************************ 00:07:32.512 END TEST app_repeat 00:07:32.512 ************************************ 00:07:32.512 00:02:19 event -- common/autotest_common.sh@1142 -- # return 0 00:07:32.512 00:02:19 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:32.512 00:07:32.512 real 0m28.149s 00:07:32.512 user 0m56.172s 00:07:32.512 sys 0m5.128s 00:07:32.512 00:02:19 event -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:32.512 00:02:19 event -- common/autotest_common.sh@10 -- # set +x 00:07:32.512 ************************************ 00:07:32.512 END TEST event 00:07:32.512 ************************************ 00:07:32.772 00:02:19 -- common/autotest_common.sh@1142 -- # return 0 00:07:32.772 00:02:19 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:32.772 00:02:19 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:32.772 00:02:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:32.772 00:02:19 -- common/autotest_common.sh@10 -- # set +x 00:07:32.772 ************************************ 00:07:32.772 START TEST thread 00:07:32.772 ************************************ 00:07:32.772 00:02:19 thread -- common/autotest_common.sh@1123 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:32.772 * Looking for test storage... 00:07:32.772 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:07:32.772 00:02:19 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:32.772 00:02:19 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:32.772 00:02:19 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:32.772 00:02:19 thread -- common/autotest_common.sh@10 -- # set +x 00:07:32.772 ************************************ 00:07:32.772 START TEST thread_poller_perf 00:07:32.772 ************************************ 00:07:32.772 00:02:19 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:32.772 [2024-07-16 00:02:19.721586] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:07:32.772 [2024-07-16 00:02:19.721720] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3459073 ] 00:07:33.031 [2024-07-16 00:02:19.914230] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.290 [2024-07-16 00:02:20.018791] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.290 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:34.226 ====================================== 00:07:34.226 busy:2312764148 (cyc) 00:07:34.226 total_run_count: 257000 00:07:34.226 tsc_hz: 2300000000 (cyc) 00:07:34.226 ====================================== 00:07:34.226 poller_cost: 8999 (cyc), 3912 (nsec) 00:07:34.226 00:07:34.226 real 0m1.439s 00:07:34.226 user 0m1.242s 00:07:34.226 sys 0m0.190s 00:07:34.226 00:02:21 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:34.226 00:02:21 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:34.226 ************************************ 00:07:34.226 END TEST thread_poller_perf 00:07:34.226 ************************************ 00:07:34.226 00:02:21 thread -- common/autotest_common.sh@1142 -- # return 0 00:07:34.226 00:02:21 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:34.226 00:02:21 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:34.226 00:02:21 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:34.226 00:02:21 thread -- common/autotest_common.sh@10 -- # set +x 00:07:34.485 ************************************ 00:07:34.485 START TEST thread_poller_perf 00:07:34.485 ************************************ 00:07:34.485 00:02:21 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:34.485 [2024-07-16 00:02:21.238637] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:07:34.485 [2024-07-16 00:02:21.238700] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3459274 ] 00:07:34.485 [2024-07-16 00:02:21.366756] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.744 [2024-07-16 00:02:21.470663] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.744 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:07:35.680 ====================================== 00:07:35.680 busy:2302576136 (cyc) 00:07:35.680 total_run_count: 3491000 00:07:35.680 tsc_hz: 2300000000 (cyc) 00:07:35.680 ====================================== 00:07:35.680 poller_cost: 659 (cyc), 286 (nsec) 00:07:35.680 00:07:35.680 real 0m1.357s 00:07:35.680 user 0m1.214s 00:07:35.680 sys 0m0.137s 00:07:35.680 00:02:22 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:35.680 00:02:22 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:35.680 ************************************ 00:07:35.680 END TEST thread_poller_perf 00:07:35.680 ************************************ 00:07:35.680 00:02:22 thread -- common/autotest_common.sh@1142 -- # return 0 00:07:35.680 00:02:22 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:07:35.680 00:07:35.680 real 0m3.066s 00:07:35.680 user 0m2.548s 00:07:35.680 sys 0m0.524s 00:07:35.680 00:02:22 thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:35.680 00:02:22 thread -- common/autotest_common.sh@10 -- # set +x 00:07:35.680 ************************************ 00:07:35.680 END TEST thread 00:07:35.680 ************************************ 00:07:35.939 00:02:22 -- common/autotest_common.sh@1142 -- # return 0 00:07:35.939 00:02:22 -- spdk/autotest.sh@183 -- # run_test accel 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:35.939 00:02:22 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:35.939 00:02:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:35.939 00:02:22 -- common/autotest_common.sh@10 -- # set +x 00:07:35.939 ************************************ 00:07:35.939 START TEST accel 00:07:35.939 ************************************ 00:07:35.939 00:02:22 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:35.939 * Looking for test storage... 00:07:35.939 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:07:35.939 00:02:22 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:07:35.939 00:02:22 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:07:35.939 00:02:22 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:35.939 00:02:22 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=3459505 00:07:35.939 00:02:22 accel -- accel/accel.sh@63 -- # waitforlisten 3459505 00:07:35.939 00:02:22 accel -- common/autotest_common.sh@829 -- # '[' -z 3459505 ']' 00:07:35.939 00:02:22 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:35.939 00:02:22 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:35.939 00:02:22 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:35.939 00:02:22 accel -- accel/accel.sh@61 -- # build_accel_config 00:07:35.939 00:02:22 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:35.939 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
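As an aside, the poller_cost figures printed by the two thread_poller_perf runs above can be re-derived from the reported busy cycles, total_run_count, and tsc_hz. A minimal sketch, assuming the cost in cycles is truncating integer division of busy cycles by run count, with the nanosecond figure converted via the TSC frequency (this formula is inferred from the numbers in the log, not taken from the poller_perf source):

```shell
#!/bin/bash
# Re-derive poller_cost from the thread_poller_perf output above.
# Assumption: cost (cyc) = busy cycles / total_run_count, and
# cost (nsec) = cyc * 1e9 / tsc_hz, both with truncating division.
poller_cost() {
    local busy=$1 runs=$2 tsc_hz=$3
    local cyc=$(( busy / runs ))
    local nsec=$(( cyc * 1000000000 / tsc_hz ))
    echo "poller_cost: ${cyc} (cyc), ${nsec} (nsec)"
}

# First run (-l 1): busy:2312764148 cyc, total_run_count: 257000
poller_cost 2312764148 257000 2300000000   # poller_cost: 8999 (cyc), 3912 (nsec)
# Second run (-l 0): busy:2302576136 cyc, total_run_count: 3491000
poller_cost 2302576136 3491000 2300000000  # poller_cost: 659 (cyc), 286 (nsec)
```

Both results match the reported 8999/3912 and 659/286 figures; the zero-period run amortizes fixed poller overhead over roughly 13x more iterations, hence the much lower per-poll cost.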
00:07:35.939 00:02:22 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:35.939 00:02:22 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:35.939 00:02:22 accel -- common/autotest_common.sh@10 -- # set +x 00:07:35.939 00:02:22 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:35.939 00:02:22 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:35.939 00:02:22 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:35.939 00:02:22 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:35.939 00:02:22 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:35.939 00:02:22 accel -- accel/accel.sh@41 -- # jq -r . 00:07:35.939 [2024-07-16 00:02:22.873829] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:07:35.939 [2024-07-16 00:02:22.873904] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3459505 ] 00:07:36.197 [2024-07-16 00:02:23.002728] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.197 [2024-07-16 00:02:23.111736] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.131 00:02:23 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:37.131 00:02:23 accel -- common/autotest_common.sh@862 -- # return 0 00:07:37.131 00:02:23 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:37.131 00:02:23 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:37.131 00:02:23 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:37.131 00:02:23 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:07:37.131 00:02:23 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:37.131 00:02:23 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:07:37.131 00:02:23 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:37.131 00:02:23 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:07:37.131 00:02:23 accel -- common/autotest_common.sh@10 -- # set +x 00:07:37.131 00:02:23 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:37.131 00:02:23 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # IFS== 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:37.131 00:02:23 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:37.131 00:02:23 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # IFS== 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:37.131 00:02:23 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:37.131 00:02:23 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # IFS== 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:37.131 00:02:23 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:37.131 00:02:23 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # IFS== 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:37.131 00:02:23 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:37.131 00:02:23 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # IFS== 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:37.131 00:02:23 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:07:37.131 00:02:23 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # IFS== 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:37.131 00:02:23 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:37.131 00:02:23 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # IFS== 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:37.131 00:02:23 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:37.131 00:02:23 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # IFS== 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:37.131 00:02:23 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:37.131 00:02:23 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # IFS== 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:37.131 00:02:23 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:37.131 00:02:23 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # IFS== 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:37.131 00:02:23 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:37.131 00:02:23 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # IFS== 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:37.131 00:02:23 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:37.131 00:02:23 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # 
IFS== 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:37.131 00:02:23 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:37.131 00:02:23 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # IFS== 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:37.131 00:02:23 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:37.131 00:02:23 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # IFS== 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:37.131 00:02:23 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:37.131 00:02:23 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # IFS== 00:07:37.131 00:02:23 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:37.131 00:02:23 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:37.131 00:02:23 accel -- accel/accel.sh@75 -- # killprocess 3459505 00:07:37.131 00:02:23 accel -- common/autotest_common.sh@948 -- # '[' -z 3459505 ']' 00:07:37.131 00:02:23 accel -- common/autotest_common.sh@952 -- # kill -0 3459505 00:07:37.131 00:02:23 accel -- common/autotest_common.sh@953 -- # uname 00:07:37.131 00:02:23 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:37.131 00:02:23 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3459505 00:07:37.131 00:02:23 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:37.131 00:02:23 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:37.131 00:02:23 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3459505' 00:07:37.131 killing process with pid 3459505 00:07:37.131 00:02:23 accel -- common/autotest_common.sh@967 -- # kill 3459505 00:07:37.131 
00:02:23 accel -- common/autotest_common.sh@972 -- # wait 3459505 00:07:37.389 00:02:24 accel -- accel/accel.sh@76 -- # trap - ERR 00:07:37.389 00:02:24 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:07:37.389 00:02:24 accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:07:37.389 00:02:24 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:37.389 00:02:24 accel -- common/autotest_common.sh@10 -- # set +x 00:07:37.389 00:02:24 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h 00:07:37.648 00:02:24 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:07:37.648 00:02:24 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:07:37.648 00:02:24 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:37.648 00:02:24 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:37.648 00:02:24 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:37.648 00:02:24 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:37.648 00:02:24 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:37.648 00:02:24 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:07:37.648 00:02:24 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
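The accel suite earlier in this trace builds its expected_opcs table by piping `rpc_cmd accel_get_opc_assignments` through the jq filter `. | to_entries | map("\(.key)=\(.value)") | .[]`. A hedged illustration of what that filter does, using a hand-written two-opcode JSON object rather than real RPC output:

```shell
#!/bin/bash
# Flatten a JSON object of opcode->module assignments into "opc=module"
# lines, as the accel.sh trace above does. The sample JSON is illustrative;
# a real accel_get_opc_assignments reply lists every supported opcode.
echo '{"copy":"software","fill":"software"}' \
  | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'
# copy=software
# fill=software
```

Each emitted line is then split on `=` by the `IFS== read -r opc module` loop visible in the trace, which is why that loop body repeats once per opcode.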
00:07:37.648 00:02:24 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:37.648 00:02:24 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:07:37.648 00:02:24 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:37.648 00:02:24 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:07:37.648 00:02:24 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:37.648 00:02:24 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:37.648 00:02:24 accel -- common/autotest_common.sh@10 -- # set +x 00:07:37.648 ************************************ 00:07:37.648 START TEST accel_missing_filename 00:07:37.648 ************************************ 00:07:37.648 00:02:24 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:07:37.648 00:02:24 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:07:37.648 00:02:24 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:07:37.648 00:02:24 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:37.648 00:02:24 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:37.648 00:02:24 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:37.648 00:02:24 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:37.648 00:02:24 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:07:37.648 00:02:24 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:07:37.648 00:02:24 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:07:37.648 00:02:24 
accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:37.648 00:02:24 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:37.648 00:02:24 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:37.648 00:02:24 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:37.648 00:02:24 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:37.648 00:02:24 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:07:37.648 00:02:24 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:07:37.648 [2024-07-16 00:02:24.493749] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:07:37.648 [2024-07-16 00:02:24.493816] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3459867 ] 00:07:37.907 [2024-07-16 00:02:24.626840] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.907 [2024-07-16 00:02:24.731596] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.907 [2024-07-16 00:02:24.800800] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:38.166 [2024-07-16 00:02:24.874919] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:07:38.166 A filename is required. 
00:07:38.166 00:02:24 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:07:38.166 00:02:24 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:38.166 00:02:24 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:07:38.166 00:02:24 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:07:38.166 00:02:24 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:07:38.166 00:02:24 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:38.166 00:07:38.166 real 0m0.514s 00:07:38.166 user 0m0.342s 00:07:38.166 sys 0m0.202s 00:07:38.166 00:02:24 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:38.166 00:02:24 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:07:38.166 ************************************ 00:07:38.166 END TEST accel_missing_filename 00:07:38.166 ************************************ 00:07:38.166 00:02:25 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:38.166 00:02:25 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:38.166 00:02:25 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:07:38.166 00:02:25 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:38.166 00:02:25 accel -- common/autotest_common.sh@10 -- # set +x 00:07:38.166 ************************************ 00:07:38.166 START TEST accel_compress_verify 00:07:38.166 ************************************ 00:07:38.166 00:02:25 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:38.167 00:02:25 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:07:38.167 00:02:25 
accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:38.167 00:02:25 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:38.167 00:02:25 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:38.167 00:02:25 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:38.167 00:02:25 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:38.167 00:02:25 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:38.167 00:02:25 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:38.167 00:02:25 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:38.167 00:02:25 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:38.167 00:02:25 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:38.167 00:02:25 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:38.167 00:02:25 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:38.167 00:02:25 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:38.167 00:02:25 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:07:38.167 00:02:25 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:07:38.167 [2024-07-16 00:02:25.089731] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:07:38.167 [2024-07-16 00:02:25.089796] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3459917 ] 00:07:38.425 [2024-07-16 00:02:25.221018] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:38.425 [2024-07-16 00:02:25.325211] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.682 [2024-07-16 00:02:25.391861] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:38.682 [2024-07-16 00:02:25.466184] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:07:38.682 00:07:38.682 Compression does not support the verify option, aborting. 00:07:38.682 00:02:25 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:07:38.682 00:02:25 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:38.682 00:02:25 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:07:38.682 00:02:25 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:07:38.682 00:02:25 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:07:38.682 00:02:25 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:38.682 00:07:38.682 real 0m0.508s 00:07:38.682 user 0m0.337s 00:07:38.682 sys 0m0.200s 00:07:38.682 00:02:25 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:38.682 00:02:25 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:07:38.682 ************************************ 00:07:38.682 END TEST accel_compress_verify 00:07:38.682 ************************************ 00:07:38.682 00:02:25 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:38.682 00:02:25 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT 
accel_perf -t 1 -w foobar 00:07:38.682 00:02:25 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:38.682 00:02:25 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:38.682 00:02:25 accel -- common/autotest_common.sh@10 -- # set +x 00:07:38.941 ************************************ 00:07:38.941 START TEST accel_wrong_workload 00:07:38.941 ************************************ 00:07:38.941 00:02:25 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:07:38.941 00:02:25 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:07:38.941 00:02:25 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:07:38.941 00:02:25 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:38.941 00:02:25 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:38.941 00:02:25 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:38.941 00:02:25 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:38.941 00:02:25 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:07:38.941 00:02:25 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:07:38.941 00:02:25 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:07:38.941 00:02:25 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:38.941 00:02:25 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:38.941 00:02:25 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:38.941 00:02:25 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:38.941 00:02:25 accel.accel_wrong_workload -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:07:38.941 00:02:25 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:07:38.941 00:02:25 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:07:38.941 Unsupported workload type: foobar 00:07:38.941 [2024-07-16 00:02:25.673238] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:07:38.941 accel_perf options: 00:07:38.941 [-h help message] 00:07:38.941 [-q queue depth per core] 00:07:38.941 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:38.941 [-T number of threads per core 00:07:38.941 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:38.941 [-t time in seconds] 00:07:38.941 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:38.941 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:07:38.941 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:38.941 [-l for compress/decompress workloads, name of uncompressed input file 00:07:38.941 [-S for crc32c workload, use this seed value (default 0) 00:07:38.941 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:38.941 [-f for fill workload, use this BYTE value (default 255) 00:07:38.941 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:38.941 [-y verify result if this switch is on] 00:07:38.941 [-a tasks to allocate per core (default: same value as -q)] 00:07:38.941 Can be used to spread operations across a wider range of memory. 
00:07:38.941 00:02:25 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:07:38.941 00:02:25 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:38.941 00:02:25 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:38.941 00:02:25 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:38.941 00:07:38.941 real 0m0.043s 00:07:38.941 user 0m0.022s 00:07:38.941 sys 0m0.021s 00:07:38.941 00:02:25 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:38.941 00:02:25 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:07:38.941 ************************************ 00:07:38.941 END TEST accel_wrong_workload 00:07:38.941 ************************************ 00:07:38.941 Error: writing output failed: Broken pipe 00:07:38.941 00:02:25 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:38.941 00:02:25 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:07:38.941 00:02:25 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:07:38.941 00:02:25 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:38.941 00:02:25 accel -- common/autotest_common.sh@10 -- # set +x 00:07:38.941 ************************************ 00:07:38.941 START TEST accel_negative_buffers 00:07:38.941 ************************************ 00:07:38.941 00:02:25 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:07:38.941 00:02:25 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:07:38.941 00:02:25 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:07:38.941 00:02:25 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:38.941 00:02:25 accel.accel_negative_buffers -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:38.942 00:02:25 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:38.942 00:02:25 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:38.942 00:02:25 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:07:38.942 00:02:25 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:07:38.942 00:02:25 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:07:38.942 00:02:25 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:38.942 00:02:25 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:38.942 00:02:25 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:38.942 00:02:25 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:38.942 00:02:25 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:38.942 00:02:25 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:07:38.942 00:02:25 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:07:38.942 -x option must be non-negative. 00:07:38.942 [2024-07-16 00:02:25.797328] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:07:38.942 accel_perf options: 00:07:38.942 [-h help message] 00:07:38.942 [-q queue depth per core] 00:07:38.942 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:38.942 [-T number of threads per core 00:07:38.942 [-o transfer size in bytes (default: 4KiB. 
For compress/decompress, 0 means the input file size)] 00:07:38.942 [-t time in seconds] 00:07:38.942 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:38.942 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:07:38.942 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:38.942 [-l for compress/decompress workloads, name of uncompressed input file 00:07:38.942 [-S for crc32c workload, use this seed value (default 0) 00:07:38.942 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:38.942 [-f for fill workload, use this BYTE value (default 255) 00:07:38.942 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:38.942 [-y verify result if this switch is on] 00:07:38.942 [-a tasks to allocate per core (default: same value as -q)] 00:07:38.942 Can be used to spread operations across a wider range of memory. 
00:07:38.942 00:02:25 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:07:38.942 00:02:25 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:38.942 00:02:25 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:38.942 00:02:25 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:38.942 00:07:38.942 real 0m0.044s 00:07:38.942 user 0m0.055s 00:07:38.942 sys 0m0.024s 00:07:38.942 00:02:25 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:38.942 00:02:25 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:07:38.942 ************************************ 00:07:38.942 END TEST accel_negative_buffers 00:07:38.942 ************************************ 00:07:38.942 00:02:25 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:38.942 00:02:25 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:07:38.942 00:02:25 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:38.942 00:02:25 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:38.942 00:02:25 accel -- common/autotest_common.sh@10 -- # set +x 00:07:38.942 ************************************ 00:07:38.942 START TEST accel_crc32c 00:07:38.942 ************************************ 00:07:38.942 00:02:25 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y 00:07:38.942 00:02:25 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:38.942 00:02:25 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:38.942 00:02:25 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:38.942 00:02:25 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:38.942 00:02:25 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:07:39.201 00:02:25 accel.accel_crc32c -- 
accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:07:39.201 00:02:25 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:39.201 00:02:25 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:39.201 00:02:25 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:39.201 00:02:25 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:39.201 00:02:25 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:39.201 00:02:25 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:39.201 00:02:25 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:39.201 00:02:25 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:39.201 [2024-07-16 00:02:25.923786] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:07:39.201 [2024-07-16 00:02:25.923850] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3460139 ] 00:07:39.201 [2024-07-16 00:02:26.056503] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.460 [2024-07-16 00:02:26.165538] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:39.460 00:02:26 accel.accel_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r 
var val 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:39.460 
00:02:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:39.460 00:02:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case 
"$var" in 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:40.835 00:02:27 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:40.835 00:07:40.835 real 0m1.524s 00:07:40.835 user 0m1.331s 00:07:40.835 sys 0m0.198s 00:07:40.835 00:02:27 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:40.835 00:02:27 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:40.835 ************************************ 00:07:40.835 END TEST accel_crc32c 00:07:40.835 ************************************ 00:07:40.835 00:02:27 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:40.835 00:02:27 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:07:40.835 00:02:27 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:40.835 00:02:27 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:40.835 00:02:27 accel -- common/autotest_common.sh@10 -- # set +x 00:07:40.835 ************************************ 00:07:40.835 START TEST accel_crc32c_C2 00:07:40.835 
************************************ 00:07:40.835 00:02:27 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2 00:07:40.835 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:40.835 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:40.835 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:40.836 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:07:40.836 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:40.836 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:07:40.836 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:40.836 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:40.836 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:40.836 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:40.836 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:40.836 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:40.836 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:40.836 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:40.836 [2024-07-16 00:02:27.529762] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:07:40.836 [2024-07-16 00:02:27.529838] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3460341 ] 00:07:40.836 [2024-07-16 00:02:27.672824] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.836 [2024-07-16 00:02:27.777182] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.095 00:02:27 
accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:41.095 00:02:27 accel.accel_crc32c_C2 
-- accel/accel.sh@20 -- # val=32 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:41.095 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:41.096 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:41.096 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.096 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:41.096 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:41.096 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:41.096 
00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.096 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:41.096 00:02:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case 
"$var" in 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:42.473 00:07:42.473 real 0m1.518s 00:07:42.473 user 0m1.332s 00:07:42.473 sys 0m0.189s 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:42.473 00:02:29 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:42.473 ************************************ 00:07:42.473 END TEST accel_crc32c_C2 00:07:42.473 ************************************ 00:07:42.473 00:02:29 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:42.473 00:02:29 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:07:42.473 00:02:29 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:42.473 00:02:29 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:42.473 00:02:29 accel -- common/autotest_common.sh@10 -- # set +x 00:07:42.473 ************************************ 00:07:42.473 START TEST accel_copy 00:07:42.473 ************************************ 00:07:42.473 00:02:29 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y 00:07:42.473 00:02:29 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:07:42.473 00:02:29 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:07:42.473 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.473 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.473 00:02:29 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:07:42.473 00:02:29 accel.accel_copy -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:07:42.473 00:02:29 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:07:42.473 00:02:29 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:42.473 00:02:29 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:42.473 00:02:29 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:42.473 00:02:29 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:42.473 00:02:29 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:42.473 00:02:29 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:07:42.473 00:02:29 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:07:42.473 [2024-07-16 00:02:29.149600] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:07:42.473 [2024-07-16 00:02:29.149727] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3460539 ] 00:07:42.474 [2024-07-16 00:02:29.345316] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.733 [2024-07-16 00:02:29.453561] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.733 00:02:29 
accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:07:42.733 00:02:29 
accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.733 00:02:29 
accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.733 00:02:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.111 00:02:30 accel.accel_copy -- 
accel/accel.sh@20 -- # val= 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:07:44.111 00:02:30 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:44.111 00:07:44.111 real 0m1.594s 00:07:44.111 user 0m1.347s 00:07:44.111 sys 0m0.251s 00:07:44.111 00:02:30 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:44.111 00:02:30 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:07:44.111 ************************************ 00:07:44.111 END TEST accel_copy 00:07:44.111 ************************************ 00:07:44.111 00:02:30 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:44.111 00:02:30 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:44.111 00:02:30 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:44.111 00:02:30 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:44.111 00:02:30 accel -- common/autotest_common.sh@10 -- # set +x 00:07:44.111 ************************************ 00:07:44.111 START TEST accel_fill 00:07:44.111 ************************************ 00:07:44.111 00:02:30 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:44.111 00:02:30 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:07:44.111 00:02:30 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:07:44.111 00:02:30 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:44.111 00:02:30 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:44.111 00:02:30 accel.accel_fill -- accel/accel.sh@15 -- # 
accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:44.111 00:02:30 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:44.111 00:02:30 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:07:44.111 00:02:30 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:44.111 00:02:30 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:44.111 00:02:30 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:44.111 00:02:30 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:44.111 00:02:30 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:44.111 00:02:30 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:07:44.111 00:02:30 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:07:44.111 [2024-07-16 00:02:30.812941] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:07:44.111 [2024-07-16 00:02:30.813005] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3460736 ] 00:07:44.111 [2024-07-16 00:02:30.940453] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.111 [2024-07-16 00:02:31.041349] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:44.407 00:02:31 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:44.407 00:02:31 accel.accel_fill -- 
accel/accel.sh@19 -- # read -r var val 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:44.407 00:02:31 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:07:44.408 00:02:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:44.408 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:44.408 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:44.408 00:02:31 
accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:07:44.408 00:02:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:44.408 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:44.408 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:44.408 00:02:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:44.408 00:02:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:44.408 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:44.408 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:44.408 00:02:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:44.408 00:02:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:44.408 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:44.408 00:02:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:45.349 00:02:32 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:07:45.349 00:02:32 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:45.349 00:07:45.349 real 0m1.500s 00:07:45.349 user 0m1.311s 00:07:45.349 sys 0m0.191s 00:07:45.349 00:02:32 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:45.349 00:02:32 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:07:45.349 ************************************ 00:07:45.349 END TEST accel_fill 00:07:45.349 ************************************ 00:07:45.608 00:02:32 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:45.608 00:02:32 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:07:45.608 00:02:32 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:45.608 00:02:32 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:45.608 00:02:32 accel -- common/autotest_common.sh@10 -- # set +x 00:07:45.608 ************************************ 00:07:45.608 START TEST accel_copy_crc32c 00:07:45.608 ************************************ 00:07:45.608 00:02:32 accel.accel_copy_crc32c -- 
common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 00:07:45.608 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:45.608 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:45.608 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.608 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.608 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:45.608 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:45.608 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:45.608 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:45.608 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:45.608 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:45.608 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:45.608 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:45.608 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:45.608 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:45.608 [2024-07-16 00:02:32.389061] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:07:45.608 [2024-07-16 00:02:32.389118] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3460964 ] 00:07:45.608 [2024-07-16 00:02:32.504286] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.868 [2024-07-16 00:02:32.607710] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 
00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- 
accel/accel.sh@20 -- # val=software 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.868 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:07:45.869 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.869 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.869 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.869 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:45.869 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.869 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.869 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.869 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:45.869 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.869 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.869 00:02:32 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.869 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:45.869 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.869 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.869 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:45.869 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:45.869 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:45.869 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:45.869 00:02:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:47.259 00:02:33 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:47.259 00:07:47.259 real 0m1.481s 00:07:47.259 user 0m1.306s 00:07:47.259 sys 0m0.179s 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:47.259 00:02:33 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:47.259 ************************************ 00:07:47.260 END TEST accel_copy_crc32c 00:07:47.260 ************************************ 00:07:47.260 00:02:33 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:47.260 00:02:33 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:07:47.260 00:02:33 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:47.260 00:02:33 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:47.260 00:02:33 accel -- common/autotest_common.sh@10 -- # set +x 00:07:47.260 ************************************ 00:07:47.260 START TEST accel_copy_crc32c_C2 00:07:47.260 
************************************ 00:07:47.260 00:02:33 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:07:47.260 00:02:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:47.260 00:02:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:47.260 00:02:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.260 00:02:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.260 00:02:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:47.260 00:02:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:07:47.260 00:02:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:47.260 00:02:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:47.260 00:02:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:47.260 00:02:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:47.260 00:02:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:47.260 00:02:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:47.260 00:02:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:47.260 00:02:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:47.260 [2024-07-16 00:02:33.959795] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:07:47.260 [2024-07-16 00:02:33.959864] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3461266 ] 00:07:47.260 [2024-07-16 00:02:34.087164] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.260 [2024-07-16 00:02:34.187952] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:47.519 00:02:34 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.519 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 
00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # 
val=Yes 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.520 00:02:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.898 
00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:48.898 00:07:48.898 real 0m1.502s 00:07:48.898 user 0m1.310s 00:07:48.898 sys 0m0.198s 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:48.898 00:02:35 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:48.898 ************************************ 00:07:48.898 END TEST accel_copy_crc32c_C2 00:07:48.898 ************************************ 00:07:48.898 00:02:35 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:48.898 00:02:35 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:07:48.898 00:02:35 accel -- common/autotest_common.sh@1099 -- # 
'[' 7 -le 1 ']' 00:07:48.898 00:02:35 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:48.898 00:02:35 accel -- common/autotest_common.sh@10 -- # set +x 00:07:48.898 ************************************ 00:07:48.898 START TEST accel_dualcast 00:07:48.898 ************************************ 00:07:48.898 00:02:35 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:07:48.898 [2024-07-16 00:02:35.541001] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:07:48.898 [2024-07-16 00:02:35.541066] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3461492 ] 00:07:48.898 [2024-07-16 00:02:35.668990] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.898 [2024-07-16 00:02:35.769036] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:48.898 00:02:35 accel.accel_dualcast -- 
accel/accel.sh@19 -- # IFS=: 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:07:48.898 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:48.899 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:48.899 00:02:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:48.899 00:02:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:48.899 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:48.899 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:48.899 00:02:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 
00:07:48.899 00:02:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:48.899 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:49.158 00:02:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@21 -- # case 
"$var" in 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:07:50.096 00:02:37 accel.accel_dualcast -- accel/accel.sh@27 
-- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:50.096 00:07:50.096 real 0m1.505s 00:07:50.096 user 0m1.308s 00:07:50.096 sys 0m0.202s 00:07:50.096 00:02:37 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:50.096 00:02:37 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:07:50.096 ************************************ 00:07:50.096 END TEST accel_dualcast 00:07:50.096 ************************************ 00:07:50.356 00:02:37 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:50.356 00:02:37 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:50.356 00:02:37 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:50.356 00:02:37 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:50.356 00:02:37 accel -- common/autotest_common.sh@10 -- # set +x 00:07:50.356 ************************************ 00:07:50.356 START TEST accel_compare 00:07:50.356 ************************************ 00:07:50.356 00:02:37 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y 00:07:50.356 00:02:37 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:07:50.356 00:02:37 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:07:50.356 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.356 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.356 00:02:37 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:07:50.356 00:02:37 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:50.356 00:02:37 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:07:50.356 00:02:37 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:50.356 00:02:37 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:50.356 
00:02:37 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:50.356 00:02:37 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:50.356 00:02:37 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:50.356 00:02:37 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:07:50.356 00:02:37 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:07:50.356 [2024-07-16 00:02:37.129231] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:07:50.356 [2024-07-16 00:02:37.129297] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3461683 ] 00:07:50.356 [2024-07-16 00:02:37.263311] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.616 [2024-07-16 00:02:37.369176] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.616 00:02:37 
accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.616 
00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:50.616 00:02:37 
accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:50.616 00:02:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@21 
-- # case "$var" in 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:07:51.992 00:02:38 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:51.992 00:07:51.992 real 0m1.522s 00:07:51.992 user 0m1.323s 00:07:51.992 sys 0m0.197s 00:07:51.992 00:02:38 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:51.992 00:02:38 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:07:51.992 ************************************ 00:07:51.992 END TEST accel_compare 00:07:51.992 ************************************ 00:07:51.992 00:02:38 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:51.992 00:02:38 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:51.992 00:02:38 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:51.992 00:02:38 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:51.992 00:02:38 accel -- common/autotest_common.sh@10 -- # set +x 00:07:51.992 ************************************ 00:07:51.992 START TEST accel_xor 00:07:51.992 ************************************ 00:07:51.992 00:02:38 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y 00:07:51.992 00:02:38 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:51.992 00:02:38 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:51.992 00:02:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:51.992 00:02:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:51.992 00:02:38 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:51.992 00:02:38 accel.accel_xor -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:51.992 00:02:38 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:51.992 00:02:38 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:51.992 00:02:38 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:51.992 00:02:38 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:51.992 00:02:38 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:51.992 00:02:38 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:51.992 00:02:38 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:51.992 00:02:38 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:51.992 [2024-07-16 00:02:38.737251] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:07:51.992 [2024-07-16 00:02:38.737318] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3461889 ] 00:07:51.992 [2024-07-16 00:02:38.866197] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.251 [2024-07-16 00:02:38.972433] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:52.251 00:02:39 accel.accel_xor -- 
accel/accel.sh@20 -- # val=0x1 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:52.251 00:02:39 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:52.252 00:02:39 
accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:52.252 00:02:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:53.629 00:07:53.629 real 0m1.506s 00:07:53.629 user 0m1.322s 00:07:53.629 sys 0m0.188s 00:07:53.629 00:02:40 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:53.629 00:02:40 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:53.629 ************************************ 00:07:53.629 END TEST accel_xor 00:07:53.629 ************************************ 00:07:53.629 00:02:40 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:53.629 00:02:40 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:53.629 00:02:40 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:53.629 00:02:40 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:53.629 00:02:40 accel -- common/autotest_common.sh@10 -- # set +x 00:07:53.629 ************************************ 00:07:53.629 START TEST accel_xor 00:07:53.629 ************************************ 00:07:53.629 00:02:40 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 
00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:53.629 00:02:40 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:53.629 [2024-07-16 00:02:40.334703] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:07:53.629 [2024-07-16 00:02:40.334832] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3462086 ] 00:07:53.629 [2024-07-16 00:02:40.533922] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.889 [2024-07-16 00:02:40.643396] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var 
val 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.889 
00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:53.889 00:02:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@19 
-- # IFS=: 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:55.266 00:02:41 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:55.266 00:07:55.266 real 0m1.599s 00:07:55.266 user 0m1.325s 00:07:55.266 sys 0m0.277s 00:07:55.266 00:02:41 accel.accel_xor -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:07:55.266 00:02:41 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:55.266 ************************************ 00:07:55.266 END TEST accel_xor 00:07:55.266 ************************************ 00:07:55.266 00:02:41 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:55.266 00:02:41 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:55.266 00:02:41 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:55.266 00:02:41 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:55.266 00:02:41 accel -- common/autotest_common.sh@10 -- # set +x 00:07:55.266 ************************************ 00:07:55.266 START TEST accel_dif_verify 00:07:55.266 ************************************ 00:07:55.266 00:02:41 accel.accel_dif_verify -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_verify 00:07:55.266 00:02:41 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:07:55.266 00:02:41 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:07:55.266 00:02:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.266 00:02:41 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:55.266 00:02:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:55.266 00:02:41 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:55.266 00:02:41 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:55.266 00:02:41 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:55.266 00:02:41 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:55.266 00:02:41 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:55.266 00:02:41 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 
00:07:55.266 00:02:41 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:55.266 00:02:41 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:07:55.266 00:02:41 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:07:55.267 [2024-07-16 00:02:41.998349] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:07:55.267 [2024-07-16 00:02:41.998428] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3462363 ] 00:07:55.267 [2024-07-16 00:02:42.142814] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.525 [2024-07-16 00:02:42.249058] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@21 
-- # case "$var" in 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:55.525 00:02:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:55.526 00:02:42 accel.accel_dif_verify -- 
accel/accel.sh@20 -- # val='8 bytes' 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.526 00:02:42 accel.accel_dif_verify -- 
accel/accel.sh@19 -- # read -r var val 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:55.526 00:02:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r 
var val 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:07:56.900 00:02:43 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:56.900 00:07:56.900 real 0m1.529s 00:07:56.900 user 0m1.321s 00:07:56.900 sys 0m0.208s 00:07:56.900 00:02:43 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:56.900 00:02:43 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:07:56.900 ************************************ 00:07:56.900 END TEST accel_dif_verify 00:07:56.900 
************************************ 00:07:56.900 00:02:43 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:56.900 00:02:43 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:56.900 00:02:43 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:56.900 00:02:43 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:56.901 00:02:43 accel -- common/autotest_common.sh@10 -- # set +x 00:07:56.901 ************************************ 00:07:56.901 START TEST accel_dif_generate 00:07:56.901 ************************************ 00:07:56.901 00:02:43 accel.accel_dif_generate -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate 00:07:56.901 00:02:43 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:07:56.901 00:02:43 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:07:56.901 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:56.901 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:56.901 00:02:43 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:56.901 00:02:43 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:56.901 00:02:43 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:07:56.901 00:02:43 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:56.901 00:02:43 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:56.901 00:02:43 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:56.901 00:02:43 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:56.901 00:02:43 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:56.901 00:02:43 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:07:56.901 00:02:43 
accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:07:56.901 [2024-07-16 00:02:43.619122] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:07:56.901 [2024-07-16 00:02:43.619187] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3462637 ] 00:07:56.901 [2024-07-16 00:02:43.750025] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.901 [2024-07-16 00:02:43.850454] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.158 00:02:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:57.158 00:02:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:57.158 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:57.158 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:57.158 00:02:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:57.158 00:02:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:57.158 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:57.158 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:57.158 00:02:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:07:57.158 00:02:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:57.158 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:57.158 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:57.158 00:02:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:57.159 00:02:43 accel.accel_dif_generate -- 
accel/accel.sh@19 -- # read -r var val 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:07:57.159 00:02:43 accel.accel_dif_generate -- 
accel/accel.sh@21 -- # case "$var" in 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:57.159 
00:02:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:57.159 00:02:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:58.534 00:02:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:58.534 00:02:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:58.534 00:02:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:58.534 00:02:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:58.534 00:02:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:58.534 00:02:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:58.534 00:02:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:58.534 00:02:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var 
val 00:07:58.534 00:02:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:58.534 00:02:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:58.534 00:02:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:58.534 00:02:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:58.534 00:02:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:58.534 00:02:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:58.534 00:02:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:58.534 00:02:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:58.534 00:02:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:58.534 00:02:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:58.534 00:02:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:58.535 00:02:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:58.535 00:02:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:58.535 00:02:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:58.535 00:02:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:58.535 00:02:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:58.535 00:02:45 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:58.535 00:02:45 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:07:58.535 00:02:45 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:58.535 00:07:58.535 real 0m1.511s 00:07:58.535 user 0m1.319s 00:07:58.535 sys 0m0.199s 00:07:58.535 00:02:45 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:58.535 00:02:45 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:07:58.535 ************************************ 00:07:58.535 END TEST 
accel_dif_generate 00:07:58.535 ************************************ 00:07:58.535 00:02:45 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:58.535 00:02:45 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:58.535 00:02:45 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:58.535 00:02:45 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:58.535 00:02:45 accel -- common/autotest_common.sh@10 -- # set +x 00:07:58.535 ************************************ 00:07:58.535 START TEST accel_dif_generate_copy 00:07:58.535 ************************************ 00:07:58.535 00:02:45 accel.accel_dif_generate_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate_copy 00:07:58.535 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:07:58.535 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:07:58.535 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:58.535 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:58.535 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:58.535 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:58.535 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:07:58.535 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:58.535 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:58.535 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:58.535 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:58.535 00:02:45 accel.accel_dif_generate_copy -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:07:58.535 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:07:58.535 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:07:58.535 [2024-07-16 00:02:45.226136] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:07:58.535 [2024-07-16 00:02:45.226263] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3462833 ] 00:07:58.535 [2024-07-16 00:02:45.423269] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.795 [2024-07-16 00:02:45.525557] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 
00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case 
"$var" in 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:58.795 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 
00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:58.796 00:02:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:00.176 00:02:46 
accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:00.176 00:08:00.176 real 0m1.577s 00:08:00.176 user 0m1.343s 00:08:00.176 sys 0m0.238s 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:00.176 00:02:46 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:08:00.176 ************************************ 00:08:00.176 END TEST 
accel_dif_generate_copy 00:08:00.176 ************************************ 00:08:00.176 00:02:46 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:00.176 00:02:46 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:08:00.176 00:02:46 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:00.176 00:02:46 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:00.176 00:02:46 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:00.176 00:02:46 accel -- common/autotest_common.sh@10 -- # set +x 00:08:00.176 ************************************ 00:08:00.176 START TEST accel_comp 00:08:00.176 ************************************ 00:08:00.176 00:02:46 accel.accel_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:00.176 00:02:46 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:00.176 00:02:46 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:08:00.176 00:02:46 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.176 00:02:46 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:00.176 00:02:46 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:00.176 00:02:46 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:00.176 00:02:46 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:00.176 00:02:46 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:00.176 00:02:46 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:00.176 00:02:46 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:00.176 00:02:46 
accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:00.176 00:02:46 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:00.176 00:02:46 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:00.176 00:02:46 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:08:00.176 [2024-07-16 00:02:46.873521] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:08:00.176 [2024-07-16 00:02:46.873585] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3463034 ] 00:08:00.176 [2024-07-16 00:02:47.000241] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.176 [2024-07-16 00:02:47.099433] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.443 00:02:47 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.444 00:02:47 
accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 
00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.444 00:02:47 
accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:00.444 00:02:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:01.874 00:02:48 accel.accel_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:01.874 00:02:48 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:01.874 00:08:01.874 real 0m1.508s 00:08:01.874 user 0m1.305s 00:08:01.874 sys 0m0.200s 00:08:01.874 00:02:48 accel.accel_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:01.874 00:02:48 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:08:01.874 ************************************ 00:08:01.874 END TEST accel_comp 00:08:01.874 ************************************ 00:08:01.874 00:02:48 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:01.874 00:02:48 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:01.874 00:02:48 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:01.874 00:02:48 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:01.874 00:02:48 accel -- common/autotest_common.sh@10 -- # set +x 00:08:01.874 ************************************ 00:08:01.874 START TEST accel_decomp 00:08:01.874 ************************************ 00:08:01.874 00:02:48 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:01.874 00:02:48 
accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:01.874 [2024-07-16 00:02:48.461755] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:08:01.874 [2024-07-16 00:02:48.461818] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3463237 ] 00:08:01.874 [2024-07-16 00:02:48.589712] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:01.874 [2024-07-16 00:02:48.690523] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.874 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.875 
00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:01.875 00:02:48 
accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.875 00:02:48 accel.accel_decomp -- 
accel/accel.sh@19 -- # IFS=: 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:01.875 00:02:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.264 00:02:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.265 00:02:49 
accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:03.265 00:02:49 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:03.265 00:08:03.265 real 0m1.514s 00:08:03.265 user 0m1.322s 00:08:03.265 sys 0m0.193s 00:08:03.265 00:02:49 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:03.265 00:02:49 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:03.265 ************************************ 00:08:03.265 END TEST accel_decomp 00:08:03.265 ************************************ 00:08:03.265 00:02:49 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:03.265 00:02:49 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:03.265 00:02:49 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:03.265 00:02:49 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:03.265 00:02:49 accel -- common/autotest_common.sh@10 -- # set +x 00:08:03.265 ************************************ 00:08:03.265 START TEST accel_decomp_full 00:08:03.265 ************************************ 00:08:03.265 00:02:50 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:03.265 00:02:50 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:03.265 00:02:50 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:03.265 
00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.265 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.265 00:02:50 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:03.265 00:02:50 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:03.265 00:02:50 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:03.265 00:02:50 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:03.265 00:02:50 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:03.265 00:02:50 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:03.265 00:02:50 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:03.265 00:02:50 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:03.265 00:02:50 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:03.265 00:02:50 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:03.265 [2024-07-16 00:02:50.056670] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:08:03.265 [2024-07-16 00:02:50.056736] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3463454 ] 00:08:03.265 [2024-07-16 00:02:50.185965] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.525 [2024-07-16 00:02:50.288848] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 
00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.525 00:02:50 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # read -r var val 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.525 00:02:50 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # IFS=: 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.525 00:02:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@19 
-- # IFS=: 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:04.904 00:02:51 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:04.904 00:08:04.904 real 0m1.526s 00:08:04.904 user 0m1.344s 00:08:04.904 sys 0m0.179s 00:08:04.904 00:02:51 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:04.904 00:02:51 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:04.904 ************************************ 00:08:04.904 END TEST accel_decomp_full 00:08:04.904 ************************************ 00:08:04.904 00:02:51 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:04.904 00:02:51 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:04.905 00:02:51 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:04.905 00:02:51 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:04.905 00:02:51 accel -- common/autotest_common.sh@10 -- # set +x 00:08:04.905 
************************************ 00:08:04.905 START TEST accel_decomp_mcore 00:08:04.905 ************************************ 00:08:04.905 00:02:51 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:04.905 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:04.905 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:04.905 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.905 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:04.905 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:04.905 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:04.905 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:04.905 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:04.905 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:04.905 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:04.905 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:04.905 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:04.905 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:04.905 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:04.905 [2024-07-16 00:02:51.667744] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:08:04.905 [2024-07-16 00:02:51.667809] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3463787 ] 00:08:04.905 [2024-07-16 00:02:51.798165] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:05.163 [2024-07-16 00:02:51.903379] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:05.164 [2024-07-16 00:02:51.903479] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:05.164 [2024-07-16 00:02:51.903578] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:05.164 [2024-07-16 00:02:51.903580] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.164 00:02:51 
accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.164 00:02:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # 
case "$var" in 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:06.545 00:08:06.545 real 0m1.538s 00:08:06.545 user 0m4.808s 00:08:06.545 sys 0m0.198s 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:06.545 00:02:53 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:06.545 ************************************ 00:08:06.545 END TEST accel_decomp_mcore 00:08:06.545 ************************************ 00:08:06.545 00:02:53 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:06.545 00:02:53 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:06.545 00:02:53 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:06.545 00:02:53 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:06.545 00:02:53 accel -- common/autotest_common.sh@10 -- # set +x 00:08:06.545 ************************************ 00:08:06.545 START TEST accel_decomp_full_mcore 00:08:06.545 ************************************ 00:08:06.545 00:02:53 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:06.545 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:06.545 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:06.545 00:02:53 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.545 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.545 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:06.545 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:06.545 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:06.545 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:06.545 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:06.545 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:06.545 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:06.545 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:06.545 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:06.545 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:06.545 [2024-07-16 00:02:53.284179] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:08:06.545 [2024-07-16 00:02:53.284241] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3463987 ] 00:08:06.545 [2024-07-16 00:02:53.401971] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:06.805 [2024-07-16 00:02:53.503882] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:06.805 [2024-07-16 00:02:53.503983] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:06.805 [2024-07-16 00:02:53.504026] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:06.805 [2024-07-16 00:02:53.504027] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.805 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:06.805 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.805 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.805 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.805 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:06.805 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.805 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.805 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:06.806 00:02:53 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.806 00:02:53 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- 
# IFS=: 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.806 00:02:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.184 00:02:54 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 
-- # read -r var val 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:08.184 00:08:08.184 real 0m1.528s 00:08:08.184 user 0m4.860s 00:08:08.184 sys 0m0.187s 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:08.184 00:02:54 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:08.184 ************************************ 00:08:08.184 END TEST accel_decomp_full_mcore 00:08:08.184 ************************************ 00:08:08.184 00:02:54 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:08.184 00:02:54 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:08.184 00:02:54 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:08.184 00:02:54 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:08.184 00:02:54 accel -- common/autotest_common.sh@10 -- # set +x 00:08:08.184 
************************************ 00:08:08.184 START TEST accel_decomp_mthread 00:08:08.184 ************************************ 00:08:08.184 00:02:54 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:08.184 00:02:54 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:08.184 00:02:54 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:08.184 00:02:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.184 00:02:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.184 00:02:54 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:08.184 00:02:54 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:08.184 00:02:54 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:08.184 00:02:54 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:08.184 00:02:54 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:08.184 00:02:54 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:08.184 00:02:54 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:08.184 00:02:54 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:08.184 00:02:54 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:08.184 00:02:54 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:08.184 [2024-07-16 00:02:54.895814] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:08:08.184 [2024-07-16 00:02:54.895877] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3464193 ] 00:08:08.184 [2024-07-16 00:02:55.027721] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.184 [2024-07-16 00:02:55.131600] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:08.444 00:02:55 
accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # 
accel_module=software 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.444 
00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:08.444 00:02:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.824 00:02:56 accel.accel_decomp_mthread 
-- accel/accel.sh@19 -- # read -r var val 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:09.824 00:08:09.824 real 0m1.526s 00:08:09.824 user 0m1.335s 00:08:09.824 sys 0m0.197s 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:09.824 00:02:56 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 
00:08:09.824 ************************************ 00:08:09.824 END TEST accel_decomp_mthread 00:08:09.824 ************************************ 00:08:09.824 00:02:56 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:09.824 00:02:56 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:09.824 00:02:56 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:09.824 00:02:56 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:09.824 00:02:56 accel -- common/autotest_common.sh@10 -- # set +x 00:08:09.824 ************************************ 00:08:09.824 START TEST accel_decomp_full_mthread 00:08:09.824 ************************************ 00:08:09.824 00:02:56 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:09.824 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:09.824 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:09.824 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.824 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:09.824 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.824 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:09.824 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:09.824 00:02:56 accel.accel_decomp_full_mthread -- 
accel/accel.sh@31 -- # accel_json_cfg=() 00:08:09.824 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:09.824 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:09.824 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:09.824 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:09.824 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:09.824 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:09.824 [2024-07-16 00:02:56.507448] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:08:09.824 [2024-07-16 00:02:56.507514] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3464385 ] 00:08:09.824 [2024-07-16 00:02:56.637210] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:09.824 [2024-07-16 00:02:56.741234] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.084 00:02:56 
accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 
bytes' 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.084 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 
00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # 
IFS=: 00:08:10.085 00:02:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 
00:08:11.463 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.464 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.464 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.464 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.464 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.464 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.464 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.464 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:11.464 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:11.464 00:02:58 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:11.464 00:08:11.464 real 0m1.558s 00:08:11.464 user 0m1.380s 00:08:11.464 sys 0m0.182s 00:08:11.464 00:02:58 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:11.464 00:02:58 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:11.464 ************************************ 00:08:11.464 END TEST accel_decomp_full_mthread 00:08:11.464 ************************************ 00:08:11.464 00:02:58 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:11.464 00:02:58 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:08:11.464 00:02:58 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:08:11.464 00:02:58 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:08:11.464 00:02:58 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:11.464 00:02:58 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=3464584 00:08:11.464 00:02:58 accel -- accel/accel.sh@63 -- # waitforlisten 3464584 00:08:11.464 00:02:58 accel -- common/autotest_common.sh@829 -- 
# '[' -z 3464584 ']' 00:08:11.464 00:02:58 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:11.464 00:02:58 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:11.464 00:02:58 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:08:11.464 00:02:58 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:11.464 00:02:58 accel -- accel/accel.sh@61 -- # build_accel_config 00:08:11.464 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:11.464 00:02:58 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:11.464 00:02:58 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:11.464 00:02:58 accel -- common/autotest_common.sh@10 -- # set +x 00:08:11.464 00:02:58 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:11.464 00:02:58 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:11.464 00:02:58 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:11.464 00:02:58 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:11.464 00:02:58 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:11.464 00:02:58 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:11.464 00:02:58 accel -- accel/accel.sh@41 -- # jq -r . 00:08:11.464 [2024-07-16 00:02:58.146311] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:08:11.464 [2024-07-16 00:02:58.146385] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3464584 ] 00:08:11.464 [2024-07-16 00:02:58.269405] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.464 [2024-07-16 00:02:58.366566] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:12.402 [2024-07-16 00:02:59.139660] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:12.402 00:02:59 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:12.402 00:02:59 accel -- common/autotest_common.sh@862 -- # return 0 00:08:12.402 00:02:59 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:08:12.402 00:02:59 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:08:12.402 00:02:59 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:08:12.402 00:02:59 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:08:12.402 00:02:59 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:08:12.402 00:02:59 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:08:12.402 00:02:59 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:12.402 00:02:59 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:08:12.402 00:02:59 accel -- common/autotest_common.sh@10 -- # set +x 00:08:12.402 00:02:59 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:08:12.660 00:02:59 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:12.660 "method": "compressdev_scan_accel_module", 00:08:12.660 00:02:59 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:12.660 00:02:59 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:08:12.660 00:02:59 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:12.660 00:02:59 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:08:12.660 00:02:59 accel -- common/autotest_common.sh@10 -- # set +x 00:08:12.660 00:02:59 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:12.660 00:02:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # IFS== 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:12.660 00:02:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:12.660 00:02:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # IFS== 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:12.660 00:02:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:12.660 00:02:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # IFS== 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:12.660 00:02:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:12.660 00:02:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # IFS== 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:12.660 00:02:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:12.660 00:02:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # IFS== 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:12.660 00:02:59 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:08:12.660 00:02:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # IFS== 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:12.660 00:02:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:12.660 00:02:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # IFS== 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:12.660 00:02:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:12.660 00:02:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # IFS== 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:12.660 00:02:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:12.660 00:02:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # IFS== 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:12.660 00:02:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:12.660 00:02:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # IFS== 00:08:12.660 00:02:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:12.660 00:02:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:12.660 00:02:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:12.661 00:02:59 accel -- accel/accel.sh@72 -- # IFS== 00:08:12.661 00:02:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:12.661 00:02:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:12.661 00:02:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:12.661 00:02:59 accel -- 
accel/accel.sh@72 -- # IFS== 00:08:12.661 00:02:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:12.661 00:02:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:12.661 00:02:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:12.661 00:02:59 accel -- accel/accel.sh@72 -- # IFS== 00:08:12.661 00:02:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:12.661 00:02:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:12.661 00:02:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:12.661 00:02:59 accel -- accel/accel.sh@72 -- # IFS== 00:08:12.661 00:02:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:12.661 00:02:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:12.661 00:02:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:12.661 00:02:59 accel -- accel/accel.sh@72 -- # IFS== 00:08:12.661 00:02:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:12.661 00:02:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:12.661 00:02:59 accel -- accel/accel.sh@75 -- # killprocess 3464584 00:08:12.661 00:02:59 accel -- common/autotest_common.sh@948 -- # '[' -z 3464584 ']' 00:08:12.661 00:02:59 accel -- common/autotest_common.sh@952 -- # kill -0 3464584 00:08:12.661 00:02:59 accel -- common/autotest_common.sh@953 -- # uname 00:08:12.661 00:02:59 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:12.661 00:02:59 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3464584 00:08:12.920 00:02:59 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:12.920 00:02:59 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:12.920 00:02:59 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3464584' 00:08:12.920 killing process with pid 3464584 00:08:12.920 00:02:59 accel -- common/autotest_common.sh@967 -- # 
kill 3464584 00:08:12.920 00:02:59 accel -- common/autotest_common.sh@972 -- # wait 3464584 00:08:13.179 00:02:59 accel -- accel/accel.sh@76 -- # trap - ERR 00:08:13.179 00:03:00 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:13.179 00:03:00 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:13.179 00:03:00 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:13.179 00:03:00 accel -- common/autotest_common.sh@10 -- # set +x 00:08:13.179 ************************************ 00:08:13.179 START TEST accel_cdev_comp 00:08:13.179 ************************************ 00:08:13.179 00:03:00 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:13.179 00:03:00 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:13.179 00:03:00 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:08:13.179 00:03:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:13.179 00:03:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:13.179 00:03:00 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:13.179 00:03:00 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:13.179 00:03:00 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:13.179 00:03:00 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:13.179 00:03:00 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:13.179 00:03:00 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:13.179 00:03:00 
accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:13.179 00:03:00 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:13.179 00:03:00 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:13.179 00:03:00 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:13.179 00:03:00 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:08:13.179 [2024-07-16 00:03:00.072639] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:08:13.179 [2024-07-16 00:03:00.072689] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3464959 ] 00:08:13.438 [2024-07-16 00:03:00.186172] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.438 [2024-07-16 00:03:00.290782] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.375 [2024-07-16 00:03:01.063448] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:14.375 [2024-07-16 00:03:01.066061] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x26be080 PMD being used: compress_qat 00:08:14.375 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.375 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.375 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.375 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.375 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.375 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.376 [2024-07-16 00:03:01.070166] accel_dpdk_compressdev.c: 690:_set_pmd: 
*NOTICE*: Channel 0x26c2e60 PMD being used: compress_qat 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:14.376 00:03:01 accel.accel_cdev_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 
00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.376 00:03:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:15.313 00:03:02 accel.accel_cdev_comp -- 
accel/accel.sh@20 -- # val= 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:15.313 00:03:02 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:15.313 00:08:15.313 real 0m2.212s 00:08:15.313 user 0m1.627s 00:08:15.313 sys 0m0.586s 00:08:15.313 00:03:02 
accel.accel_cdev_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:15.313 00:03:02 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:08:15.313 ************************************ 00:08:15.313 END TEST accel_cdev_comp 00:08:15.313 ************************************ 00:08:15.573 00:03:02 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:15.573 00:03:02 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:15.573 00:03:02 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:15.573 00:03:02 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:15.573 00:03:02 accel -- common/autotest_common.sh@10 -- # set +x 00:08:15.573 ************************************ 00:08:15.573 START TEST accel_cdev_decomp 00:08:15.573 ************************************ 00:08:15.573 00:03:02 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:15.573 00:03:02 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:15.573 00:03:02 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:15.573 00:03:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:15.573 00:03:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:15.573 00:03:02 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:15.573 00:03:02 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:15.573 00:03:02 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:15.573 00:03:02 
accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:15.573 00:03:02 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:15.573 00:03:02 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:15.573 00:03:02 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:15.573 00:03:02 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:15.573 00:03:02 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:15.573 00:03:02 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:15.573 00:03:02 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:15.573 [2024-07-16 00:03:02.370169] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:08:15.573 [2024-07-16 00:03:02.370238] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3465356 ] 00:08:15.573 [2024-07-16 00:03:02.487631] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.832 [2024-07-16 00:03:02.598158] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.768 [2024-07-16 00:03:03.357134] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:16.768 [2024-07-16 00:03:03.359793] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x13b3080 PMD being used: compress_qat 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.768 00:03:03 
accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.768 [2024-07-16 00:03:03.364074] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x13b7e60 PMD being used: compress_qat 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- 
accel/accel.sh@23 -- # accel_opc=decompress 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.768 00:03:03 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.768 00:03:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.768 00:03:03 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.701 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:17.702 00:03:04 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:17.702 00:08:17.702 real 0m2.214s 00:08:17.702 user 0m1.611s 00:08:17.702 sys 0m0.606s 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:17.702 00:03:04 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:17.702 ************************************ 00:08:17.702 END TEST accel_cdev_decomp 00:08:17.702 ************************************ 00:08:17.702 00:03:04 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:17.702 00:03:04 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:17.702 00:03:04 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:17.702 00:03:04 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:17.702 00:03:04 accel -- common/autotest_common.sh@10 -- # set +x 00:08:17.702 ************************************ 00:08:17.702 START TEST accel_cdev_decomp_full 00:08:17.702 ************************************ 00:08:17.702 00:03:04 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:17.702 00:03:04 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:17.702 00:03:04 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:17.702 00:03:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:17.702 00:03:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 
-- # read -r var val 00:08:17.702 00:03:04 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:17.702 00:03:04 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:17.702 00:03:04 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:17.702 00:03:04 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:17.702 00:03:04 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:17.702 00:03:04 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:17.702 00:03:04 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:17.702 00:03:04 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:17.702 00:03:04 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:17.702 00:03:04 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:17.702 00:03:04 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:17.961 [2024-07-16 00:03:04.669295] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:08:17.961 [2024-07-16 00:03:04.669367] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3465736 ] 00:08:17.961 [2024-07-16 00:03:04.798905] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:17.961 [2024-07-16 00:03:04.902176] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.964 [2024-07-16 00:03:05.671454] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:18.964 [2024-07-16 00:03:05.674100] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xbf3080 PMD being used: compress_qat 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.964 [2024-07-16 00:03:05.677429] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xbf2ce0 PMD being used: compress_qat 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.964 00:03:05 
accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- 
accel/accel.sh@21 -- # case "$var" in 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.964 00:03:05 
accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.964 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.965 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:18.965 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.965 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.965 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.965 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.965 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.965 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.965 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.965 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.965 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.965 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.965 00:03:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 
00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == 
\d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:19.901 00:08:19.901 real 0m2.212s 00:08:19.901 user 0m1.619s 00:08:19.901 sys 0m0.588s 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:19.901 00:03:06 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:19.901 ************************************ 00:08:19.901 END TEST accel_cdev_decomp_full 00:08:19.901 ************************************ 00:08:20.159 00:03:06 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:20.160 00:03:06 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:20.160 00:03:06 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:20.160 00:03:06 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:20.160 00:03:06 accel -- common/autotest_common.sh@10 -- # set +x 00:08:20.160 ************************************ 00:08:20.160 START TEST accel_cdev_decomp_mcore 00:08:20.160 ************************************ 00:08:20.160 00:03:06 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:20.160 00:03:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:20.160 00:03:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:20.160 00:03:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.160 00:03:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.160 00:03:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:20.160 00:03:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:20.160 00:03:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:20.160 00:03:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:20.160 00:03:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:20.160 00:03:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:20.160 00:03:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:20.160 00:03:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:20.160 00:03:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:20.160 00:03:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:20.160 00:03:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:20.160 [2024-07-16 00:03:06.967562] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:08:20.160 [2024-07-16 00:03:06.967627] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3466391 ] 00:08:20.160 [2024-07-16 00:03:07.097636] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:20.418 [2024-07-16 00:03:07.202838] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:20.418 [2024-07-16 00:03:07.202972] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:20.418 [2024-07-16 00:03:07.203022] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.418 [2024-07-16 00:03:07.203021] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:21.352 [2024-07-16 00:03:07.968754] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:21.352 [2024-07-16 00:03:07.971309] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x12d8720 PMD being used: compress_qat 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.352 
00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:21.352 [2024-07-16 00:03:07.977058] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f8ddc19b8b0 PMD being used: compress_qat 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.352 [2024-07-16 00:03:07.978599] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x12dd9f0 PMD being used: compress_qat 00:08:21.352 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.353 00:03:07 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.353 [2024-07-16 00:03:07.981581] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f8dd419b8b0 PMD being used: compress_qat 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:21.353 [2024-07-16 00:03:07.981936] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f8dcc19b8b0 PMD being used: compress_qat 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:21.353 00:03:07 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.353 00:03:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.284 
00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:22.284 00:08:22.284 real 0m2.249s 00:08:22.284 user 0m7.233s 00:08:22.284 
sys 0m0.616s 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:22.284 00:03:09 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:22.284 ************************************ 00:08:22.284 END TEST accel_cdev_decomp_mcore 00:08:22.284 ************************************ 00:08:22.284 00:03:09 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:22.284 00:03:09 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:22.284 00:03:09 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:22.284 00:03:09 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:22.284 00:03:09 accel -- common/autotest_common.sh@10 -- # set +x 00:08:22.542 ************************************ 00:08:22.542 START TEST accel_cdev_decomp_full_mcore 00:08:22.542 ************************************ 00:08:22.542 00:03:09 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:22.542 00:03:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:22.542 00:03:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:22.543 00:03:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.543 00:03:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.543 00:03:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:22.543 00:03:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 
-w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:22.543 00:03:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:22.543 00:03:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:22.543 00:03:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:22.543 00:03:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:22.543 00:03:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:22.543 00:03:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:22.543 00:03:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:22.543 00:03:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:22.543 00:03:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:22.543 [2024-07-16 00:03:09.297672] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:08:22.543 [2024-07-16 00:03:09.297736] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3466756 ] 00:08:22.543 [2024-07-16 00:03:09.426328] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:22.801 [2024-07-16 00:03:09.528569] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:22.801 [2024-07-16 00:03:09.528671] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:22.801 [2024-07-16 00:03:09.528773] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:22.801 [2024-07-16 00:03:09.528774] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.368 [2024-07-16 00:03:10.293394] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:23.369 [2024-07-16 00:03:10.295979] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17e3720 PMD being used: compress_qat 00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@21 -- # case "$var" in
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf
00:08:23.369 [2024-07-16 00:03:10.301182] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f150c19b8b0 PMD being used: compress_qat
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:08:23.369 [2024-07-16 00:03:10.303108] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17e6a30 PMD being used: compress_qat
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore --
accel/accel.sh@19 -- # IFS=:
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes'
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:23.369 [2024-07-16 00:03:10.305743] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f150419b8b0 PMD being used: compress_qat
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:08:23.369 [2024-07-16 00:03:10.306167] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f14fc19b8b0 PMD being used: compress_qat
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore --
accel/accel.sh@19 -- # IFS=:
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds'
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 --
# read -r var val
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:23.369 00:03:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:08:24.771 00:03:11
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:08:24.771 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:24.772 00:03:11
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:24.772 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:24.772 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]]
00:08:24.772 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:08:24.772 00:03:11 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]]
00:08:24.772
00:08:24.772 real 0m2.244s
00:08:24.772 user 0m7.236s
00:08:24.772 sys 0m0.621s
00:08:24.772 00:03:11 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:24.772 00:03:11 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x
00:08:24.772 ************************************
00:08:24.772 END TEST accel_cdev_decomp_full_mcore
00:08:24.772 ************************************
00:08:24.772 00:03:11 accel -- common/autotest_common.sh@1142 -- # return 0
00:08:24.772 00:03:11 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2
00:08:24.772 00:03:11 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']'
00:08:24.772 00:03:11 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:24.772 00:03:11 accel -- common/autotest_common.sh@10 -- # set +x
00:08:24.772 ************************************
00:08:24.772 START TEST accel_cdev_decomp_mthread
00:08:24.772 ************************************
00:08:24.772 00:03:11 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2
00:08:24.772 00:03:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc
00:08:24.772 00:03:11 accel.accel_cdev_decomp_mthread --
accel/accel.sh@17 -- # local accel_module
00:08:24.772 00:03:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:24.772 00:03:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:24.772 00:03:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2
00:08:24.772 00:03:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config
00:08:24.772 00:03:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2
00:08:24.772 00:03:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=()
00:08:24.772 00:03:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:08:24.772 00:03:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:24.772 00:03:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:24.772 00:03:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]]
00:08:24.772 00:03:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}')
00:08:24.772 00:03:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=,
00:08:24.772 00:03:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r .
[2024-07-16 00:03:11.626388] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization...
00:08:24.772 [2024-07-16 00:03:11.626456] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3466963 ]
00:08:25.031 [2024-07-16 00:03:11.746759] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:25.031 [2024-07-16 00:03:11.848177] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:25.967 [2024-07-16 00:03:12.615721] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD
00:08:25.967 [2024-07-16 00:03:12.618351] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x15cb080 PMD being used: compress_qat
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
[2024-07-16 00:03:12.623407] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x15d02a0 PMD being used: compress_qat
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r
var val
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:25.967 [2024-07-16 00:03:12.625915] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x16f30f0 PMD being used: compress_qat
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes'
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:25.967
00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds'
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:25.967 00:03:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 --
# val=
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]]
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]]
00:08:26.902
00:08:26.902 real 0m2.217s
00:08:26.902 user 0m1.655s
00:08:26.902 sys 0m0.566s
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:26.902 00:03:13 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x
00:08:26.902 ************************************
00:08:26.902 END TEST accel_cdev_decomp_mthread
00:08:26.902 ************************************
00:08:27.161 00:03:13 accel -- common/autotest_common.sh@1142 -- # return 0
00:08:27.161 00:03:13 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:08:27.161 00:03:13 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:08:27.161 00:03:13 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:27.161 00:03:13 accel -- common/autotest_common.sh@10 -- # set +x
00:08:27.161 ************************************
00:08:27.161 START TEST accel_cdev_decomp_full_mthread
00:08:27.161 ************************************
00:08:27.161 00:03:13 accel.accel_cdev_decomp_full_mthread --
common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:08:27.161 00:03:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc
00:08:27.161 00:03:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module
00:08:27.161 00:03:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:27.161 00:03:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:27.161 00:03:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:08:27.161 00:03:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:08:27.161 00:03:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config
00:08:27.161 00:03:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=()
00:08:27.161 00:03:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:08:27.161 00:03:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:27.161 00:03:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:27.161 00:03:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]]
00:08:27.161 00:03:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}')
00:08:27.161 00:03:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=,
00:08:27.161 00:03:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r .
00:08:27.161 [2024-07-16 00:03:13.926845] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization...
00:08:27.161 [2024-07-16 00:03:13.926908] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3467325 ]
00:08:27.161 [2024-07-16 00:03:14.055804] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:27.420 [2024-07-16 00:03:14.157175] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:27.988 [2024-07-16 00:03:14.918131] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD
00:08:27.988 [2024-07-16 00:03:14.920727] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1eba080 PMD being used: compress_qat
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:08:27.988 [2024-07-16 00:03:14.924948] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1ebd3b0 PMD being used: compress_qat
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
[2024-07-16 00:03:14.927838] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1fe1cc0 PMD being used: compress_qat
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:27.988 00:03:14
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes'
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:27.988 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:28.247 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:28.247 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:28.247 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:28.247 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:28.247 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:28.247 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:28.247 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:28.247 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:28.247 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:28.247 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:28.247 00:03:14 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:08:28.247 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:28.247 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:28.247 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:28.247 00:03:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:29.183 00:03:16 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:29.183 00:08:29.183 real 0m2.203s 00:08:29.183 user 0m1.605s 00:08:29.183 sys 0m0.602s 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:29.183 00:03:16 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:29.183 ************************************ 00:08:29.183 END TEST accel_cdev_decomp_full_mthread 00:08:29.183 ************************************ 00:08:29.442 00:03:16 accel -- 
common/autotest_common.sh@1142 -- # return 0 00:08:29.442 00:03:16 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:08:29.442 00:03:16 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:29.442 00:03:16 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:08:29.442 00:03:16 accel -- accel/accel.sh@137 -- # build_accel_config 00:08:29.442 00:03:16 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:29.442 00:03:16 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:29.442 00:03:16 accel -- common/autotest_common.sh@10 -- # set +x 00:08:29.442 00:03:16 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:29.442 00:03:16 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:29.442 00:03:16 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:29.442 00:03:16 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:29.442 00:03:16 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:29.442 00:03:16 accel -- accel/accel.sh@41 -- # jq -r . 00:08:29.442 ************************************ 00:08:29.442 START TEST accel_dif_functional_tests 00:08:29.442 ************************************ 00:08:29.442 00:03:16 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:29.442 [2024-07-16 00:03:16.211951] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:08:29.442 [2024-07-16 00:03:16.211993] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3467696 ] 00:08:29.442 [2024-07-16 00:03:16.326116] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:29.701 [2024-07-16 00:03:16.431686] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:29.701 [2024-07-16 00:03:16.431787] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:29.701 [2024-07-16 00:03:16.431790] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.701 00:08:29.701 00:08:29.701 CUnit - A unit testing framework for C - Version 2.1-3 00:08:29.701 http://cunit.sourceforge.net/ 00:08:29.701 00:08:29.701 00:08:29.701 Suite: accel_dif 00:08:29.701 Test: verify: DIF generated, GUARD check ...passed 00:08:29.701 Test: verify: DIF generated, APPTAG check ...passed 00:08:29.701 Test: verify: DIF generated, REFTAG check ...passed 00:08:29.701 Test: verify: DIF not generated, GUARD check ...[2024-07-16 00:03:16.542846] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:29.701 passed 00:08:29.701 Test: verify: DIF not generated, APPTAG check ...[2024-07-16 00:03:16.542934] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:29.701 passed 00:08:29.701 Test: verify: DIF not generated, REFTAG check ...[2024-07-16 00:03:16.542978] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:29.701 passed 00:08:29.701 Test: verify: APPTAG correct, APPTAG check ...passed 00:08:29.701 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-16 00:03:16.543054] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:08:29.701 passed 
00:08:29.701 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:08:29.701 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:08:29.701 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:08:29.701 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-16 00:03:16.543223] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:08:29.701 passed 00:08:29.701 Test: verify copy: DIF generated, GUARD check ...passed 00:08:29.701 Test: verify copy: DIF generated, APPTAG check ...passed 00:08:29.701 Test: verify copy: DIF generated, REFTAG check ...passed 00:08:29.701 Test: verify copy: DIF not generated, GUARD check ...[2024-07-16 00:03:16.543415] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:29.701 passed 00:08:29.701 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-16 00:03:16.543466] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:29.701 passed 00:08:29.701 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-16 00:03:16.543505] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:29.701 passed 00:08:29.701 Test: generate copy: DIF generated, GUARD check ...passed 00:08:29.701 Test: generate copy: DIF generated, APTTAG check ...passed 00:08:29.701 Test: generate copy: DIF generated, REFTAG check ...passed 00:08:29.701 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:08:29.701 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:08:29.701 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:08:29.701 Test: generate copy: iovecs-len validate ...[2024-07-16 00:03:16.543784] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:08:29.701 passed 00:08:29.701 Test: generate copy: buffer alignment validate ...passed 00:08:29.701 00:08:29.701 Run Summary: Type Total Ran Passed Failed Inactive 00:08:29.701 suites 1 1 n/a 0 0 00:08:29.701 tests 26 26 26 0 0 00:08:29.701 asserts 115 115 115 0 n/a 00:08:29.701 00:08:29.701 Elapsed time = 0.003 seconds 00:08:29.960 00:08:29.960 real 0m0.590s 00:08:29.960 user 0m0.790s 00:08:29.960 sys 0m0.217s 00:08:29.960 00:03:16 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:29.960 00:03:16 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:08:29.960 ************************************ 00:08:29.960 END TEST accel_dif_functional_tests 00:08:29.960 ************************************ 00:08:29.960 00:03:16 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:29.960 00:08:29.960 real 0m54.118s 00:08:29.960 user 1m2.147s 00:08:29.960 sys 0m12.239s 00:08:29.960 00:03:16 accel -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:29.960 00:03:16 accel -- common/autotest_common.sh@10 -- # set +x 00:08:29.960 ************************************ 00:08:29.960 END TEST accel 00:08:29.960 ************************************ 00:08:29.960 00:03:16 -- common/autotest_common.sh@1142 -- # return 0 00:08:29.960 00:03:16 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:29.960 00:03:16 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:29.960 00:03:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:29.960 00:03:16 -- common/autotest_common.sh@10 -- # set +x 00:08:29.960 ************************************ 00:08:29.960 START TEST accel_rpc 00:08:29.960 ************************************ 00:08:29.960 00:03:16 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:30.219 * Looking for test storage... 
00:08:30.219 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:30.219 00:03:17 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:30.219 00:03:17 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=3467760 00:08:30.219 00:03:17 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 3467760 00:08:30.219 00:03:17 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:08:30.219 00:03:17 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 3467760 ']' 00:08:30.219 00:03:17 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:30.219 00:03:17 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:30.219 00:03:17 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:30.219 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:30.219 00:03:17 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:30.219 00:03:17 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:30.219 [2024-07-16 00:03:17.062450] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:08:30.219 [2024-07-16 00:03:17.062506] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3467760 ] 00:08:30.478 [2024-07-16 00:03:17.174954] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.478 [2024-07-16 00:03:17.277457] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.046 00:03:17 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:31.046 00:03:17 accel_rpc -- common/autotest_common.sh@862 -- # return 0 00:08:31.046 00:03:17 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:08:31.046 00:03:17 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:08:31.046 00:03:17 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:08:31.046 00:03:17 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:08:31.046 00:03:17 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:08:31.046 00:03:17 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:31.046 00:03:17 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:31.046 00:03:17 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:31.046 ************************************ 00:08:31.046 START TEST accel_assign_opcode 00:08:31.046 ************************************ 00:08:31.046 00:03:17 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite 00:08:31.046 00:03:17 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:08:31.046 00:03:17 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:31.046 00:03:17 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:31.046 [2024-07-16 00:03:17.975677] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:08:31.046 00:03:17 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:31.046 00:03:17 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:08:31.046 00:03:17 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:31.046 00:03:17 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:31.046 [2024-07-16 00:03:17.987705] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:08:31.046 00:03:17 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:31.046 00:03:17 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:08:31.047 00:03:17 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:31.047 00:03:17 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:31.329 00:03:18 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:31.329 00:03:18 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:08:31.329 00:03:18 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:31.329 00:03:18 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:08:31.329 00:03:18 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:31.329 00:03:18 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:08:31.329 00:03:18 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:31.329 software 00:08:31.329 00:08:31.329 real 0m0.302s 00:08:31.329 user 0m0.051s 00:08:31.329 sys 0m0.014s 00:08:31.329 00:03:18 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # 
xtrace_disable 00:08:31.329 00:03:18 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:31.329 ************************************ 00:08:31.329 END TEST accel_assign_opcode 00:08:31.329 ************************************ 00:08:31.589 00:03:18 accel_rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:31.589 00:03:18 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 3467760 00:08:31.589 00:03:18 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 3467760 ']' 00:08:31.589 00:03:18 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 3467760 00:08:31.589 00:03:18 accel_rpc -- common/autotest_common.sh@953 -- # uname 00:08:31.589 00:03:18 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:31.589 00:03:18 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3467760 00:08:31.589 00:03:18 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:31.589 00:03:18 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:31.589 00:03:18 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3467760' 00:08:31.589 killing process with pid 3467760 00:08:31.589 00:03:18 accel_rpc -- common/autotest_common.sh@967 -- # kill 3467760 00:08:31.589 00:03:18 accel_rpc -- common/autotest_common.sh@972 -- # wait 3467760 00:08:31.848 00:08:31.848 real 0m1.839s 00:08:31.848 user 0m1.855s 00:08:31.848 sys 0m0.587s 00:08:31.848 00:03:18 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:31.848 00:03:18 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:31.848 ************************************ 00:08:31.848 END TEST accel_rpc 00:08:31.848 ************************************ 00:08:31.848 00:03:18 -- common/autotest_common.sh@1142 -- # return 0 00:08:31.848 00:03:18 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:31.848 00:03:18 -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:31.848 00:03:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:31.848 00:03:18 -- common/autotest_common.sh@10 -- # set +x 00:08:32.107 ************************************ 00:08:32.107 START TEST app_cmdline 00:08:32.107 ************************************ 00:08:32.107 00:03:18 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:32.107 * Looking for test storage... 00:08:32.107 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:32.107 00:03:18 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:08:32.107 00:03:18 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=3468179 00:08:32.107 00:03:18 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 3468179 00:08:32.107 00:03:18 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:08:32.108 00:03:18 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 3468179 ']' 00:08:32.108 00:03:18 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:32.108 00:03:18 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:32.108 00:03:18 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:32.108 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:32.108 00:03:18 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:32.108 00:03:18 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:32.108 [2024-07-16 00:03:18.982382] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:08:32.108 [2024-07-16 00:03:18.982441] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3468179 ] 00:08:32.366 [2024-07-16 00:03:19.094280] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:32.366 [2024-07-16 00:03:19.192403] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.933 00:03:19 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:32.933 00:03:19 app_cmdline -- common/autotest_common.sh@862 -- # return 0 00:08:32.933 00:03:19 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:08:33.192 { 00:08:33.192 "version": "SPDK v24.09-pre git sha1 406b3b1b5", 00:08:33.192 "fields": { 00:08:33.192 "major": 24, 00:08:33.192 "minor": 9, 00:08:33.192 "patch": 0, 00:08:33.192 "suffix": "-pre", 00:08:33.192 "commit": "406b3b1b5" 00:08:33.192 } 00:08:33.192 } 00:08:33.192 00:03:20 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:08:33.192 00:03:20 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:08:33.192 00:03:20 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:08:33.192 00:03:20 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:08:33.192 00:03:20 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:08:33.192 00:03:20 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:08:33.192 00:03:20 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:33.192 00:03:20 app_cmdline -- app/cmdline.sh@26 -- # sort 00:08:33.192 00:03:20 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:33.192 00:03:20 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:33.192 00:03:20 app_cmdline -- 
app/cmdline.sh@27 -- # (( 2 == 2 )) 00:08:33.192 00:03:20 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:08:33.192 00:03:20 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:33.192 00:03:20 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:08:33.192 00:03:20 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:33.192 00:03:20 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:33.452 request: 00:08:33.452 { 00:08:33.452 "method": "env_dpdk_get_mem_stats", 00:08:33.452 "req_id": 1 00:08:33.452 } 00:08:33.452 Got JSON-RPC error response 00:08:33.452 response: 00:08:33.452 { 00:08:33.452 
"code": -32601, 00:08:33.452 "message": "Method not found" 00:08:33.452 } 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:33.452 00:03:20 app_cmdline -- app/cmdline.sh@1 -- # killprocess 3468179 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 3468179 ']' 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 3468179 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@953 -- # uname 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3468179 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3468179' 00:08:33.452 killing process with pid 3468179 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@967 -- # kill 3468179 00:08:33.452 00:03:20 app_cmdline -- common/autotest_common.sh@972 -- # wait 3468179 00:08:34.054 00:08:34.054 real 0m1.936s 00:08:34.054 user 0m2.254s 00:08:34.054 sys 0m0.604s 00:08:34.054 00:03:20 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:34.054 00:03:20 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:34.054 ************************************ 00:08:34.054 END TEST app_cmdline 00:08:34.054 ************************************ 00:08:34.054 00:03:20 -- common/autotest_common.sh@1142 -- # return 0 00:08:34.054 00:03:20 -- spdk/autotest.sh@186 -- # run_test version 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:34.054 00:03:20 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:34.054 00:03:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:34.054 00:03:20 -- common/autotest_common.sh@10 -- # set +x 00:08:34.054 ************************************ 00:08:34.055 START TEST version 00:08:34.055 ************************************ 00:08:34.055 00:03:20 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:34.055 * Looking for test storage... 00:08:34.055 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:34.055 00:03:20 version -- app/version.sh@17 -- # get_header_version major 00:08:34.055 00:03:20 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:34.055 00:03:20 version -- app/version.sh@14 -- # cut -f2 00:08:34.055 00:03:20 version -- app/version.sh@14 -- # tr -d '"' 00:08:34.055 00:03:20 version -- app/version.sh@17 -- # major=24 00:08:34.055 00:03:20 version -- app/version.sh@18 -- # get_header_version minor 00:08:34.055 00:03:20 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:34.055 00:03:20 version -- app/version.sh@14 -- # cut -f2 00:08:34.055 00:03:20 version -- app/version.sh@14 -- # tr -d '"' 00:08:34.055 00:03:20 version -- app/version.sh@18 -- # minor=9 00:08:34.055 00:03:20 version -- app/version.sh@19 -- # get_header_version patch 00:08:34.055 00:03:20 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:34.055 00:03:20 version -- app/version.sh@14 -- # cut -f2 00:08:34.055 00:03:20 version -- app/version.sh@14 -- # tr -d '"' 00:08:34.055 
00:03:20 version -- app/version.sh@19 -- # patch=0 00:08:34.055 00:03:20 version -- app/version.sh@20 -- # get_header_version suffix 00:08:34.055 00:03:20 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:34.055 00:03:20 version -- app/version.sh@14 -- # cut -f2 00:08:34.055 00:03:20 version -- app/version.sh@14 -- # tr -d '"' 00:08:34.055 00:03:20 version -- app/version.sh@20 -- # suffix=-pre 00:08:34.055 00:03:20 version -- app/version.sh@22 -- # version=24.9 00:08:34.055 00:03:20 version -- app/version.sh@25 -- # (( patch != 0 )) 00:08:34.055 00:03:20 version -- app/version.sh@28 -- # version=24.9rc0 00:08:34.055 00:03:20 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:08:34.055 00:03:20 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:08:34.315 00:03:21 version -- app/version.sh@30 -- # py_version=24.9rc0 00:08:34.315 00:03:21 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:08:34.315 00:08:34.315 real 0m0.194s 00:08:34.315 user 0m0.103s 00:08:34.315 sys 0m0.140s 00:08:34.315 00:03:21 version -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:34.315 00:03:21 version -- common/autotest_common.sh@10 -- # set +x 00:08:34.315 ************************************ 00:08:34.315 END TEST version 00:08:34.315 ************************************ 00:08:34.315 00:03:21 -- common/autotest_common.sh@1142 -- # return 0 00:08:34.315 00:03:21 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:08:34.315 00:03:21 -- spdk/autotest.sh@189 -- # run_test blockdev_general 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:34.315 00:03:21 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:34.315 00:03:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:34.315 00:03:21 -- common/autotest_common.sh@10 -- # set +x 00:08:34.315 ************************************ 00:08:34.315 START TEST blockdev_general 00:08:34.315 ************************************ 00:08:34.315 00:03:21 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:34.315 * Looking for test storage... 00:08:34.315 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:08:34.315 00:03:21 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 
00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=3468576 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 3468576 00:08:34.315 00:03:21 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:08:34.315 00:03:21 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 3468576 ']' 00:08:34.315 00:03:21 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:34.315 00:03:21 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:34.315 00:03:21 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
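
The `waitforlisten` step traced here blocks until `spdk_tgt` creates its Unix-domain RPC socket, giving up after a bounded number of retries. A hedged, simplified re-sketch of that pattern (socket path and `max_retries=100` mirror the log's defaults; this is not SPDK's actual `autotest_common.sh` function):

```shell
waitforlisten() {
    # $1 = socket path, $2 = retry budget (default values taken from the trace).
    local sock=${1:-/var/tmp/spdk.sock}
    local max_retries=${2:-100} i=0
    until [ -S "$sock" ]; do           # -S: path exists and is a socket
        i=$((i + 1))
        if [ "$i" -ge "$max_retries" ]; then
            echo "timed out waiting for $sock" >&2
            return 1
        fi
        sleep 0.1
    done
    return 0
}

# Demo: a plain file is never a socket, so a budget of 1 fails immediately.
tmp=$(mktemp)
waitforlisten "$tmp" 1 || echo "not listening"
rm -f "$tmp"
```

The real helper also re-echoes the "Waiting for process to start up..." banner seen in the log between retries.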
00:08:34.315 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:34.315 00:03:21 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:34.315 00:03:21 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:34.575 [2024-07-16 00:03:21.384259] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:08:34.575 [2024-07-16 00:03:21.384404] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3468576 ] 00:08:34.835 [2024-07-16 00:03:21.584077] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:34.835 [2024-07-16 00:03:21.691217] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.835 00:03:21 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:34.835 00:03:21 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:08:34.835 00:03:21 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:08:34.835 00:03:21 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:08:34.835 00:03:21 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:08:34.835 00:03:21 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:34.835 00:03:21 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:35.094 [2024-07-16 00:03:22.040093] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:35.094 [2024-07-16 00:03:22.040147] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:35.094 00:08:35.354 [2024-07-16 00:03:22.048079] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:35.354 [2024-07-16 00:03:22.048106] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: Malloc2 00:08:35.354 00:08:35.354 Malloc0 00:08:35.354 Malloc1 00:08:35.354 Malloc2 00:08:35.354 Malloc3 00:08:35.354 Malloc4 00:08:35.354 Malloc5 00:08:35.354 Malloc6 00:08:35.354 Malloc7 00:08:35.354 Malloc8 00:08:35.354 Malloc9 00:08:35.354 [2024-07-16 00:03:22.196429] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:35.354 [2024-07-16 00:03:22.196476] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:35.354 [2024-07-16 00:03:22.196497] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe0c350 00:08:35.354 [2024-07-16 00:03:22.196516] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:35.354 [2024-07-16 00:03:22.197853] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:35.354 [2024-07-16 00:03:22.197881] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:35.354 TestPT 00:08:35.354 00:03:22 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:35.354 00:03:22 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:08:35.354 5000+0 records in 00:08:35.354 5000+0 records out 00:08:35.354 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0250348 s, 409 MB/s 00:08:35.354 00:03:22 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:08:35.354 00:03:22 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:35.354 00:03:22 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:35.354 AIO0 00:08:35.354 00:03:22 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:35.354 00:03:22 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:08:35.354 00:03:22 blockdev_general -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:08:35.354 00:03:22 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:35.612 00:03:22 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:35.612 00:03:22 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:08:35.612 00:03:22 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:08:35.612 00:03:22 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:35.613 00:03:22 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:35.613 00:03:22 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:35.613 00:03:22 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:08:35.613 00:03:22 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:35.613 00:03:22 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:35.613 00:03:22 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:35.613 00:03:22 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:08:35.613 00:03:22 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:35.613 00:03:22 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:35.613 00:03:22 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:35.613 00:03:22 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:08:35.613 00:03:22 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:08:35.613 00:03:22 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:08:35.613 00:03:22 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:35.613 00:03:22 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:35.872 00:03:22 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:35.872 00:03:22 blockdev_general 
-- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:08:35.872 00:03:22 blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:08:35.874 00:03:22 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "b7064d7a-6d80-4433-b407-3c52bb1af1d3"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b7064d7a-6d80-4433-b407-3c52bb1af1d3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "20a694cb-5588-5c28-80c4-88022e3f7f16"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "20a694cb-5588-5c28-80c4-88022e3f7f16",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' 
"compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "ce7071ea-2df7-5648-995e-888a75e33ef2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ce7071ea-2df7-5648-995e-888a75e33ef2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "1f8dc7e7-e82f-58b2-b752-65abf418c44d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1f8dc7e7-e82f-58b2-b752-65abf418c44d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' 
"abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "19baed69-c599-577a-97ef-8fe66fb30933"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "19baed69-c599-577a-97ef-8fe66fb30933",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "b2e5445c-ea64-5692-8eca-12906562eb53"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b2e5445c-ea64-5692-8eca-12906562eb53",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": 
false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "b55baa38-798a-579e-ad9d-864759268c10"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b55baa38-798a-579e-ad9d-864759268c10",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "633308a5-eafb-5da1-938e-4f1161c0883d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "633308a5-eafb-5da1-938e-4f1161c0883d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' 
' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "37c7dfef-fe56-5f8c-875c-97997740e332"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "37c7dfef-fe56-5f8c-875c-97997740e332",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "5bcd7c70-6470-5e07-970e-7ebdb0bdc7ad"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5bcd7c70-6470-5e07-970e-7ebdb0bdc7ad",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' 
"base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "6c08e111-b2b2-5d4c-949d-bd8a9ce9b031"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "6c08e111-b2b2-5d4c-949d-bd8a9ce9b031",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "69428160-974e-5146-9be4-638e29732124"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "69428160-974e-5146-9be4-638e29732124",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' 
' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "4bc9a8dc-4e63-4997-826f-62ba65ba63f4"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "4bc9a8dc-4e63-4997-826f-62ba65ba63f4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "4bc9a8dc-4e63-4997-826f-62ba65ba63f4",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "17b3377e-f901-4329-baec-1906eee56a81",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "c9cc1252-b8cc-4e59-972e-d95ba75770e8",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' 
'}' '{' ' "name": "concat0",' ' "aliases": [' ' "607f43bf-b16b-4e2c-8311-57edeea5ee64"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "607f43bf-b16b-4e2c-8311-57edeea5ee64",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "607f43bf-b16b-4e2c-8311-57edeea5ee64",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "529be16d-dce3-42a3-a4ef-c7cc41cd5e9e",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "7546c3f6-87ed-40e0-88f2-8ecea8814b4b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "55d230ba-3642-45ac-9d1b-64762b7993cc"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' 
"uuid": "55d230ba-3642-45ac-9d1b-64762b7993cc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "55d230ba-3642-45ac-9d1b-64762b7993cc",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "68cf0389-b627-45c9-8e16-83bcffa8e7c8",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "cfc54b4b-59b2-455a-83e8-291dee2f7855",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "761a79f9-0c14-4cbf-b270-59d4f98661c1"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "761a79f9-0c14-4cbf-b270-59d4f98661c1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' 
' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:35.874 00:03:22 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:08:35.874 00:03:22 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:08:35.874 00:03:22 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:08:35.874 00:03:22 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 3468576 00:08:35.874 00:03:22 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 3468576 ']' 00:08:35.874 00:03:22 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 3468576 00:08:35.874 00:03:22 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:08:35.874 00:03:22 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:35.874 00:03:22 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3468576 00:08:35.874 00:03:22 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:35.874 00:03:22 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:35.874 00:03:22 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3468576' 00:08:35.874 killing process with pid 3468576 00:08:35.874 00:03:22 blockdev_general -- 
common/autotest_common.sh@967 -- # kill 3468576 00:08:35.874 00:03:22 blockdev_general -- common/autotest_common.sh@972 -- # wait 3468576 00:08:36.441 00:03:23 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:08:36.441 00:03:23 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:36.441 00:03:23 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:36.441 00:03:23 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:36.441 00:03:23 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:36.441 ************************************ 00:08:36.441 START TEST bdev_hello_world 00:08:36.441 ************************************ 00:08:36.441 00:03:23 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:36.441 [2024-07-16 00:03:23.326083] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
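
Earlier in this trace, blockdev.sh collects the bdev list with `rpc_cmd bdev_get_bdevs`, keeps only unclaimed bdevs via `jq 'select(.claimed == false)'`, then extracts names with `jq -r .name` into `bdevs_name`. This offline sketch replays those exact jq filters on a trimmed, hypothetical two-bdev sample instead of a live RPC socket (requires jq):

```shell
# Hypothetical miniature of the bdev_get_bdevs JSON dumped in the trace.
bdevs_json='[
  {"name": "Malloc0", "claimed": false},
  {"name": "Malloc3", "claimed": true}
]'

# Same two filters the harness applies before mapfile -t bdevs / bdevs_name.
unclaimed=$(echo "$bdevs_json" | jq -r '.[] | select(.claimed == false)')
names=$(echo "$unclaimed" | jq -r .name)
echo "$names"   # prints Malloc0
```

Against a running target the first line would instead be something like `scripts/rpc.py bdev_get_bdevs`, as the traced `rpc_cmd` wrapper does.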
00:08:36.441 [2024-07-16 00:03:23.326146] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3468858 ] 00:08:36.700 [2024-07-16 00:03:23.452533] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.700 [2024-07-16 00:03:23.553051] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.959 [2024-07-16 00:03:23.715160] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:36.959 [2024-07-16 00:03:23.715223] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:36.959 [2024-07-16 00:03:23.715238] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:36.959 [2024-07-16 00:03:23.723164] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:36.959 [2024-07-16 00:03:23.723192] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:36.959 [2024-07-16 00:03:23.731175] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:36.959 [2024-07-16 00:03:23.731200] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:36.959 [2024-07-16 00:03:23.808310] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:36.959 [2024-07-16 00:03:23.808362] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:36.959 [2024-07-16 00:03:23.808383] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18e33c0 00:08:36.959 [2024-07-16 00:03:23.808397] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:36.959 [2024-07-16 00:03:23.809842] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:08:36.959 [2024-07-16 00:03:23.809872] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:37.217 [2024-07-16 00:03:23.963297] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:08:37.217 [2024-07-16 00:03:23.963364] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:08:37.217 [2024-07-16 00:03:23.963414] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:08:37.217 [2024-07-16 00:03:23.963485] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:08:37.217 [2024-07-16 00:03:23.963552] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:08:37.217 [2024-07-16 00:03:23.963589] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:08:37.217 [2024-07-16 00:03:23.963652] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:08:37.217 00:08:37.217 [2024-07-16 00:03:23.963690] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:08:37.476 00:08:37.476 real 0m1.040s 00:08:37.476 user 0m0.678s 00:08:37.476 sys 0m0.316s 00:08:37.476 00:03:24 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:37.476 00:03:24 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:08:37.476 ************************************ 00:08:37.476 END TEST bdev_hello_world 00:08:37.476 ************************************ 00:08:37.476 00:03:24 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:37.476 00:03:24 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:08:37.476 00:03:24 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:37.476 00:03:24 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:37.476 00:03:24 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:37.476 ************************************ 00:08:37.476 START 
TEST bdev_bounds 00:08:37.476 ************************************ 00:08:37.476 00:03:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:08:37.476 00:03:24 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=3469053 00:08:37.476 00:03:24 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:08:37.476 00:03:24 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:37.476 00:03:24 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 3469053' 00:08:37.476 Process bdevio pid: 3469053 00:08:37.476 00:03:24 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 3469053 00:08:37.476 00:03:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 3469053 ']' 00:08:37.476 00:03:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:37.476 00:03:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:37.476 00:03:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:37.476 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:37.476 00:03:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:37.476 00:03:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:37.734 [2024-07-16 00:03:24.500368] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:08:37.734 [2024-07-16 00:03:24.500505] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3469053 ] 00:08:37.992 [2024-07-16 00:03:24.694133] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:37.992 [2024-07-16 00:03:24.797344] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:37.992 [2024-07-16 00:03:24.797447] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:37.992 [2024-07-16 00:03:24.797449] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.250 [2024-07-16 00:03:24.961067] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:38.250 [2024-07-16 00:03:24.961117] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:38.250 [2024-07-16 00:03:24.961131] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:38.250 [2024-07-16 00:03:24.969080] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:38.250 [2024-07-16 00:03:24.969109] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:38.250 [2024-07-16 00:03:24.977097] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:38.250 [2024-07-16 00:03:24.977130] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:38.250 [2024-07-16 00:03:25.054343] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:38.250 [2024-07-16 00:03:25.054397] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:38.250 [2024-07-16 00:03:25.054415] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10390c0 
00:08:38.250 [2024-07-16 00:03:25.054428] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:38.250 [2024-07-16 00:03:25.055902] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:38.250 [2024-07-16 00:03:25.055953] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:38.508 00:03:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:38.508 00:03:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:08:38.508 00:03:25 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:08:38.767 I/O targets: 00:08:38.767 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:08:38.767 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:08:38.767 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:08:38.767 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:08:38.767 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:08:38.767 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:08:38.767 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:08:38.767 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:08:38.767 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:08:38.767 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:08:38.767 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:08:38.767 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:08:38.767 raid0: 131072 blocks of 512 bytes (64 MiB) 00:08:38.767 concat0: 131072 blocks of 512 bytes (64 MiB) 00:08:38.767 raid1: 65536 blocks of 512 bytes (32 MiB) 00:08:38.767 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:08:38.767 00:08:38.767 00:08:38.767 CUnit - A unit testing framework for C - Version 2.1-3 00:08:38.767 http://cunit.sourceforge.net/ 00:08:38.767 00:08:38.767 00:08:38.767 Suite: bdevio tests on: AIO0 00:08:38.767 Test: blockdev write read block ...passed 00:08:38.767 Test: blockdev write zeroes read block ...passed 00:08:38.767 
Test: blockdev write zeroes read no split ...passed 00:08:38.767 Test: blockdev write zeroes read split ...passed 00:08:38.767 Test: blockdev write zeroes read split partial ...passed 00:08:38.767 Test: blockdev reset ...passed 00:08:38.767 Test: blockdev write read 8 blocks ...passed 00:08:38.767 Test: blockdev write read size > 128k ...passed 00:08:38.767 Test: blockdev write read invalid size ...passed 00:08:38.767 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:38.767 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:38.768 Test: blockdev write read max offset ...passed 00:08:38.768 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:38.768 Test: blockdev writev readv 8 blocks ...passed 00:08:38.768 Test: blockdev writev readv 30 x 1block ...passed 00:08:38.768 Test: blockdev writev readv block ...passed 00:08:38.768 Test: blockdev writev readv size > 128k ...passed 00:08:38.768 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:38.768 Test: blockdev comparev and writev ...passed 00:08:38.768 Test: blockdev nvme passthru rw ...passed 00:08:38.768 Test: blockdev nvme passthru vendor specific ...passed 00:08:38.768 Test: blockdev nvme admin passthru ...passed 00:08:38.768 Test: blockdev copy ...passed 00:08:38.768 Suite: bdevio tests on: raid1 00:08:38.768 Test: blockdev write read block ...passed 00:08:38.768 Test: blockdev write zeroes read block ...passed 00:08:38.768 Test: blockdev write zeroes read no split ...passed 00:08:38.768 Test: blockdev write zeroes read split ...passed 00:08:38.768 Test: blockdev write zeroes read split partial ...passed 00:08:38.768 Test: blockdev reset ...passed 00:08:38.768 Test: blockdev write read 8 blocks ...passed 00:08:38.768 Test: blockdev write read size > 128k ...passed 00:08:38.768 Test: blockdev write read invalid size ...passed 00:08:38.768 Test: blockdev write read offset + nbytes == size of blockdev ...passed 
00:08:38.768 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:38.768 Test: blockdev write read max offset ...passed 00:08:38.768 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:38.768 Test: blockdev writev readv 8 blocks ...passed 00:08:38.768 Test: blockdev writev readv 30 x 1block ...passed 00:08:38.768 Test: blockdev writev readv block ...passed 00:08:38.768 Test: blockdev writev readv size > 128k ...passed 00:08:38.768 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:38.768 Test: blockdev comparev and writev ...passed 00:08:38.768 Test: blockdev nvme passthru rw ...passed 00:08:38.768 Test: blockdev nvme passthru vendor specific ...passed 00:08:38.768 Test: blockdev nvme admin passthru ...passed 00:08:38.768 Test: blockdev copy ...passed 00:08:38.768 Suite: bdevio tests on: concat0 00:08:38.768 Test: blockdev write read block ...passed 00:08:38.768 Test: blockdev write zeroes read block ...passed 00:08:38.768 Test: blockdev write zeroes read no split ...passed 00:08:38.768 Test: blockdev write zeroes read split ...passed 00:08:38.768 Test: blockdev write zeroes read split partial ...passed 00:08:38.768 Test: blockdev reset ...passed 00:08:38.768 Test: blockdev write read 8 blocks ...passed 00:08:38.768 Test: blockdev write read size > 128k ...passed 00:08:38.768 Test: blockdev write read invalid size ...passed 00:08:38.768 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:38.768 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:38.768 Test: blockdev write read max offset ...passed 00:08:38.768 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:38.768 Test: blockdev writev readv 8 blocks ...passed 00:08:38.768 Test: blockdev writev readv 30 x 1block ...passed 00:08:38.768 Test: blockdev writev readv block ...passed 00:08:38.768 Test: blockdev writev readv size > 128k ...passed 00:08:38.768 Test: 
blockdev writev readv size > 128k in two iovs ...passed 00:08:38.768 Test: blockdev comparev and writev ...passed 00:08:38.768 Test: blockdev nvme passthru rw ...passed 00:08:38.768 Test: blockdev nvme passthru vendor specific ...passed 00:08:38.768 Test: blockdev nvme admin passthru ...passed 00:08:38.768 Test: blockdev copy ...passed 00:08:38.768 Suite: bdevio tests on: raid0 00:08:38.768 Test: blockdev write read block ...passed 00:08:38.768 Test: blockdev write zeroes read block ...passed 00:08:38.768 Test: blockdev write zeroes read no split ...passed 00:08:38.768 Test: blockdev write zeroes read split ...passed 00:08:38.768 Test: blockdev write zeroes read split partial ...passed 00:08:38.768 Test: blockdev reset ...passed 00:08:38.768 Test: blockdev write read 8 blocks ...passed 00:08:38.768 Test: blockdev write read size > 128k ...passed 00:08:38.768 Test: blockdev write read invalid size ...passed 00:08:38.768 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:38.768 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:38.768 Test: blockdev write read max offset ...passed 00:08:38.768 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:38.768 Test: blockdev writev readv 8 blocks ...passed 00:08:38.768 Test: blockdev writev readv 30 x 1block ...passed 00:08:38.768 Test: blockdev writev readv block ...passed 00:08:38.768 Test: blockdev writev readv size > 128k ...passed 00:08:38.768 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:38.768 Test: blockdev comparev and writev ...passed 00:08:38.768 Test: blockdev nvme passthru rw ...passed 00:08:38.768 Test: blockdev nvme passthru vendor specific ...passed 00:08:38.768 Test: blockdev nvme admin passthru ...passed 00:08:38.768 Test: blockdev copy ...passed 00:08:38.768 Suite: bdevio tests on: TestPT 00:08:38.768 Test: blockdev write read block ...passed 00:08:38.768 Test: blockdev write zeroes read block ...passed 
00:08:38.768 Test: blockdev write zeroes read no split ...passed 00:08:38.768 Test: blockdev write zeroes read split ...passed 00:08:38.768 Test: blockdev write zeroes read split partial ...passed 00:08:38.768 Test: blockdev reset ...passed 00:08:38.768 Test: blockdev write read 8 blocks ...passed 00:08:38.768 Test: blockdev write read size > 128k ...passed 00:08:38.768 Test: blockdev write read invalid size ...passed 00:08:38.768 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:38.768 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:38.768 Test: blockdev write read max offset ...passed 00:08:38.768 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:38.768 Test: blockdev writev readv 8 blocks ...passed 00:08:38.768 Test: blockdev writev readv 30 x 1block ...passed 00:08:38.768 Test: blockdev writev readv block ...passed 00:08:38.768 Test: blockdev writev readv size > 128k ...passed 00:08:38.768 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:38.768 Test: blockdev comparev and writev ...passed 00:08:38.768 Test: blockdev nvme passthru rw ...passed 00:08:38.768 Test: blockdev nvme passthru vendor specific ...passed 00:08:38.768 Test: blockdev nvme admin passthru ...passed 00:08:38.768 Test: blockdev copy ...passed 00:08:38.768 Suite: bdevio tests on: Malloc2p7 00:08:38.768 Test: blockdev write read block ...passed 00:08:38.768 Test: blockdev write zeroes read block ...passed 00:08:38.768 Test: blockdev write zeroes read no split ...passed 00:08:38.768 Test: blockdev write zeroes read split ...passed 00:08:38.768 Test: blockdev write zeroes read split partial ...passed 00:08:38.768 Test: blockdev reset ...passed 00:08:38.768 Test: blockdev write read 8 blocks ...passed 00:08:38.768 Test: blockdev write read size > 128k ...passed 00:08:38.768 Test: blockdev write read invalid size ...passed 00:08:38.768 Test: blockdev write read offset + nbytes == size of blockdev 
...passed 00:08:38.768 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:38.768 Test: blockdev write read max offset ...passed 00:08:38.768 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:38.768 Test: blockdev writev readv 8 blocks ...passed 00:08:38.768 Test: blockdev writev readv 30 x 1block ...passed 00:08:38.768 Test: blockdev writev readv block ...passed 00:08:38.768 Test: blockdev writev readv size > 128k ...passed 00:08:38.768 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:38.768 Test: blockdev comparev and writev ...passed 00:08:38.768 Test: blockdev nvme passthru rw ...passed 00:08:38.768 Test: blockdev nvme passthru vendor specific ...passed 00:08:38.768 Test: blockdev nvme admin passthru ...passed 00:08:38.768 Test: blockdev copy ...passed 00:08:38.768 Suite: bdevio tests on: Malloc2p6 00:08:38.768 Test: blockdev write read block ...passed 00:08:38.768 Test: blockdev write zeroes read block ...passed 00:08:38.768 Test: blockdev write zeroes read no split ...passed 00:08:38.768 Test: blockdev write zeroes read split ...passed 00:08:38.768 Test: blockdev write zeroes read split partial ...passed 00:08:38.768 Test: blockdev reset ...passed 00:08:38.768 Test: blockdev write read 8 blocks ...passed 00:08:38.768 Test: blockdev write read size > 128k ...passed 00:08:38.768 Test: blockdev write read invalid size ...passed 00:08:38.768 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:38.768 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:38.768 Test: blockdev write read max offset ...passed 00:08:38.768 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:38.768 Test: blockdev writev readv 8 blocks ...passed 00:08:38.768 Test: blockdev writev readv 30 x 1block ...passed 00:08:38.768 Test: blockdev writev readv block ...passed 00:08:38.768 Test: blockdev writev readv size > 128k ...passed 00:08:38.768 
Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:38.768 Test: blockdev comparev and writev ...passed 00:08:38.768 Test: blockdev nvme passthru rw ...passed 00:08:38.768 Test: blockdev nvme passthru vendor specific ...passed 00:08:38.768 Test: blockdev nvme admin passthru ...passed 00:08:38.768 Test: blockdev copy ...passed 00:08:38.768 Suite: bdevio tests on: Malloc2p5 00:08:38.768 Test: blockdev write read block ...passed 00:08:38.768 Test: blockdev write zeroes read block ...passed 00:08:38.768 Test: blockdev write zeroes read no split ...passed 00:08:38.768 Test: blockdev write zeroes read split ...passed 00:08:38.768 Test: blockdev write zeroes read split partial ...passed 00:08:38.768 Test: blockdev reset ...passed 00:08:38.768 Test: blockdev write read 8 blocks ...passed 00:08:38.768 Test: blockdev write read size > 128k ...passed 00:08:38.768 Test: blockdev write read invalid size ...passed 00:08:38.768 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:38.768 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:38.768 Test: blockdev write read max offset ...passed 00:08:38.768 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:38.768 Test: blockdev writev readv 8 blocks ...passed 00:08:38.768 Test: blockdev writev readv 30 x 1block ...passed 00:08:38.768 Test: blockdev writev readv block ...passed 00:08:38.768 Test: blockdev writev readv size > 128k ...passed 00:08:38.768 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:38.768 Test: blockdev comparev and writev ...passed 00:08:38.768 Test: blockdev nvme passthru rw ...passed 00:08:38.768 Test: blockdev nvme passthru vendor specific ...passed 00:08:38.768 Test: blockdev nvme admin passthru ...passed 00:08:38.768 Test: blockdev copy ...passed 00:08:38.768 Suite: bdevio tests on: Malloc2p4 00:08:38.768 Test: blockdev write read block ...passed 00:08:38.768 Test: blockdev write zeroes read block 
...passed 00:08:38.768 Test: blockdev write zeroes read no split ...passed 00:08:38.768 Test: blockdev write zeroes read split ...passed 00:08:39.028 Test: blockdev write zeroes read split partial ...passed 00:08:39.028 Test: blockdev reset ...passed 00:08:39.028 Test: blockdev write read 8 blocks ...passed 00:08:39.028 Test: blockdev write read size > 128k ...passed 00:08:39.028 Test: blockdev write read invalid size ...passed 00:08:39.028 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:39.028 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:39.028 Test: blockdev write read max offset ...passed 00:08:39.028 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:39.028 Test: blockdev writev readv 8 blocks ...passed 00:08:39.028 Test: blockdev writev readv 30 x 1block ...passed 00:08:39.028 Test: blockdev writev readv block ...passed 00:08:39.028 Test: blockdev writev readv size > 128k ...passed 00:08:39.028 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:39.028 Test: blockdev comparev and writev ...passed 00:08:39.028 Test: blockdev nvme passthru rw ...passed 00:08:39.028 Test: blockdev nvme passthru vendor specific ...passed 00:08:39.028 Test: blockdev nvme admin passthru ...passed 00:08:39.028 Test: blockdev copy ...passed 00:08:39.028 Suite: bdevio tests on: Malloc2p3 00:08:39.028 Test: blockdev write read block ...passed 00:08:39.028 Test: blockdev write zeroes read block ...passed 00:08:39.028 Test: blockdev write zeroes read no split ...passed 00:08:39.028 Test: blockdev write zeroes read split ...passed 00:08:39.028 Test: blockdev write zeroes read split partial ...passed 00:08:39.028 Test: blockdev reset ...passed 00:08:39.028 Test: blockdev write read 8 blocks ...passed 00:08:39.028 Test: blockdev write read size > 128k ...passed 00:08:39.028 Test: blockdev write read invalid size ...passed 00:08:39.028 Test: blockdev write read offset + nbytes == size of 
blockdev ...passed 00:08:39.028 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:39.028 Test: blockdev write read max offset ...passed 00:08:39.028 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:39.028 Test: blockdev writev readv 8 blocks ...passed 00:08:39.028 Test: blockdev writev readv 30 x 1block ...passed 00:08:39.028 Test: blockdev writev readv block ...passed 00:08:39.028 Test: blockdev writev readv size > 128k ...passed 00:08:39.028 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:39.028 Test: blockdev comparev and writev ...passed 00:08:39.028 Test: blockdev nvme passthru rw ...passed 00:08:39.028 Test: blockdev nvme passthru vendor specific ...passed 00:08:39.028 Test: blockdev nvme admin passthru ...passed 00:08:39.028 Test: blockdev copy ...passed 00:08:39.028 Suite: bdevio tests on: Malloc2p2 00:08:39.028 Test: blockdev write read block ...passed 00:08:39.028 Test: blockdev write zeroes read block ...passed 00:08:39.028 Test: blockdev write zeroes read no split ...passed 00:08:39.028 Test: blockdev write zeroes read split ...passed 00:08:39.028 Test: blockdev write zeroes read split partial ...passed 00:08:39.028 Test: blockdev reset ...passed 00:08:39.028 Test: blockdev write read 8 blocks ...passed 00:08:39.028 Test: blockdev write read size > 128k ...passed 00:08:39.028 Test: blockdev write read invalid size ...passed 00:08:39.028 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:39.028 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:39.028 Test: blockdev write read max offset ...passed 00:08:39.028 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:39.028 Test: blockdev writev readv 8 blocks ...passed 00:08:39.028 Test: blockdev writev readv 30 x 1block ...passed 00:08:39.028 Test: blockdev writev readv block ...passed 00:08:39.028 Test: blockdev writev readv size > 128k ...passed 
00:08:39.028 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:39.028 Test: blockdev comparev and writev ...passed 00:08:39.028 Test: blockdev nvme passthru rw ...passed 00:08:39.028 Test: blockdev nvme passthru vendor specific ...passed 00:08:39.028 Test: blockdev nvme admin passthru ...passed 00:08:39.028 Test: blockdev copy ...passed 00:08:39.028 Suite: bdevio tests on: Malloc2p1 00:08:39.028 Test: blockdev write read block ...passed 00:08:39.028 Test: blockdev write zeroes read block ...passed 00:08:39.028 Test: blockdev write zeroes read no split ...passed 00:08:39.028 Test: blockdev write zeroes read split ...passed 00:08:39.028 Test: blockdev write zeroes read split partial ...passed 00:08:39.028 Test: blockdev reset ...passed 00:08:39.028 Test: blockdev write read 8 blocks ...passed 00:08:39.028 Test: blockdev write read size > 128k ...passed 00:08:39.028 Test: blockdev write read invalid size ...passed 00:08:39.028 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:39.028 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:39.028 Test: blockdev write read max offset ...passed 00:08:39.028 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:39.028 Test: blockdev writev readv 8 blocks ...passed 00:08:39.028 Test: blockdev writev readv 30 x 1block ...passed 00:08:39.028 Test: blockdev writev readv block ...passed 00:08:39.028 Test: blockdev writev readv size > 128k ...passed 00:08:39.028 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:39.028 Test: blockdev comparev and writev ...passed 00:08:39.028 Test: blockdev nvme passthru rw ...passed 00:08:39.028 Test: blockdev nvme passthru vendor specific ...passed 00:08:39.028 Test: blockdev nvme admin passthru ...passed 00:08:39.028 Test: blockdev copy ...passed 00:08:39.028 Suite: bdevio tests on: Malloc2p0 00:08:39.028 Test: blockdev write read block ...passed 00:08:39.028 Test: blockdev write 
zeroes read block ...passed 00:08:39.028 Test: blockdev write zeroes read no split ...passed 00:08:39.028 Test: blockdev write zeroes read split ...passed 00:08:39.028 Test: blockdev write zeroes read split partial ...passed 00:08:39.028 Test: blockdev reset ...passed 00:08:39.028 Test: blockdev write read 8 blocks ...passed 00:08:39.028 Test: blockdev write read size > 128k ...passed 00:08:39.028 Test: blockdev write read invalid size ...passed 00:08:39.028 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:39.028 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:39.028 Test: blockdev write read max offset ...passed 00:08:39.028 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:39.028 Test: blockdev writev readv 8 blocks ...passed 00:08:39.028 Test: blockdev writev readv 30 x 1block ...passed 00:08:39.028 Test: blockdev writev readv block ...passed 00:08:39.028 Test: blockdev writev readv size > 128k ...passed 00:08:39.028 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:39.028 Test: blockdev comparev and writev ...passed 00:08:39.028 Test: blockdev nvme passthru rw ...passed 00:08:39.028 Test: blockdev nvme passthru vendor specific ...passed 00:08:39.028 Test: blockdev nvme admin passthru ...passed 00:08:39.028 Test: blockdev copy ...passed 00:08:39.028 Suite: bdevio tests on: Malloc1p1 00:08:39.028 Test: blockdev write read block ...passed 00:08:39.028 Test: blockdev write zeroes read block ...passed 00:08:39.028 Test: blockdev write zeroes read no split ...passed 00:08:39.028 Test: blockdev write zeroes read split ...passed 00:08:39.028 Test: blockdev write zeroes read split partial ...passed 00:08:39.028 Test: blockdev reset ...passed 00:08:39.028 Test: blockdev write read 8 blocks ...passed 00:08:39.028 Test: blockdev write read size > 128k ...passed 00:08:39.028 Test: blockdev write read invalid size ...passed 00:08:39.028 Test: blockdev write read offset + 
nbytes == size of blockdev ...passed 00:08:39.028 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:39.028 Test: blockdev write read max offset ...passed 00:08:39.028 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:39.028 Test: blockdev writev readv 8 blocks ...passed 00:08:39.028 Test: blockdev writev readv 30 x 1block ...passed 00:08:39.028 Test: blockdev writev readv block ...passed 00:08:39.028 Test: blockdev writev readv size > 128k ...passed 00:08:39.028 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:39.028 Test: blockdev comparev and writev ...passed 00:08:39.028 Test: blockdev nvme passthru rw ...passed 00:08:39.028 Test: blockdev nvme passthru vendor specific ...passed 00:08:39.028 Test: blockdev nvme admin passthru ...passed 00:08:39.028 Test: blockdev copy ...passed 00:08:39.028 Suite: bdevio tests on: Malloc1p0 00:08:39.028 Test: blockdev write read block ...passed 00:08:39.028 Test: blockdev write zeroes read block ...passed 00:08:39.028 Test: blockdev write zeroes read no split ...passed 00:08:39.028 Test: blockdev write zeroes read split ...passed 00:08:39.028 Test: blockdev write zeroes read split partial ...passed 00:08:39.028 Test: blockdev reset ...passed 00:08:39.028 Test: blockdev write read 8 blocks ...passed 00:08:39.029 Test: blockdev write read size > 128k ...passed 00:08:39.029 Test: blockdev write read invalid size ...passed 00:08:39.029 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:39.029 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:39.029 Test: blockdev write read max offset ...passed 00:08:39.029 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:39.029 Test: blockdev writev readv 8 blocks ...passed 00:08:39.029 Test: blockdev writev readv 30 x 1block ...passed 00:08:39.029 Test: blockdev writev readv block ...passed 00:08:39.029 Test: blockdev writev readv size > 
128k ...passed 00:08:39.029 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:39.029 Test: blockdev comparev and writev ...passed 00:08:39.029 Test: blockdev nvme passthru rw ...passed 00:08:39.029 Test: blockdev nvme passthru vendor specific ...passed 00:08:39.029 Test: blockdev nvme admin passthru ...passed 00:08:39.029 Test: blockdev copy ...passed 00:08:39.029 Suite: bdevio tests on: Malloc0 00:08:39.029 Test: blockdev write read block ...passed 00:08:39.029 Test: blockdev write zeroes read block ...passed 00:08:39.029 Test: blockdev write zeroes read no split ...passed 00:08:39.029 Test: blockdev write zeroes read split ...passed 00:08:39.029 Test: blockdev write zeroes read split partial ...passed 00:08:39.029 Test: blockdev reset ...passed 00:08:39.029 Test: blockdev write read 8 blocks ...passed 00:08:39.029 Test: blockdev write read size > 128k ...passed 00:08:39.029 Test: blockdev write read invalid size ...passed 00:08:39.029 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:39.029 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:39.029 Test: blockdev write read max offset ...passed 00:08:39.029 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:39.029 Test: blockdev writev readv 8 blocks ...passed 00:08:39.029 Test: blockdev writev readv 30 x 1block ...passed 00:08:39.029 Test: blockdev writev readv block ...passed 00:08:39.029 Test: blockdev writev readv size > 128k ...passed 00:08:39.029 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:39.029 Test: blockdev comparev and writev ...passed 00:08:39.029 Test: blockdev nvme passthru rw ...passed 00:08:39.029 Test: blockdev nvme passthru vendor specific ...passed 00:08:39.029 Test: blockdev nvme admin passthru ...passed 00:08:39.029 Test: blockdev copy ...passed 00:08:39.029 00:08:39.029 Run Summary: Type Total Ran Passed Failed Inactive 00:08:39.029 suites 16 16 n/a 0 0 00:08:39.029 
tests 368 368 368 0 0 00:08:39.029 asserts 2224 2224 2224 0 n/a 00:08:39.029 00:08:39.029 Elapsed time = 0.664 seconds 00:08:39.029 0 00:08:39.029 00:03:25 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 3469053 00:08:39.029 00:03:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 3469053 ']' 00:08:39.029 00:03:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 3469053 00:08:39.029 00:03:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:08:39.029 00:03:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:39.029 00:03:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3469053 00:08:39.029 00:03:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:39.029 00:03:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:39.029 00:03:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3469053' 00:08:39.029 killing process with pid 3469053 00:08:39.029 00:03:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 3469053 00:08:39.029 00:03:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 3469053 00:08:39.287 00:03:26 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:08:39.287 00:08:39.287 real 0m1.820s 00:08:39.287 user 0m4.235s 00:08:39.287 sys 0m0.576s 00:08:39.287 00:03:26 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:39.287 00:03:26 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:39.287 ************************************ 00:08:39.287 END TEST bdev_bounds 00:08:39.287 ************************************ 00:08:39.545 00:03:26 blockdev_general -- common/autotest_common.sh@1142 -- # return 
0 00:08:39.545 00:03:26 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:39.545 00:03:26 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:08:39.545 00:03:26 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:39.545 00:03:26 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:39.545 ************************************ 00:08:39.545 START TEST bdev_nbd 00:08:39.545 ************************************ 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=3469266 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 3469266 /var/tmp/spdk-nbd.sock 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 3469266 ']' 00:08:39.545 00:03:26 
blockdev_general.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:39.545 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:39.545 00:03:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:08:39.545 [2024-07-16 00:03:26.370797] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:08:39.545 [2024-07-16 00:03:26.370869] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:39.802 [2024-07-16 00:03:26.501991] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.802 [2024-07-16 00:03:26.604328] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:40.060 [2024-07-16 00:03:26.753972] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:40.060 [2024-07-16 00:03:26.754029] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:40.060 [2024-07-16 00:03:26.754045] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:40.060 [2024-07-16 00:03:26.761978] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:40.060 [2024-07-16 00:03:26.762006] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:40.060 [2024-07-16 00:03:26.769987] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:40.060 [2024-07-16 00:03:26.770012] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:40.060 [2024-07-16 00:03:26.842246] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:40.060 [2024-07-16 00:03:26.842296] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:40.060 [2024-07-16 00:03:26.842313] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x126aa40 00:08:40.060 [2024-07-16 00:03:26.842325] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:40.060 [2024-07-16 00:03:26.843762] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:40.060 [2024-07-16 00:03:26.843793] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:40.319 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:40.319 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:08:40.319 00:03:27 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:40.319 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:40.319 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:40.319 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:08:40.319 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:40.319 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:40.319 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:40.319 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:08:40.319 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:08:40.319 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:08:40.319 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:08:40.319 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:40.319 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:08:40.577 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:08:40.577 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:08:40.577 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:08:40.577 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:40.577 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:40.577 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:40.577 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:40.577 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 
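
The `grep -q -w nbd0 /proc/partitions` call above is the core of the readiness check this trace repeats for every nbd device: poll until the kernel lists the device in `/proc/partitions` (up to 20 tries), then confirm it is actually readable with a single 4 KiB `O_DIRECT` read and a size check on the scratch file. A minimal standalone sketch of that pattern follows — it is not SPDK's actual `waitfornbd` from `autotest_common.sh`; `PARTITIONS_FILE` and `MAX_TRIES` are knobs added here for illustration (the real helper reads `/proc/partitions` directly and hard-codes 20 retries):

```shell
# Sketch of the waitfornbd pattern seen in the trace (bash).
# PARTITIONS_FILE / MAX_TRIES are illustrative overrides, not SPDK options.
PARTITIONS_FILE=${PARTITIONS_FILE:-/proc/partitions}
MAX_TRIES=${MAX_TRIES:-20}

waitfornbd() {
    local nbd_name=$1 i

    # Poll until the kernel exposes the device (roughly MAX_TRIES seconds).
    for ((i = 1; i <= MAX_TRIES; i++)); do
        grep -q -w "$nbd_name" "$PARTITIONS_FILE" && break
        sleep 1
    done
    ((i <= MAX_TRIES)) || return 1

    # The trace then reads one 4096-byte block with O_DIRECT and checks
    # that the scratch file is non-empty before declaring the device ready.
    if [ -b "/dev/$nbd_name" ]; then
        local scratch size
        scratch=$(mktemp)
        dd if="/dev/$nbd_name" of="$scratch" bs=4096 count=1 iflag=direct \
            2>/dev/null || { rm -f "$scratch"; return 1; }
        size=$(stat -c %s "$scratch")
        rm -f "$scratch"
        [ "$size" != 0 ] || return 1
    fi
    return 0
}
```

Parameterizing the partitions file keeps the sketch exercisable on machines without nbd support; the direct-I/O read matters because it bypasses the page cache and so proves the block device path, not just the device node, is functional.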
00:08:40.577 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:40.577 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:40.577 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:40.577 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:40.577 1+0 records in 00:08:40.577 1+0 records out 00:08:40.577 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0002657 s, 15.4 MB/s 00:08:40.577 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:40.577 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:40.577 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:40.836 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:40.836 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:40.836 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:40.836 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:40.836 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd1 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:41.095 1+0 records in 00:08:41.095 1+0 records out 00:08:41.095 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269805 s, 15.2 MB/s 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:41.095 00:03:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:08:41.354 00:03:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:08:41.354 00:03:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:08:41.354 00:03:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:08:41.354 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:08:41.354 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:41.354 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:41.354 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:41.354 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:08:41.355 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:41.355 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:41.355 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:41.355 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:41.355 1+0 records in 00:08:41.355 1+0 records out 00:08:41.355 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000305519 s, 13.4 MB/s 00:08:41.355 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:41.355 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:41.355 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:41.355 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 
00:08:41.355 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:41.355 00:03:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:41.355 00:03:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:41.355 00:03:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:41.614 1+0 records in 00:08:41.614 1+0 records out 00:08:41.614 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000328746 s, 12.5 MB/s 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:41.614 00:03:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:41.874 1+0 records in 00:08:41.874 1+0 records out 00:08:41.874 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000433869 s, 9.4 MB/s 00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:41.874 00:03:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i 
<= 20 )) 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:42.444 1+0 records in 00:08:42.444 1+0 records out 00:08:42.444 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000427174 s, 9.6 MB/s 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:42.444 00:03:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:08:42.704 00:03:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:08:42.704 00:03:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:08:42.704 00:03:29 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:08:42.704 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:08:42.704 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:42.704 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:42.704 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:42.704 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:08:42.704 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:42.704 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:42.704 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:42.704 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:42.704 1+0 records in 00:08:42.704 1+0 records out 00:08:42.704 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000494881 s, 8.3 MB/s 00:08:42.704 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:42.704 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:42.704 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:42.704 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:42.704 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:42.704 00:03:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:42.704 00:03:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:42.704 00:03:29 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:42.963 1+0 records in 00:08:42.963 1+0 records out 00:08:42.963 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000484699 s, 8.5 MB/s 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:42.963 00:03:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:43.223 1+0 records in 00:08:43.223 1+0 records out 
00:08:43.223 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000379968 s, 10.8 MB/s 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:43.223 00:03:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:08:43.483 00:03:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:08:43.483 00:03:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:08:43.483 00:03:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:08:43.483 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:08:43.483 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:43.483 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:43.483 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:43.483 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:08:43.483 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:43.483 00:03:30 
blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:43.483 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:43.483 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:43.483 1+0 records in 00:08:43.483 1+0 records out 00:08:43.483 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000467569 s, 8.8 MB/s 00:08:43.483 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.483 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:43.483 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.483 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:43.483 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:43.483 00:03:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:43.483 00:03:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:43.483 00:03:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 
00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:43.743 1+0 records in 00:08:43.743 1+0 records out 00:08:43.743 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000701742 s, 5.8 MB/s 00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:43.743 00:03:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:08:44.312 00:03:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 
-- # nbd_device=/dev/nbd11 00:08:44.312 00:03:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:08:44.312 00:03:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:08:44.312 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:08:44.312 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:44.312 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:44.312 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:44.312 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:08:44.312 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:44.312 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:44.312 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:44.312 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:44.312 1+0 records in 00:08:44.312 1+0 records out 00:08:44.312 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000526727 s, 7.8 MB/s 00:08:44.312 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.312 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:44.312 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.312 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:44.312 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:44.312 
00:03:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:44.312 00:03:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:44.312 00:03:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:08:44.571 00:03:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:08:44.571 00:03:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:08:44.571 00:03:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:08:44.571 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:08:44.571 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:44.571 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:44.571 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:44.571 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:08:44.571 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:44.571 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:44.571 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:44.571 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:44.571 1+0 records in 00:08:44.571 1+0 records out 00:08:44.571 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000568549 s, 7.2 MB/s 00:08:44.572 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.572 00:03:31 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@884 -- # size=4096 00:08:44.572 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.572 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:44.572 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:44.572 00:03:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:44.572 00:03:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:44.572 00:03:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:08:44.830 00:03:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:08:44.830 00:03:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:08:44.830 00:03:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:08:44.830 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:08:44.830 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:44.830 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:44.830 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:44.830 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:08:44.830 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:44.830 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:44.830 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:44.830 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:44.830 1+0 records in 00:08:44.830 1+0 records out 00:08:44.830 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000696705 s, 5.9 MB/s 00:08:44.830 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.831 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:44.831 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.831 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:44.831 00:03:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:44.831 00:03:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:44.831 00:03:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:44.831 00:03:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:08:45.089 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:08:45.089 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:08:45.089 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:08:45.089 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:08:45.089 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:45.089 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:45.089 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:45.089 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q 
-w nbd14 /proc/partitions 00:08:45.089 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:45.089 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:45.089 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:45.089 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:45.089 1+0 records in 00:08:45.089 1+0 records out 00:08:45.089 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000565355 s, 7.2 MB/s 00:08:45.089 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.352 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:45.352 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.352 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:45.352 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:45.352 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:45.352 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:45.352 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:08:45.352 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:08:45.611 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:08:45.611 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:08:45.611 00:03:32 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:08:45.611 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:45.611 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:45.611 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:45.611 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:08:45.611 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:45.611 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:45.611 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:45.611 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:45.611 1+0 records in 00:08:45.611 1+0 records out 00:08:45.611 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000851688 s, 4.8 MB/s 00:08:45.611 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.611 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:45.611 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.611 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:45.611 00:03:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:45.611 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:45.611 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:45.611 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:45.871 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:08:45.871 { 00:08:45.871 "nbd_device": "/dev/nbd0", 00:08:45.871 "bdev_name": "Malloc0" 00:08:45.871 }, 00:08:45.871 { 00:08:45.871 "nbd_device": "/dev/nbd1", 00:08:45.871 "bdev_name": "Malloc1p0" 00:08:45.871 }, 00:08:45.871 { 00:08:45.871 "nbd_device": "/dev/nbd2", 00:08:45.871 "bdev_name": "Malloc1p1" 00:08:45.871 }, 00:08:45.871 { 00:08:45.871 "nbd_device": "/dev/nbd3", 00:08:45.871 "bdev_name": "Malloc2p0" 00:08:45.871 }, 00:08:45.871 { 00:08:45.871 "nbd_device": "/dev/nbd4", 00:08:45.871 "bdev_name": "Malloc2p1" 00:08:45.871 }, 00:08:45.871 { 00:08:45.871 "nbd_device": "/dev/nbd5", 00:08:45.871 "bdev_name": "Malloc2p2" 00:08:45.871 }, 00:08:45.871 { 00:08:45.871 "nbd_device": "/dev/nbd6", 00:08:45.871 "bdev_name": "Malloc2p3" 00:08:45.871 }, 00:08:45.871 { 00:08:45.871 "nbd_device": "/dev/nbd7", 00:08:45.871 "bdev_name": "Malloc2p4" 00:08:45.871 }, 00:08:45.871 { 00:08:45.871 "nbd_device": "/dev/nbd8", 00:08:45.871 "bdev_name": "Malloc2p5" 00:08:45.871 }, 00:08:45.871 { 00:08:45.871 "nbd_device": "/dev/nbd9", 00:08:45.871 "bdev_name": "Malloc2p6" 00:08:45.871 }, 00:08:45.871 { 00:08:45.871 "nbd_device": "/dev/nbd10", 00:08:45.871 "bdev_name": "Malloc2p7" 00:08:45.871 }, 00:08:45.871 { 00:08:45.871 "nbd_device": "/dev/nbd11", 00:08:45.871 "bdev_name": "TestPT" 00:08:45.871 }, 00:08:45.871 { 00:08:45.871 "nbd_device": "/dev/nbd12", 00:08:45.871 "bdev_name": "raid0" 00:08:45.871 }, 00:08:45.871 { 00:08:45.871 "nbd_device": "/dev/nbd13", 00:08:45.871 "bdev_name": "concat0" 00:08:45.871 }, 00:08:45.871 { 00:08:45.871 "nbd_device": "/dev/nbd14", 00:08:45.871 "bdev_name": "raid1" 00:08:45.871 }, 00:08:45.871 { 00:08:45.871 "nbd_device": "/dev/nbd15", 00:08:45.871 "bdev_name": "AIO0" 00:08:45.871 } 00:08:45.871 ]' 00:08:45.871 00:03:32 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:08:45.871 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:08:45.871 { 00:08:45.871 "nbd_device": "/dev/nbd0", 00:08:45.871 "bdev_name": "Malloc0" 00:08:45.871 }, 00:08:45.871 { 00:08:45.871 "nbd_device": "/dev/nbd1", 00:08:45.871 "bdev_name": "Malloc1p0" 00:08:45.871 }, 00:08:45.871 { 00:08:45.872 "nbd_device": "/dev/nbd2", 00:08:45.872 "bdev_name": "Malloc1p1" 00:08:45.872 }, 00:08:45.872 { 00:08:45.872 "nbd_device": "/dev/nbd3", 00:08:45.872 "bdev_name": "Malloc2p0" 00:08:45.872 }, 00:08:45.872 { 00:08:45.872 "nbd_device": "/dev/nbd4", 00:08:45.872 "bdev_name": "Malloc2p1" 00:08:45.872 }, 00:08:45.872 { 00:08:45.872 "nbd_device": "/dev/nbd5", 00:08:45.872 "bdev_name": "Malloc2p2" 00:08:45.872 }, 00:08:45.872 { 00:08:45.872 "nbd_device": "/dev/nbd6", 00:08:45.872 "bdev_name": "Malloc2p3" 00:08:45.872 }, 00:08:45.872 { 00:08:45.872 "nbd_device": "/dev/nbd7", 00:08:45.872 "bdev_name": "Malloc2p4" 00:08:45.872 }, 00:08:45.872 { 00:08:45.872 "nbd_device": "/dev/nbd8", 00:08:45.872 "bdev_name": "Malloc2p5" 00:08:45.872 }, 00:08:45.872 { 00:08:45.872 "nbd_device": "/dev/nbd9", 00:08:45.872 "bdev_name": "Malloc2p6" 00:08:45.872 }, 00:08:45.872 { 00:08:45.872 "nbd_device": "/dev/nbd10", 00:08:45.872 "bdev_name": "Malloc2p7" 00:08:45.872 }, 00:08:45.872 { 00:08:45.872 "nbd_device": "/dev/nbd11", 00:08:45.872 "bdev_name": "TestPT" 00:08:45.872 }, 00:08:45.872 { 00:08:45.872 "nbd_device": "/dev/nbd12", 00:08:45.872 "bdev_name": "raid0" 00:08:45.872 }, 00:08:45.872 { 00:08:45.872 "nbd_device": "/dev/nbd13", 00:08:45.872 "bdev_name": "concat0" 00:08:45.872 }, 00:08:45.872 { 00:08:45.872 "nbd_device": "/dev/nbd14", 00:08:45.872 "bdev_name": "raid1" 00:08:45.872 }, 00:08:45.872 { 00:08:45.872 "nbd_device": "/dev/nbd15", 00:08:45.872 "bdev_name": "AIO0" 00:08:45.872 } 00:08:45.872 ]' 00:08:45.872 00:03:32 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:08:45.872 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:08:45.872 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:45.872 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:08:45.872 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:45.872 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:45.872 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:45.872 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:46.131 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:46.131 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:46.131 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:46.131 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:46.131 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:46.131 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:46.131 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:46.131 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:46.131 00:03:32 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:46.131 00:03:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:46.391 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:46.391 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:46.391 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:46.391 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:46.391 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:46.391 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:46.391 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:46.391 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:46.391 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:46.391 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:08:46.649 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:08:46.649 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:08:46.649 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:08:46.649 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:46.649 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:46.649 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:08:46.649 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 
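The `nbd_stop_disk` entries above are each followed by a `waitfornbd_exit` trace (`nbd_common.sh@35-45`), the inverse of the attach poll: loop until the device name drops out of the partition table, then return 0 unconditionally. A hedged re-creation, again parameterized on the partitions file so it runs standalone:

```shell
#!/usr/bin/env bash
# Sketch of the waitfornbd_exit pattern: poll until the device is gone,
# up to 20 attempts, always returning 0 (mirrors nbd_common.sh@45).
waitfornbd_exit() {
    local nbd_name=$1 partitions=$2 i
    for ((i = 1; i <= 20; i++)); do
        if ! grep -q -w "$nbd_name" "$partitions"; then
            break          # device detached; mirrors nbd_common.sh@41
        fi
        sleep 0.1
    done
    return 0
}

# Simulate the kernel detaching nbd3 a moment after the stop request.
fake=$(mktemp)
echo 'nbd3' > "$fake"
( sleep 0.3; : > "$fake" ) &
waitfornbd_exit nbd3 "$fake" && echo "nbd3 detached"
wait
rm -f "$fake"
```

Returning 0 even on timeout matches the trace, where every `waitfornbd_exit` ends in `return 0`; failure is left for later checks (the final disk-count verification) to catch.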
00:08:46.649 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:46.649 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:46.649 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:08:46.907 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:08:46.907 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:08:46.907 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:08:46.907 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:46.907 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:46.907 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:08:46.907 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:46.907 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:46.907 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:46.908 00:03:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:08:47.166 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:08:47.166 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:08:47.166 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:08:47.166 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:47.166 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:47.166 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w 
nbd4 /proc/partitions 00:08:47.166 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:47.166 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:47.166 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:47.166 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:08:47.430 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:08:47.430 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:08:47.430 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:08:47.430 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:47.430 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:47.430 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:08:47.430 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:47.430 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:47.430 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:47.430 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:08:47.757 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:08:47.757 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:08:47.757 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:08:47.757 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:47.757 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 
-- # (( i <= 20 )) 00:08:47.757 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:08:47.757 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:47.757 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:47.757 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:47.757 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:08:48.016 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:08:48.016 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:08:48.016 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:08:48.016 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:48.016 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:48.016 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:08:48.016 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:48.016 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:48.016 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:48.016 00:03:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:08:48.275 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:08:48.275 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:08:48.275 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:08:48.275 00:03:35 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:48.275 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:48.275 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:08:48.275 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:48.275 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:48.275 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:48.275 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:08:48.534 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:08:48.534 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:08:48.534 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:08:48.534 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:48.534 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:48.534 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:08:48.534 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:48.534 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:48.534 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:48.534 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:08:48.793 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:08:48.793 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:08:48.793 00:03:35 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:08:48.793 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:48.793 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:48.793 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:08:48.793 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:48.793 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:48.793 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:48.793 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:08:49.051 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:08:49.051 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:08:49.051 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:08:49.051 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:49.051 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:49.051 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:08:49.051 00:03:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:49.051 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:49.051 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:49.051 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:08:49.619 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 
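Each successful attach earlier in the log is confirmed by a read-back check (`autotest_common.sh@883-887`): `dd` one 4 KiB block from the device into a scratch file, `stat` its size, delete it, and require the size to be non-zero. A sketch of that check, with a plain temp file standing in for `/dev/nbdX` and `iflag=direct` dropped because O_DIRECT requires a block device or aligned I/O:

```shell
#!/usr/bin/env bash
set -e
# Read-back verification sketch: copy one 4096-byte block, confirm the
# destination is non-empty, then clean up. stat -c %s is GNU coreutils
# syntax, as used in the trace.
src=$(mktemp) dst=$(mktemp)
dd if=/dev/zero of="$src" bs=4096 count=1 status=none   # 4 KiB stand-in device

dd if="$src" of="$dst" bs=4096 count=1 status=none
size=$(stat -c %s "$dst")
rm -f "$dst" "$src"
[ "$size" != 0 ] && echo "read-back OK: $size bytes"
```

The non-zero size is what lets the helper `return 0` in the trace (`'[' 4096 '!=' 0 ']'`); a failed or empty read would leave the loop retrying the next attempt.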
00:08:49.619 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:08:49.619 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:08:49.619 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:49.619 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:49.619 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:08:49.619 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:49.619 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:49.619 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:49.619 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:08:49.619 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:08:49.620 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:08:49.620 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:08:49.620 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:49.620 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:49.620 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:08:49.620 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:49.620 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:49.620 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:49.620 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd14 00:08:49.879 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:08:50.138 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:08:50.138 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:08:50.138 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:50.138 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:50.138 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:08:50.138 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:50.138 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:50.138 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:50.138 00:03:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:08:50.397 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:08:50.397 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:08:50.397 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:08:50.397 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:50.397 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:50.397 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:08:50.397 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:50.397 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:50.398 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:50.398 00:03:37 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:50.398 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # 
bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@12 -- # local i 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:50.657 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:50.917 /dev/nbd0 00:08:50.917 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:50.917 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:50.917 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:50.917 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:50.917 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:50.917 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:50.917 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:50.917 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:50.917 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:50.917 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:50.917 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:50.917 1+0 records in 00:08:50.917 1+0 records out 00:08:50.917 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263019 s, 15.6 MB/s 00:08:50.917 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:50.917 00:03:37 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:08:50.917 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:50.917 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:50.917 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:50.917 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:50.917 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:50.917 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:08:51.176 /dev/nbd1 00:08:51.176 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:51.176 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:51.176 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:51.176 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:51.176 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:51.176 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:51.177 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:51.177 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:51.177 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:51.177 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:51.177 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:08:51.177 1+0 records in 00:08:51.177 1+0 records out 00:08:51.177 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254151 s, 16.1 MB/s 00:08:51.177 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.177 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:51.177 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.177 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:51.177 00:03:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:51.177 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:51.177 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:51.177 00:03:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:08:51.436 /dev/nbd10 00:08:51.436 00:03:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:08:51.436 00:03:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:08:51.436 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:08:51.436 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:51.436 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:51.436 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:51.436 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:08:51.436 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:51.436 00:03:38 
blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:51.436 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:51.436 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:51.436 1+0 records in 00:08:51.436 1+0 records out 00:08:51.436 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000340345 s, 12.0 MB/s 00:08:51.436 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.436 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:51.436 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.436 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:51.436 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:51.436 00:03:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:51.436 00:03:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:51.436 00:03:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:08:51.696 /dev/nbd11 00:08:51.696 00:03:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:08:51.696 00:03:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:08:51.696 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:08:51.696 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:51.696 00:03:38 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:51.696 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:51.696 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:08:51.696 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:51.696 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:51.696 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:51.696 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:51.696 1+0 records in 00:08:51.696 1+0 records out 00:08:51.696 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00039003 s, 10.5 MB/s 00:08:51.696 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.696 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:51.696 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.696 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:51.696 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:51.696 00:03:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:51.696 00:03:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:51.696 00:03:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:08:51.956 /dev/nbd12 00:08:51.956 00:03:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename 
/dev/nbd12 00:08:51.956 00:03:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:08:51.956 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:08:51.956 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:51.956 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:51.956 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:51.956 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:08:51.956 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:51.956 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:51.956 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:51.956 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:51.956 1+0 records in 00:08:51.956 1+0 records out 00:08:51.956 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000407718 s, 10.0 MB/s 00:08:51.956 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.956 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:51.956 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:51.956 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:51.956 00:03:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:51.956 00:03:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:51.956 00:03:38 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:51.956 00:03:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:08:52.215 /dev/nbd13 00:08:52.215 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:08:52.215 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:08:52.215 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:08:52.215 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:52.215 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:52.215 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:52.215 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:08:52.215 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:52.215 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:52.215 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:52.215 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:52.215 1+0 records in 00:08:52.215 1+0 records out 00:08:52.215 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000426255 s, 9.6 MB/s 00:08:52.215 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:52.215 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:52.215 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:52.215 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:52.215 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:52.215 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:52.215 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:52.215 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:08:52.474 /dev/nbd14 00:08:52.474 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:08:52.474 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:08:52.474 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:08:52.474 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:52.474 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:52.474 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:52.474 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:08:52.474 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:52.474 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:52.474 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:52.474 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:52.474 1+0 records in 00:08:52.474 1+0 records out 00:08:52.474 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000380679 s, 
10.8 MB/s 00:08:52.474 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:52.474 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:52.474 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:52.474 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:52.474 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:52.474 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:52.474 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:52.474 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:08:52.733 /dev/nbd15 00:08:52.733 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:08:52.733 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:08:52.733 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:08:52.733 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:52.733 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:52.733 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:52.733 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:08:52.733 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:52.733 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:52.733 00:03:39 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:52.733 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:52.733 1+0 records in 00:08:52.733 1+0 records out 00:08:52.733 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000448879 s, 9.1 MB/s 00:08:52.733 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:52.733 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:52.733 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:52.733 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:52.733 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:52.734 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:52.734 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:52.734 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:08:52.991 /dev/nbd2 00:08:52.991 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:08:52.991 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:08:52.991 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:08:52.991 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:52.991 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:52.991 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 
00:08:52.991 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:08:52.991 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:52.991 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:52.991 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:52.991 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:52.991 1+0 records in 00:08:52.991 1+0 records out 00:08:52.991 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000493835 s, 8.3 MB/s 00:08:52.991 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:52.991 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:52.991 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:52.991 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:52.991 00:03:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:52.991 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:52.991 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:52.991 00:03:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:08:53.249 /dev/nbd3 00:08:53.249 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:08:53.249 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:08:53.249 00:03:40 
blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:08:53.249 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:53.249 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:53.249 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:53.249 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:08:53.249 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:53.249 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:53.249 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:53.249 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:53.249 1+0 records in 00:08:53.249 1+0 records out 00:08:53.249 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00062556 s, 6.5 MB/s 00:08:53.249 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:53.249 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:53.249 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:53.249 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:53.249 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:53.249 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:53.249 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:53.249 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:08:53.507 /dev/nbd4 00:08:53.507 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:08:53.507 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:08:53.507 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:08:53.507 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:53.507 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:53.507 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:53.507 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:08:53.507 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:53.507 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:53.507 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:53.507 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:53.507 1+0 records in 00:08:53.507 1+0 records out 00:08:53.507 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000677054 s, 6.0 MB/s 00:08:53.507 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:53.507 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:53.507 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:53.507 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 
00:08:53.507 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:53.507 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:53.507 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:53.507 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:08:53.766 /dev/nbd5 00:08:53.766 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:08:53.766 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:08:53.766 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:08:53.766 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:53.766 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:53.766 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:53.766 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:08:53.766 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:53.766 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:53.766 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:53.766 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:53.766 1+0 records in 00:08:53.766 1+0 records out 00:08:53.766 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000508462 s, 8.1 MB/s 00:08:53.766 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:08:53.766 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:53.766 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:53.766 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:53.766 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:53.766 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:53.766 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:53.766 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:08:54.024 /dev/nbd6 00:08:54.024 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:08:54.024 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:08:54.024 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:08:54.024 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:54.024 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:54.024 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:54.024 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:08:54.024 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:54.024 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:54.024 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:54.024 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:54.024 1+0 records in 00:08:54.024 1+0 records out 00:08:54.024 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000725476 s, 5.6 MB/s 00:08:54.024 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:54.024 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:54.024 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:54.024 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:54.024 00:03:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:54.024 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:54.024 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:54.024 00:03:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:08:54.282 /dev/nbd7 00:08:54.282 00:03:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:08:54.283 00:03:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:08:54.283 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:08:54.283 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:54.283 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:54.283 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:54.283 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:08:54.283 00:03:41 
blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:54.283 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:54.283 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:54.283 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:54.283 1+0 records in 00:08:54.283 1+0 records out 00:08:54.283 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000765764 s, 5.3 MB/s 00:08:54.283 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:54.283 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:54.283 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:54.542 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:54.542 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:54.542 00:03:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:54.542 00:03:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:54.542 00:03:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:08:54.542 /dev/nbd8 00:08:54.800 00:03:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:08:54.800 00:03:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:08:54.800 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:08:54.800 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # 
local i 00:08:54.800 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:54.800 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:54.800 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:08:54.800 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:54.800 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:54.800 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:54.800 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:54.800 1+0 records in 00:08:54.800 1+0 records out 00:08:54.800 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000608747 s, 6.7 MB/s 00:08:54.800 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:54.800 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:54.800 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:54.800 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:54.800 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:54.800 00:03:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:54.800 00:03:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:54.800 00:03:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:08:55.059 /dev/nbd9 00:08:55.059 00:03:41 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:55.059 1+0 records in 00:08:55.059 1+0 records out 00:08:55.059 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000790258 s, 5.2 MB/s 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 
-- # (( i++ )) 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:55.059 00:03:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:55.059 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:55.059 { 00:08:55.059 "nbd_device": "/dev/nbd0", 00:08:55.059 "bdev_name": "Malloc0" 00:08:55.059 }, 00:08:55.059 { 00:08:55.059 "nbd_device": "/dev/nbd1", 00:08:55.059 "bdev_name": "Malloc1p0" 00:08:55.059 }, 00:08:55.059 { 00:08:55.059 "nbd_device": "/dev/nbd10", 00:08:55.059 "bdev_name": "Malloc1p1" 00:08:55.059 }, 00:08:55.059 { 00:08:55.059 "nbd_device": "/dev/nbd11", 00:08:55.059 "bdev_name": "Malloc2p0" 00:08:55.059 }, 00:08:55.059 { 00:08:55.059 "nbd_device": "/dev/nbd12", 00:08:55.059 "bdev_name": "Malloc2p1" 00:08:55.059 }, 00:08:55.059 { 00:08:55.059 "nbd_device": "/dev/nbd13", 00:08:55.059 "bdev_name": "Malloc2p2" 00:08:55.059 }, 00:08:55.059 { 00:08:55.059 "nbd_device": "/dev/nbd14", 00:08:55.059 "bdev_name": "Malloc2p3" 00:08:55.059 }, 00:08:55.059 { 00:08:55.059 "nbd_device": "/dev/nbd15", 00:08:55.059 "bdev_name": "Malloc2p4" 00:08:55.059 }, 00:08:55.059 { 00:08:55.059 "nbd_device": "/dev/nbd2", 00:08:55.059 "bdev_name": "Malloc2p5" 00:08:55.059 }, 00:08:55.059 { 00:08:55.059 "nbd_device": "/dev/nbd3", 00:08:55.059 "bdev_name": "Malloc2p6" 00:08:55.059 }, 00:08:55.059 { 00:08:55.059 "nbd_device": "/dev/nbd4", 00:08:55.059 "bdev_name": "Malloc2p7" 00:08:55.059 }, 00:08:55.059 { 00:08:55.059 "nbd_device": "/dev/nbd5", 00:08:55.059 "bdev_name": "TestPT" 00:08:55.059 }, 00:08:55.059 { 00:08:55.059 "nbd_device": "/dev/nbd6", 00:08:55.059 
"bdev_name": "raid0" 00:08:55.059 }, 00:08:55.059 { 00:08:55.059 "nbd_device": "/dev/nbd7", 00:08:55.059 "bdev_name": "concat0" 00:08:55.059 }, 00:08:55.059 { 00:08:55.059 "nbd_device": "/dev/nbd8", 00:08:55.059 "bdev_name": "raid1" 00:08:55.059 }, 00:08:55.059 { 00:08:55.059 "nbd_device": "/dev/nbd9", 00:08:55.059 "bdev_name": "AIO0" 00:08:55.059 } 00:08:55.059 ]' 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:55.319 { 00:08:55.319 "nbd_device": "/dev/nbd0", 00:08:55.319 "bdev_name": "Malloc0" 00:08:55.319 }, 00:08:55.319 { 00:08:55.319 "nbd_device": "/dev/nbd1", 00:08:55.319 "bdev_name": "Malloc1p0" 00:08:55.319 }, 00:08:55.319 { 00:08:55.319 "nbd_device": "/dev/nbd10", 00:08:55.319 "bdev_name": "Malloc1p1" 00:08:55.319 }, 00:08:55.319 { 00:08:55.319 "nbd_device": "/dev/nbd11", 00:08:55.319 "bdev_name": "Malloc2p0" 00:08:55.319 }, 00:08:55.319 { 00:08:55.319 "nbd_device": "/dev/nbd12", 00:08:55.319 "bdev_name": "Malloc2p1" 00:08:55.319 }, 00:08:55.319 { 00:08:55.319 "nbd_device": "/dev/nbd13", 00:08:55.319 "bdev_name": "Malloc2p2" 00:08:55.319 }, 00:08:55.319 { 00:08:55.319 "nbd_device": "/dev/nbd14", 00:08:55.319 "bdev_name": "Malloc2p3" 00:08:55.319 }, 00:08:55.319 { 00:08:55.319 "nbd_device": "/dev/nbd15", 00:08:55.319 "bdev_name": "Malloc2p4" 00:08:55.319 }, 00:08:55.319 { 00:08:55.319 "nbd_device": "/dev/nbd2", 00:08:55.319 "bdev_name": "Malloc2p5" 00:08:55.319 }, 00:08:55.319 { 00:08:55.319 "nbd_device": "/dev/nbd3", 00:08:55.319 "bdev_name": "Malloc2p6" 00:08:55.319 }, 00:08:55.319 { 00:08:55.319 "nbd_device": "/dev/nbd4", 00:08:55.319 "bdev_name": "Malloc2p7" 00:08:55.319 }, 00:08:55.319 { 00:08:55.319 "nbd_device": "/dev/nbd5", 00:08:55.319 "bdev_name": "TestPT" 00:08:55.319 }, 00:08:55.319 { 00:08:55.319 "nbd_device": "/dev/nbd6", 00:08:55.319 "bdev_name": "raid0" 00:08:55.319 }, 00:08:55.319 { 00:08:55.319 "nbd_device": "/dev/nbd7", 00:08:55.319 "bdev_name": "concat0" 00:08:55.319 }, 00:08:55.319 { 
00:08:55.319 "nbd_device": "/dev/nbd8", 00:08:55.319 "bdev_name": "raid1" 00:08:55.319 }, 00:08:55.319 { 00:08:55.319 "nbd_device": "/dev/nbd9", 00:08:55.319 "bdev_name": "AIO0" 00:08:55.319 } 00:08:55.319 ]' 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:55.319 /dev/nbd1 00:08:55.319 /dev/nbd10 00:08:55.319 /dev/nbd11 00:08:55.319 /dev/nbd12 00:08:55.319 /dev/nbd13 00:08:55.319 /dev/nbd14 00:08:55.319 /dev/nbd15 00:08:55.319 /dev/nbd2 00:08:55.319 /dev/nbd3 00:08:55.319 /dev/nbd4 00:08:55.319 /dev/nbd5 00:08:55.319 /dev/nbd6 00:08:55.319 /dev/nbd7 00:08:55.319 /dev/nbd8 00:08:55.319 /dev/nbd9' 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:55.319 /dev/nbd1 00:08:55.319 /dev/nbd10 00:08:55.319 /dev/nbd11 00:08:55.319 /dev/nbd12 00:08:55.319 /dev/nbd13 00:08:55.319 /dev/nbd14 00:08:55.319 /dev/nbd15 00:08:55.319 /dev/nbd2 00:08:55.319 /dev/nbd3 00:08:55.319 /dev/nbd4 00:08:55.319 /dev/nbd5 00:08:55.319 /dev/nbd6 00:08:55.319 /dev/nbd7 00:08:55.319 /dev/nbd8 00:08:55.319 /dev/nbd9' 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:08:55.319 256+0 records in 00:08:55.319 256+0 records out 00:08:55.319 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114304 s, 91.7 MB/s 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:55.319 256+0 records in 00:08:55.319 256+0 records out 00:08:55.319 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.181809 s, 5.8 MB/s 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:55.319 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:55.578 256+0 records in 00:08:55.578 256+0 records out 00:08:55.578 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184592 s, 5.7 MB/s 00:08:55.578 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 
00:08:55.578 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:08:55.837 256+0 records in 00:08:55.837 256+0 records out 00:08:55.837 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184331 s, 5.7 MB/s 00:08:55.837 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:55.837 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:08:56.097 256+0 records in 00:08:56.097 256+0 records out 00:08:56.097 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.18471 s, 5.7 MB/s 00:08:56.097 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:56.097 00:03:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:08:56.097 256+0 records in 00:08:56.097 256+0 records out 00:08:56.097 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183798 s, 5.7 MB/s 00:08:56.097 00:03:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:56.097 00:03:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:08:56.355 256+0 records in 00:08:56.355 256+0 records out 00:08:56.355 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.18389 s, 5.7 MB/s 00:08:56.355 00:03:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:56.355 00:03:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:08:56.614 256+0 records in 00:08:56.614 256+0 
records out 00:08:56.614 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184258 s, 5.7 MB/s 00:08:56.614 00:03:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:56.614 00:03:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:08:56.871 256+0 records in 00:08:56.871 256+0 records out 00:08:56.872 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184621 s, 5.7 MB/s 00:08:56.872 00:03:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:56.872 00:03:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:08:56.872 256+0 records in 00:08:56.872 256+0 records out 00:08:56.872 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184059 s, 5.7 MB/s 00:08:56.872 00:03:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:56.872 00:03:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:08:57.130 256+0 records in 00:08:57.130 256+0 records out 00:08:57.130 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184567 s, 5.7 MB/s 00:08:57.130 00:03:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:57.130 00:03:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:08:57.387 256+0 records in 00:08:57.387 256+0 records out 00:08:57.387 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184172 s, 5.7 MB/s 00:08:57.387 00:03:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:57.387 00:03:44 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:08:57.645 256+0 records in 00:08:57.645 256+0 records out 00:08:57.645 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184381 s, 5.7 MB/s 00:08:57.645 00:03:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:57.645 00:03:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:08:57.645 256+0 records in 00:08:57.645 256+0 records out 00:08:57.645 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184957 s, 5.7 MB/s 00:08:57.645 00:03:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:57.645 00:03:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:08:57.903 256+0 records in 00:08:57.903 256+0 records out 00:08:57.903 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.185321 s, 5.7 MB/s 00:08:57.903 00:03:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:57.903 00:03:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:08:58.162 256+0 records in 00:08:58.162 256+0 records out 00:08:58.162 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.188826 s, 5.6 MB/s 00:08:58.162 00:03:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:58.162 00:03:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:08:58.162 256+0 records in 00:08:58.162 256+0 records out 00:08:58.162 1048576 bytes (1.0 MB, 1.0 
MiB) copied, 0.182404 s, 5.7 MB/s 00:08:58.162 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:08:58.162 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:58.162 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:58.162 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:58.162 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:58.162 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:58.162 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:58.162 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:58.162 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:08:58.162 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:58.162 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:08:58.421 
00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for 
i in "${nbd_list[@]}" 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:08:58.421 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:58.422 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:08:58.422 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:58.422 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:08:58.422 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:58.422 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 
/dev/nbd9' 00:08:58.422 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:58.422 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:58.422 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:58.422 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:58.422 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:58.422 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:58.680 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:58.680 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:58.680 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:58.680 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:58.680 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:58.680 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:58.680 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:58.680 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:58.680 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:58.680 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:58.938 00:03:45 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:58.938 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:58.938 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:58.938 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:58.938 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:58.938 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:58.938 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:58.938 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:58.938 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:58.938 00:03:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:08:59.196 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:08:59.462 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:08:59.462 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:08:59.462 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:59.462 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:59.462 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:08:59.462 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:59.462 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:59.462 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:59.462 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:08:59.462 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:08:59.737 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:08:59.738 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:08:59.738 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:59.738 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:59.738 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:08:59.738 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:59.738 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:59.738 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:59.738 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:59.996 00:03:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:09:00.255 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:09:00.513 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:09:00.513 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:09:00.513 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:00.513 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:00.513 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:09:00.513 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:00.513 00:03:47 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:00.513 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:00.513 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:09:00.780 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:09:00.780 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:09:00.780 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:09:00.780 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:00.780 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:00.780 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:09:00.781 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:00.781 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:00.781 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:00.781 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 
/proc/partitions 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:01.039 00:03:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # 
(( i <= 20 )) 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:01.652 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:09:01.911 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:09:01.911 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:09:01.911 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:09:01.911 00:03:48 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:01.911 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:01.911 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:09:01.911 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:01.911 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:01.911 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:01.911 00:03:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:09:02.170 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:09:02.170 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:09:02.170 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:09:02.170 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:02.170 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:02.170 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:09:02.170 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:02.170 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:02.170 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:02.170 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:09:02.429 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:09:02.429 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:09:02.429 00:03:49 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:09:02.429 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:02.429 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:02.429 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:09:02.429 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:02.429 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:02.429 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:02.429 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:09:02.688 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:09:02.688 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:09:02.688 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:09:02.688 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:02.688 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:02.688 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:09:02.688 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:02.688 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:02.688 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:02.688 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:02.688 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:09:02.947 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:02.947 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:02.947 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:02.947 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:02.947 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:09:02.947 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:02.947 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:09:03.207 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:09:03.207 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:09:03.207 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:09:03.207 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:09:03.207 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:09:03.207 00:03:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:03.207 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:03.207 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:03.207 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:09:03.207 00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:09:03.207 
00:03:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:09:03.207 malloc_lvol_verify 00:09:03.207 00:03:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:09:03.466 08253a42-06da-4457-8e01-7e287a0269af 00:09:03.466 00:03:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:09:04.035 b820462d-3000-4b17-9515-ef05223feb08 00:09:04.035 00:03:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:09:04.603 /dev/nbd0 00:09:04.603 00:03:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:09:04.603 mke2fs 1.46.5 (30-Dec-2021) 00:09:04.603 Discarding device blocks: 0/4096 done 00:09:04.603 Creating filesystem with 4096 1k blocks and 1024 inodes 00:09:04.603 00:09:04.603 Allocating group tables: 0/1 done 00:09:04.603 Writing inode tables: 0/1 done 00:09:04.603 Creating journal (1024 blocks): done 00:09:04.603 Writing superblocks and filesystem accounting information: 0/1 done 00:09:04.603 00:09:04.603 00:03:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:09:04.603 00:03:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:09:04.603 00:03:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:04.603 00:03:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:04.603 00:03:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # 
local nbd_list 00:09:04.603 00:03:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:04.603 00:03:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:04.603 00:03:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:04.860 00:03:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:04.860 00:03:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:04.860 00:03:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:04.860 00:03:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:04.860 00:03:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:04.860 00:03:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:04.860 00:03:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:04.860 00:03:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:04.860 00:03:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:09:04.860 00:03:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:09:04.860 00:03:51 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 3469266 00:09:04.860 00:03:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 3469266 ']' 00:09:04.860 00:03:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 3469266 00:09:04.860 00:03:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:09:04.860 00:03:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:04.860 00:03:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3469266 00:09:04.860 00:03:51 
blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:04.860 00:03:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:04.860 00:03:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3469266' 00:09:04.860 killing process with pid 3469266 00:09:04.860 00:03:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 3469266 00:09:04.861 00:03:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 3469266 00:09:05.426 00:03:52 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:09:05.426 00:09:05.426 real 0m25.813s 00:09:05.426 user 0m32.144s 00:09:05.426 sys 0m14.928s 00:09:05.426 00:03:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:05.426 00:03:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:09:05.426 ************************************ 00:09:05.426 END TEST bdev_nbd 00:09:05.426 ************************************ 00:09:05.426 00:03:52 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:05.426 00:03:52 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:09:05.426 00:03:52 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:09:05.426 00:03:52 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:09:05.426 00:03:52 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:09:05.426 00:03:52 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:05.426 00:03:52 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:05.426 00:03:52 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:05.426 ************************************ 00:09:05.426 START TEST bdev_fio 00:09:05.426 ************************************ 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- 
common/autotest_common.sh@1123 -- # fio_test_suite '' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:05.426 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:05.426 00:03:52 
blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo 
'[job_Malloc2p0]' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:09:05.426 00:03:52 
blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:09:05.426 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:05.427 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:09:05.427 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:09:05.427 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:09:05.427 00:03:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:05.427 00:03:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:05.427 00:03:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:05.427 00:03:52 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:05.427 ************************************ 00:09:05.427 START TEST bdev_fio_rw_verify 00:09:05.427 ************************************ 00:09:05.427 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:05.427 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:05.427 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # 
local fio_dir=/usr/src/fio 00:09:05.427 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:05.427 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:05.427 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:05.427 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:09:05.427 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:09:05.427 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:05.427 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:05.427 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:09:05.427 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:05.691 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:05.691 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:05.691 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:05.691 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:05.691 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:09:05.691 00:03:52 
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:05.691 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:05.691 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:05.691 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:05.691 00:03:52 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:05.953 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:05.953 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:05.953 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:05.953 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:05.953 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:05.953 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:05.953 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:05.953 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, 
ioengine=spdk_bdev, iodepth=8 00:09:05.953 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:05.953 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:05.953 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:05.953 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:05.953 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:05.953 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:05.953 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:05.953 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:05.953 fio-3.35 00:09:05.953 Starting 16 threads 00:09:18.149 00:09:18.149 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=3473438: Tue Jul 16 00:04:03 2024 00:09:18.149 read: IOPS=83.5k, BW=326MiB/s (342MB/s)(3260MiB/10001msec) 00:09:18.149 slat (usec): min=3, max=215, avg=38.48, stdev=14.94 00:09:18.149 clat (usec): min=10, max=1509, avg=310.95, stdev=143.71 00:09:18.149 lat (usec): min=21, max=1524, avg=349.43, stdev=153.27 00:09:18.149 clat percentiles (usec): 00:09:18.149 | 50.000th=[ 310], 99.000th=[ 611], 99.900th=[ 758], 99.990th=[ 930], 00:09:18.149 | 99.999th=[ 1045] 00:09:18.149 write: IOPS=130k, BW=510MiB/s (534MB/s)(5036MiB/9880msec); 0 zone resets 00:09:18.149 slat (usec): min=6, max=786, avg=52.93, stdev=17.15 00:09:18.149 clat (usec): min=5, max=4491, avg=371.94, stdev=174.56 00:09:18.149 lat (usec): min=20, max=4544, avg=424.86, stdev=184.71 00:09:18.149 clat percentiles (usec): 
00:09:18.149 | 50.000th=[ 363], 99.000th=[ 848], 99.900th=[ 988], 99.990th=[ 1057], 00:09:18.149 | 99.999th=[ 1483] 00:09:18.149 bw ( KiB/s): min=439920, max=688791, per=99.17%, avg=517641.63, stdev=3978.92, samples=304 00:09:18.149 iops : min=109980, max=172195, avg=129410.21, stdev=994.70, samples=304 00:09:18.149 lat (usec) : 10=0.01%, 20=0.01%, 50=0.67%, 100=5.05%, 250=25.72% 00:09:18.149 lat (usec) : 500=49.69%, 750=17.57%, 1000=1.23% 00:09:18.149 lat (msec) : 2=0.05%, 10=0.01% 00:09:18.149 cpu : usr=99.18%, sys=0.37%, ctx=665, majf=0, minf=2678 00:09:18.149 IO depths : 1=12.6%, 2=25.1%, 4=49.9%, 8=12.4%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:18.149 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:18.149 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:18.149 issued rwts: total=834632,1289264,0,0 short=0,0,0,0 dropped=0,0,0,0 00:09:18.149 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:18.149 00:09:18.149 Run status group 0 (all jobs): 00:09:18.149 READ: bw=326MiB/s (342MB/s), 326MiB/s-326MiB/s (342MB/s-342MB/s), io=3260MiB (3419MB), run=10001-10001msec 00:09:18.149 WRITE: bw=510MiB/s (534MB/s), 510MiB/s-510MiB/s (534MB/s-534MB/s), io=5036MiB (5281MB), run=9880-9880msec 00:09:18.149 00:09:18.149 real 0m12.045s 00:09:18.149 user 2m45.770s 00:09:18.149 sys 0m1.396s 00:09:18.149 00:04:04 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:18.149 00:04:04 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:09:18.149 ************************************ 00:09:18.149 END TEST bdev_fio_rw_verify 00:09:18.149 ************************************ 00:09:18.149 00:04:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:09:18.149 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:09:18.149 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:18.149 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:09:18.149 00:04:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:18.149 00:04:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:09:18.149 00:04:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:09:18.149 00:04:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:18.149 00:04:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:18.149 00:04:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:18.149 00:04:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:09:18.149 00:04:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:18.149 00:04:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:18.149 00:04:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:18.149 00:04:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:09:18.149 00:04:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:09:18.149 00:04:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:09:18.149 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:18.150 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # 
printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "b7064d7a-6d80-4433-b407-3c52bb1af1d3"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b7064d7a-6d80-4433-b407-3c52bb1af1d3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "20a694cb-5588-5c28-80c4-88022e3f7f16"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "20a694cb-5588-5c28-80c4-88022e3f7f16",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' 
' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "ce7071ea-2df7-5648-995e-888a75e33ef2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ce7071ea-2df7-5648-995e-888a75e33ef2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "1f8dc7e7-e82f-58b2-b752-65abf418c44d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1f8dc7e7-e82f-58b2-b752-65abf418c44d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' 
}' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "19baed69-c599-577a-97ef-8fe66fb30933"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "19baed69-c599-577a-97ef-8fe66fb30933",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "b2e5445c-ea64-5692-8eca-12906562eb53"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b2e5445c-ea64-5692-8eca-12906562eb53",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' 
"aliases": [' ' "b55baa38-798a-579e-ad9d-864759268c10"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b55baa38-798a-579e-ad9d-864759268c10",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "633308a5-eafb-5da1-938e-4f1161c0883d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "633308a5-eafb-5da1-938e-4f1161c0883d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' 
"37c7dfef-fe56-5f8c-875c-97997740e332"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "37c7dfef-fe56-5f8c-875c-97997740e332",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "5bcd7c70-6470-5e07-970e-7ebdb0bdc7ad"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5bcd7c70-6470-5e07-970e-7ebdb0bdc7ad",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "6c08e111-b2b2-5d4c-949d-bd8a9ce9b031"' ' ],' ' 
"product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "6c08e111-b2b2-5d4c-949d-bd8a9ce9b031",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "69428160-974e-5146-9be4-638e29732124"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "69428160-974e-5146-9be4-638e29732124",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' 
"base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "4bc9a8dc-4e63-4997-826f-62ba65ba63f4"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "4bc9a8dc-4e63-4997-826f-62ba65ba63f4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "4bc9a8dc-4e63-4997-826f-62ba65ba63f4",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "17b3377e-f901-4329-baec-1906eee56a81",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "c9cc1252-b8cc-4e59-972e-d95ba75770e8",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "607f43bf-b16b-4e2c-8311-57edeea5ee64"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 
512,' ' "num_blocks": 131072,' ' "uuid": "607f43bf-b16b-4e2c-8311-57edeea5ee64",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "607f43bf-b16b-4e2c-8311-57edeea5ee64",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "529be16d-dce3-42a3-a4ef-c7cc41cd5e9e",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "7546c3f6-87ed-40e0-88f2-8ecea8814b4b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "55d230ba-3642-45ac-9d1b-64762b7993cc"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "55d230ba-3642-45ac-9d1b-64762b7993cc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "55d230ba-3642-45ac-9d1b-64762b7993cc",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "68cf0389-b627-45c9-8e16-83bcffa8e7c8",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "cfc54b4b-59b2-455a-83e8-291dee2f7855",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "761a79f9-0c14-4cbf-b270-59d4f98661c1"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "761a79f9-0c14-4cbf-b270-59d4f98661c1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": 
true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:18.150 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:09:18.150 Malloc1p0 00:09:18.150 Malloc1p1 00:09:18.150 Malloc2p0 00:09:18.150 Malloc2p1 00:09:18.150 Malloc2p2 00:09:18.150 Malloc2p3 00:09:18.150 Malloc2p4 00:09:18.150 Malloc2p5 00:09:18.150 Malloc2p6 00:09:18.150 Malloc2p7 00:09:18.150 TestPT 00:09:18.150 raid0 00:09:18.150 concat0 ]] 00:09:18.150 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "b7064d7a-6d80-4433-b407-3c52bb1af1d3"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b7064d7a-6d80-4433-b407-3c52bb1af1d3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": 
false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "20a694cb-5588-5c28-80c4-88022e3f7f16"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "20a694cb-5588-5c28-80c4-88022e3f7f16",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "ce7071ea-2df7-5648-995e-888a75e33ef2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ce7071ea-2df7-5648-995e-888a75e33ef2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "1f8dc7e7-e82f-58b2-b752-65abf418c44d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1f8dc7e7-e82f-58b2-b752-65abf418c44d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "19baed69-c599-577a-97ef-8fe66fb30933"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "19baed69-c599-577a-97ef-8fe66fb30933",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "b2e5445c-ea64-5692-8eca-12906562eb53"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b2e5445c-ea64-5692-8eca-12906562eb53",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "b55baa38-798a-579e-ad9d-864759268c10"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b55baa38-798a-579e-ad9d-864759268c10",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' 
"copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "633308a5-eafb-5da1-938e-4f1161c0883d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "633308a5-eafb-5da1-938e-4f1161c0883d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "37c7dfef-fe56-5f8c-875c-97997740e332"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "37c7dfef-fe56-5f8c-875c-97997740e332",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' 
"driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "5bcd7c70-6470-5e07-970e-7ebdb0bdc7ad"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5bcd7c70-6470-5e07-970e-7ebdb0bdc7ad",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "6c08e111-b2b2-5d4c-949d-bd8a9ce9b031"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "6c08e111-b2b2-5d4c-949d-bd8a9ce9b031",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' 
"base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "69428160-974e-5146-9be4-638e29732124"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "69428160-974e-5146-9be4-638e29732124",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "4bc9a8dc-4e63-4997-826f-62ba65ba63f4"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "4bc9a8dc-4e63-4997-826f-62ba65ba63f4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "4bc9a8dc-4e63-4997-826f-62ba65ba63f4",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "17b3377e-f901-4329-baec-1906eee56a81",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "c9cc1252-b8cc-4e59-972e-d95ba75770e8",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "607f43bf-b16b-4e2c-8311-57edeea5ee64"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "607f43bf-b16b-4e2c-8311-57edeea5ee64",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' 
' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "607f43bf-b16b-4e2c-8311-57edeea5ee64",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "529be16d-dce3-42a3-a4ef-c7cc41cd5e9e",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "7546c3f6-87ed-40e0-88f2-8ecea8814b4b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "55d230ba-3642-45ac-9d1b-64762b7993cc"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "55d230ba-3642-45ac-9d1b-64762b7993cc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "55d230ba-3642-45ac-9d1b-64762b7993cc",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "68cf0389-b627-45c9-8e16-83bcffa8e7c8",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "cfc54b4b-59b2-455a-83e8-291dee2f7855",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "761a79f9-0c14-4cbf-b270-59d4f98661c1"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "761a79f9-0c14-4cbf-b270-59d4f98661c1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:18.152 00:04:04 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:18.152 ************************************ 00:09:18.152 START TEST bdev_fio_trim 00:09:18.152 ************************************ 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:18.152 00:04:04 
blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:18.152 00:04:04 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:18.152 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:18.152 job_Malloc1p0: (g=0): 
rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:18.152 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:18.152 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:18.152 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:18.152 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:18.152 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:18.152 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:18.152 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:18.152 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:18.152 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:18.152 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:18.152 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:18.152 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:18.152 fio-3.35 00:09:18.152 Starting 14 threads 00:09:30.357 00:09:30.357 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=3475198: Tue Jul 16 00:04:15 2024 00:09:30.357 write: IOPS=122k, BW=476MiB/s (499MB/s)(4760MiB/10001msec); 0 zone resets 00:09:30.357 slat (usec): min=4, max=204, avg=40.91, stdev=10.94 
00:09:30.357 clat (usec): min=11, max=4569, avg=288.22, stdev=97.70 00:09:30.357 lat (usec): min=24, max=4595, avg=329.13, stdev=101.62 00:09:30.357 clat percentiles (usec): 00:09:30.357 | 50.000th=[ 281], 99.000th=[ 506], 99.900th=[ 586], 99.990th=[ 816], 00:09:30.357 | 99.999th=[ 1631] 00:09:30.357 bw ( KiB/s): min=422872, max=594040, per=100.00%, avg=488632.47, stdev=2643.84, samples=266 00:09:30.357 iops : min=105717, max=148509, avg=122157.58, stdev=660.96, samples=266 00:09:30.357 trim: IOPS=122k, BW=476MiB/s (499MB/s)(4760MiB/10001msec); 0 zone resets 00:09:30.357 slat (usec): min=5, max=501, avg=27.42, stdev= 7.22 00:09:30.357 clat (usec): min=20, max=4596, avg=324.29, stdev=107.75 00:09:30.357 lat (usec): min=32, max=4624, avg=351.71, stdev=110.84 00:09:30.357 clat percentiles (usec): 00:09:30.357 | 50.000th=[ 322], 99.000th=[ 553], 99.900th=[ 644], 99.990th=[ 889], 00:09:30.357 | 99.999th=[ 1680] 00:09:30.357 bw ( KiB/s): min=422872, max=594048, per=100.00%, avg=488632.89, stdev=2643.96, samples=266 00:09:30.357 iops : min=105717, max=148511, avg=122157.68, stdev=660.99, samples=266 00:09:30.357 lat (usec) : 20=0.01%, 50=0.02%, 100=0.77%, 250=32.64%, 500=63.70% 00:09:30.357 lat (usec) : 750=2.84%, 1000=0.02% 00:09:30.357 lat (msec) : 2=0.01%, 4=0.01%, 10=0.01% 00:09:30.357 cpu : usr=99.59%, sys=0.00%, ctx=527, majf=0, minf=1204 00:09:30.357 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:30.357 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:30.357 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:30.357 issued rwts: total=0,1218453,1218454,0 short=0,0,0,0 dropped=0,0,0,0 00:09:30.357 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:30.357 00:09:30.357 Run status group 0 (all jobs): 00:09:30.357 WRITE: bw=476MiB/s (499MB/s), 476MiB/s-476MiB/s (499MB/s-499MB/s), io=4760MiB (4991MB), run=10001-10001msec 00:09:30.357 TRIM: bw=476MiB/s (499MB/s), 
476MiB/s-476MiB/s (499MB/s-499MB/s), io=4760MiB (4991MB), run=10001-10001msec 00:09:30.357 00:09:30.357 real 0m11.686s 00:09:30.357 user 2m25.815s 00:09:30.357 sys 0m0.822s 00:09:30.357 00:04:16 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:30.357 00:04:16 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:09:30.357 ************************************ 00:09:30.357 END TEST bdev_fio_trim 00:09:30.357 ************************************ 00:09:30.357 00:04:16 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:09:30.357 00:04:16 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:09:30.357 00:04:16 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:30.357 00:04:16 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:09:30.357 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:30.357 00:04:16 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:09:30.357 00:09:30.357 real 0m24.131s 00:09:30.357 user 5m11.795s 00:09:30.357 sys 0m2.441s 00:09:30.357 00:04:16 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:30.357 00:04:16 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:30.357 ************************************ 00:09:30.357 END TEST bdev_fio 00:09:30.357 ************************************ 00:09:30.357 00:04:16 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:30.357 00:04:16 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:30.357 00:04:16 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 
00:09:30.357 00:04:16 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:09:30.357 00:04:16 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:30.357 00:04:16 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:30.357 ************************************ 00:09:30.357 START TEST bdev_verify 00:09:30.357 ************************************ 00:09:30.357 00:04:16 blockdev_general.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:30.357 [2024-07-16 00:04:16.485805] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:09:30.357 [2024-07-16 00:04:16.485875] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3476692 ] 00:09:30.357 [2024-07-16 00:04:16.668825] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:30.357 [2024-07-16 00:04:16.774758] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:30.357 [2024-07-16 00:04:16.774763] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:30.357 [2024-07-16 00:04:16.930581] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:30.357 [2024-07-16 00:04:16.930643] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:30.357 [2024-07-16 00:04:16.930657] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:30.357 [2024-07-16 00:04:16.938590] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:30.357 [2024-07-16 00:04:16.938618] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:30.357 [2024-07-16 00:04:16.946604] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:30.357 [2024-07-16 00:04:16.946636] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:30.357 [2024-07-16 00:04:17.023770] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:30.357 [2024-07-16 00:04:17.023823] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:30.357 [2024-07-16 00:04:17.023842] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d464d0 00:09:30.357 [2024-07-16 00:04:17.023855] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:30.357 [2024-07-16 00:04:17.025511] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:30.357 [2024-07-16 00:04:17.025541] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:30.357 Running I/O for 5 seconds... 
00:09:35.664
00:09:35.664 Latency(us)
00:09:35.664 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:09:35.664 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:35.664 Verification LBA range: start 0x0 length 0x1000
00:09:35.664 Malloc0 : 5.16 1141.69 4.46 0.00 0.00 111890.84 577.00 361074.87
00:09:35.664 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:35.664 Verification LBA range: start 0x1000 length 0x1000
00:09:35.664 Malloc0 : 5.19 912.14 3.56 0.00 0.00 140011.93 701.66 419430.40
00:09:35.664 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:35.664 Verification LBA range: start 0x0 length 0x800
00:09:35.664 Malloc1p0 : 5.16 595.42 2.33 0.00 0.00 213967.27 2521.71 181449.24
00:09:35.664 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:35.664 Verification LBA range: start 0x800 length 0x800
00:09:35.664 Malloc1p0 : 5.25 487.84 1.91 0.00 0.00 260993.05 3219.81 227039.50
00:09:35.664 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:35.664 Verification LBA range: start 0x0 length 0x800
00:09:35.664 Malloc1p1 : 5.16 595.17 2.32 0.00 0.00 213607.83 2493.22 180537.43
00:09:35.664 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:35.664 Verification LBA range: start 0x800 length 0x800
00:09:35.664 Malloc1p1 : 5.25 487.61 1.90 0.00 0.00 260467.85 3191.32 227039.50
00:09:35.664 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:35.664 Verification LBA range: start 0x0 length 0x200
00:09:35.664 Malloc2p0 : 5.16 594.93 2.32 0.00 0.00 213222.39 2507.46 180537.43
00:09:35.664 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:35.664 Verification LBA range: start 0x200 length 0x200
00:09:35.664 Malloc2p0 : 5.25 487.38 1.90 0.00 0.00 259945.56 3875.17 227039.50
00:09:35.664 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:35.664 Verification LBA range: start 0x0 length 0x200
00:09:35.664 Malloc2p1 : 5.17 594.69 2.32 0.00 0.00 212861.70 3291.05 177802.02
00:09:35.664 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:35.664 Verification LBA range: start 0x200 length 0x200
00:09:35.664 Malloc2p1 : 5.26 487.15 1.90 0.00 0.00 259265.21 4103.12 223392.28
00:09:35.664 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:35.664 Verification LBA range: start 0x0 length 0x200
00:09:35.664 Malloc2p2 : 5.17 594.44 2.32 0.00 0.00 212399.85 3547.49 173242.99
00:09:35.664 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:35.664 Verification LBA range: start 0x200 length 0x200
00:09:35.664 Malloc2p2 : 5.26 486.92 1.90 0.00 0.00 258521.19 3376.53 220656.86
00:09:35.664 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:35.664 Verification LBA range: start 0x0 length 0x200
00:09:35.664 Malloc2p3 : 5.17 594.20 2.32 0.00 0.00 211909.98 2721.17 171419.38
00:09:35.664 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:35.664 Verification LBA range: start 0x200 length 0x200
00:09:35.664 Malloc2p3 : 5.26 486.69 1.90 0.00 0.00 257958.33 3191.32 219745.06
00:09:35.664 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:35.664 Verification LBA range: start 0x0 length 0x200
00:09:35.664 Malloc2p4 : 5.17 593.95 2.32 0.00 0.00 211522.75 2507.46 172331.19
00:09:35.664 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:35.664 Verification LBA range: start 0x200 length 0x200
00:09:35.664 Malloc2p4 : 5.26 486.46 1.90 0.00 0.00 257430.17 3704.21 219745.06
00:09:35.923 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:35.923 Verification LBA range: start 0x0 length 0x200
00:09:35.923 Malloc2p5 : 5.17 593.70 2.32 0.00 0.00 211148.90 2521.71 174154.80
00:09:35.923 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:35.923 Verification LBA range: start 0x200 length 0x200
00:09:35.923 Malloc2p5 : 5.27 486.23 1.90 0.00 0.00 256792.78 4245.59 218833.25
00:09:35.923 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:35.923 Verification LBA range: start 0x0 length 0x200
00:09:35.923 Malloc2p6 : 5.18 593.45 2.32 0.00 0.00 210766.08 3134.33 173242.99
00:09:35.923 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:35.923 Verification LBA range: start 0x200 length 0x200
00:09:35.923 Malloc2p6 : 5.27 486.00 1.90 0.00 0.00 256062.42 3006.11 217009.64
00:09:35.923 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:35.923 Verification LBA range: start 0x0 length 0x200
00:09:35.923 Malloc2p7 : 5.18 593.21 2.32 0.00 0.00 210336.36 3704.21 169595.77
00:09:35.923 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:35.923 Verification LBA range: start 0x200 length 0x200
00:09:35.923 Malloc2p7 : 5.27 485.77 1.90 0.00 0.00 255438.54 3647.22 210627.01
00:09:35.923 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:35.923 Verification LBA range: start 0x0 length 0x1000
00:09:35.923 TestPT : 5.23 592.63 2.31 0.00 0.00 209500.00 12480.33 168683.97
00:09:35.923 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:35.923 Verification LBA range: start 0x1000 length 0x1000
00:09:35.923 TestPT : 5.29 484.33 1.89 0.00 0.00 255399.16 11340.58 211538.81
00:09:35.923 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:35.923 Verification LBA range: start 0x0 length 0x2000
00:09:35.923 raid0 : 5.23 611.89 2.39 0.00 0.00 202797.90 3162.82 157742.30
00:09:35.923 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:35.924 Verification LBA range: start 0x2000 length 0x2000
00:09:35.924 raid0 : 5.28 485.25 1.90 0.00 0.00 254237.14 3276.80 195126.32
00:09:35.924 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:35.924 Verification LBA range: start 0x0 length 0x2000
00:09:35.924 concat0 : 5.23 611.65 2.39 0.00 0.00 202376.86 2336.50 158654.11
00:09:35.924 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:35.924 Verification LBA range: start 0x2000 length 0x2000
00:09:35.924 concat0 : 5.28 485.01 1.89 0.00 0.00 253603.41 3447.76 190567.29
00:09:35.924 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:35.924 Verification LBA range: start 0x0 length 0x1000
00:09:35.924 raid1 : 5.23 611.39 2.39 0.00 0.00 201985.73 3162.82 168683.97
00:09:35.924 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:35.924 Verification LBA range: start 0x1000 length 0x1000
00:09:35.924 raid1 : 5.28 484.77 1.89 0.00 0.00 252848.53 4188.61 197861.73
00:09:35.924 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:35.924 Verification LBA range: start 0x0 length 0x4e2
00:09:35.924 AIO0 : 5.24 611.21 2.39 0.00 0.00 201587.57 1460.31 177802.02
00:09:35.924 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:35.924 Verification LBA range: start 0x4e2 length 0x4e2
00:09:35.924 AIO0 : 5.28 484.59 1.89 0.00 0.00 252113.38 1652.65 202420.76
00:09:35.924 ===================================================================================================================
00:09:35.924 Total : 18327.75 71.59 0.00 0.00 218896.36 577.00 419430.40
00:09:36.182
00:09:36.182 real 0m6.582s
00:09:36.182 user 0m12.069s
00:09:36.182 sys 0m0.464s
00:09:36.182 00:04:23 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:36.182 00:04:23 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:09:36.182 ************************************
00:09:36.182 END TEST bdev_verify
00:09:36.182 ************************************
00:09:36.182 00:04:23 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:09:36.182 00:04:23 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:09:36.182 00:04:23 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:09:36.182 00:04:23 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:36.182 00:04:23 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:36.182 ************************************
00:09:36.182 START TEST bdev_verify_big_io
00:09:36.182 ************************************
00:09:36.182 00:04:23 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:09:36.439 [2024-07-16 00:04:23.154665] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization...
00:09:36.439 [2024-07-16 00:04:23.154734] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3477551 ]
00:09:36.439 [2024-07-16 00:04:23.282999] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:09:36.439 [2024-07-16 00:04:23.381243] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:09:36.440 [2024-07-16 00:04:23.381249] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:36.698 [2024-07-16 00:04:23.542160] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:09:36.698 [2024-07-16 00:04:23.542220] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:09:36.698 [2024-07-16 00:04:23.542235] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:09:36.698 [2024-07-16 00:04:23.550164] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:36.698 [2024-07-16 00:04:23.550193] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:36.698 [2024-07-16 00:04:23.558178] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:09:36.698 [2024-07-16 00:04:23.558203] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:09:36.698 [2024-07-16 00:04:23.634619] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:09:36.698 [2024-07-16 00:04:23.634672] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:09:36.698 [2024-07-16 00:04:23.634691] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x112d4d0
00:09:36.698 [2024-07-16 00:04:23.634704] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:09:36.698 [2024-07-16 00:04:23.636331] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:09:36.698 [2024-07-16 00:04:23.636362] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:09:36.956 [2024-07-16 00:04:23.806889] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32
00:09:36.956 [2024-07-16 00:04:23.808454] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32
00:09:36.956 [2024-07-16 00:04:23.810642] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32
00:09:36.956 [2024-07-16 00:04:23.812176] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32
00:09:36.956 [2024-07-16 00:04:23.814368] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32
00:09:36.957 [2024-07-16 00:04:23.815612] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32
00:09:36.957 [2024-07-16 00:04:23.817316] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32
00:09:36.957 [2024-07-16 00:04:23.819078] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32
00:09:36.957 [2024-07-16 00:04:23.820193] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32
00:09:36.957 [2024-07-16 00:04:23.821916] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32
00:09:36.957 [2024-07-16 00:04:23.823047] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32
00:09:36.957 [2024-07-16 00:04:23.824786] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32
00:09:36.957 [2024-07-16 00:04:23.825785] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32
00:09:36.957 [2024-07-16 00:04:23.827321] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32
00:09:36.957 [2024-07-16 00:04:23.828260] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32
00:09:36.957 [2024-07-16 00:04:23.829763] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32
00:09:36.957 [2024-07-16 00:04:23.855109] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:09:36.957 [2024-07-16 00:04:23.857187] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:09:36.957 Running I/O for 5 seconds...
00:09:45.071
00:09:45.071 Latency(us)
00:09:45.071 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:09:45.071 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x0 length 0x100
00:09:45.071 Malloc0 : 5.85 131.20 8.20 0.00 0.00 955321.74 872.63 2509287.96
00:09:45.071 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x100 length 0x100
00:09:45.071 Malloc0 : 6.34 121.14 7.57 0.00 0.00 1032856.47 1125.51 2844832.28
00:09:45.071 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x0 length 0x80
00:09:45.071 Malloc1p0 : 6.33 77.12 4.82 0.00 0.00 1496903.88 2493.22 2946954.46
00:09:45.071 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x80 length 0x80
00:09:45.071 Malloc1p0 : 7.18 31.19 1.95 0.00 0.00 3708008.79 1923.34 6039797.76
00:09:45.071 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x0 length 0x80
00:09:45.071 Malloc1p1 : 6.66 33.62 2.10 0.00 0.00 3342740.65 1474.56 5572953.49
00:09:45.071 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x80 length 0x80
00:09:45.071 Malloc1p1 : 7.18 31.18 1.95 0.00 0.00 3565835.61 1852.10 5806375.62
00:09:45.071 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x0 length 0x20
00:09:45.071 Malloc2p0 : 6.33 22.75 1.42 0.00 0.00 1239126.56 601.93 2159154.75
00:09:45.071 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x20 length 0x20
00:09:45.071 Malloc2p0 : 6.56 19.52 1.22 0.00 0.00 1410628.10 758.65 2363399.12
00:09:45.071 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x0 length 0x20
00:09:45.071 Malloc2p1 : 6.33 22.75 1.42 0.00 0.00 1227566.02 626.87 2115388.10
00:09:45.071 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x20 length 0x20
00:09:45.071 Malloc2p1 : 6.56 19.52 1.22 0.00 0.00 1395193.81 780.02 2334221.36
00:09:45.071 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x0 length 0x20
00:09:45.071 Malloc2p2 : 6.33 22.74 1.42 0.00 0.00 1216972.14 616.18 2100799.22
00:09:45.071 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x20 length 0x20
00:09:45.071 Malloc2p2 : 6.56 19.51 1.22 0.00 0.00 1379553.63 790.71 2305043.59
00:09:45.071 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x0 length 0x20
00:09:45.071 Malloc2p3 : 6.33 22.74 1.42 0.00 0.00 1204514.99 619.74 2071621.45
00:09:45.071 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x20 length 0x20
00:09:45.071 Malloc2p3 : 6.56 19.51 1.22 0.00 0.00 1364383.62 790.71 2275865.82
00:09:45.071 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x0 length 0x20
00:09:45.071 Malloc2p4 : 6.33 22.73 1.42 0.00 0.00 1193697.42 637.55 2042443.69
00:09:45.071 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x20 length 0x20
00:09:45.071 Malloc2p4 : 6.56 19.50 1.22 0.00 0.00 1348609.60 780.02 2246688.06
00:09:45.071 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x0 length 0x20
00:09:45.071 Malloc2p5 : 6.34 22.73 1.42 0.00 0.00 1182054.09 637.55 2013265.92
00:09:45.071 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x20 length 0x20
00:09:45.071 Malloc2p5 : 6.57 19.49 1.22 0.00 0.00 1333641.56 772.90 2217510.29
00:09:45.071 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x0 length 0x20
00:09:45.071 Malloc2p6 : 6.34 22.72 1.42 0.00 0.00 1170974.98 633.99 1998677.04
00:09:45.071 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x20 length 0x20
00:09:45.071 Malloc2p6 : 6.57 19.49 1.22 0.00 0.00 1317979.34 769.34 2173743.64
00:09:45.071 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x0 length 0x20
00:09:45.071 Malloc2p7 : 6.34 22.72 1.42 0.00 0.00 1158721.78 658.92 1969499.27
00:09:45.071 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x20 length 0x20
00:09:45.071 Malloc2p7 : 6.71 21.47 1.34 0.00 0.00 1197338.01 780.02 2144565.87
00:09:45.071 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x0 length 0x100
00:09:45.071 TestPT : 6.56 36.87 2.30 0.00 0.00 2789232.88 116255.17 3938998.54
00:09:45.071 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x100 length 0x100
00:09:45.071 TestPT : 6.91 34.72 2.17 0.00 0.00 2843871.08 136770.78 3851465.24
00:09:45.071 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x0 length 0x200
00:09:45.071 raid0 : 6.94 39.19 2.45 0.00 0.00 2479482.97 1560.04 4755976.01
00:09:45.071 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x200 length 0x200
00:09:45.071 raid0 : 7.16 35.75 2.23 0.00 0.00 2674529.25 2008.82 4814331.55
00:09:45.071 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x0 length 0x200
00:09:45.071 concat0 : 6.90 44.05 2.75 0.00 0.00 2136964.57 1588.54 4551731.65
00:09:45.071 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x200 length 0x200
00:09:45.071 concat0 : 7.13 46.84 2.93 0.00 0.00 1986466.52 1951.83 4610087.18
00:09:45.071 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x0 length 0x100
00:09:45.071 raid1 : 6.94 59.92 3.75 0.00 0.00 1563803.16 2051.56 4376665.04
00:09:45.071 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x100 length 0x100
00:09:45.071 raid1 : 7.16 63.10 3.94 0.00 0.00 1410544.12 2578.70 4405842.81
00:09:45.071 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x0 length 0x4e
00:09:45.071 AIO0 : 7.02 62.99 3.94 0.00 0.00 882561.09 715.91 3122021.06
00:09:45.071 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536)
00:09:45.071 Verification LBA range: start 0x4e length 0x4e
00:09:45.071 AIO0 : 7.30 85.99 5.37 0.00 0.00 613272.26 455.90 3282498.78
00:09:45.071 ===================================================================================================================
00:09:45.071 Total : 1274.77 79.67 0.00 0.00 1593686.23 455.90 6039797.76
00:09:45.071
00:09:45.071 real 0m8.594s
00:09:45.071 user 0m16.185s
00:09:45.071 sys 0m0.438s
00:09:45.071 00:04:31 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:45.071 00:04:31 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:09:45.071 ************************************
00:09:45.071 END TEST bdev_verify_big_io
00:09:45.071 ************************************
00:09:45.071 00:04:31 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:09:45.071 00:04:31 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:45.071 00:04:31 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:09:45.071 00:04:31 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:45.071 00:04:31 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:45.071 ************************************
00:09:45.071 START TEST bdev_write_zeroes
00:09:45.071 ************************************
00:09:45.071 00:04:31 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:45.072 [2024-07-16 00:04:31.835657] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization...
00:09:45.072 [2024-07-16 00:04:31.835724] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3478724 ]
00:09:45.072 [2024-07-16 00:04:31.962447] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:45.330 [2024-07-16 00:04:32.060113] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:45.330 [2024-07-16 00:04:32.218912] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:09:45.330 [2024-07-16 00:04:32.218985] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:09:45.330 [2024-07-16 00:04:32.219000] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:09:45.330 [2024-07-16 00:04:32.226918] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:45.330 [2024-07-16 00:04:32.226954] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:45.330 [2024-07-16 00:04:32.234933] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:09:45.330 [2024-07-16 00:04:32.234958] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:09:45.589 [2024-07-16 00:04:32.311624] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:09:45.589 [2024-07-16 00:04:32.311678] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:09:45.589 [2024-07-16 00:04:32.311697] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbf2c10
00:09:45.589 [2024-07-16 00:04:32.311710] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:09:45.589 [2024-07-16 00:04:32.313250] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:09:45.589 [2024-07-16 00:04:32.313280] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:09:45.589 Running I/O for 1 seconds...
00:09:46.966
00:09:46.966 Latency(us)
00:09:46.966 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:09:46.966 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:46.966 Malloc0 : 1.03 4950.87 19.34 0.00 0.00 25837.06 676.73 43538.70
00:09:46.966 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:46.966 Malloc1p0 : 1.04 4943.75 19.31 0.00 0.00 25826.08 918.93 42626.89
00:09:46.966 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:46.966 Malloc1p1 : 1.04 4936.65 19.28 0.00 0.00 25807.26 911.81 41715.09
00:09:46.966 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:46.966 Malloc2p0 : 1.04 4929.50 19.26 0.00 0.00 25787.62 911.81 40803.28
00:09:46.966 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:46.966 Malloc2p1 : 1.04 4922.46 19.23 0.00 0.00 25764.57 918.93 39891.48
00:09:46.966 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:46.966 Malloc2p2 : 1.04 4915.46 19.20 0.00 0.00 25743.50 911.81 38979.67
00:09:46.966 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:46.966 Malloc2p3 : 1.04 4908.39 19.17 0.00 0.00 25722.42 911.81 38067.87
00:09:46.966 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:46.966 Malloc2p4 : 1.04 4901.43 19.15 0.00 0.00 25702.67 911.81 37156.06
00:09:46.966 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:46.966 Malloc2p5 : 1.05 4894.46 19.12 0.00 0.00 25676.47 918.93 36244.26
00:09:46.966 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:46.966 Malloc2p6 : 1.06 4944.10 19.31 0.00 0.00 25369.30 908.24 35332.45
00:09:46.966 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:46.966 Malloc2p7 : 1.06 4937.17 19.29 0.00 0.00 25345.38 911.81 34420.65
00:09:46.966 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:46.966 TestPT : 1.06 4930.25 19.26 0.00 0.00 25323.99 947.42 33508.84
00:09:46.966 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:46.966 raid0 : 1.07 4922.28 19.23 0.00 0.00 25296.23 1638.40 31913.18
00:09:46.966 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:46.966 concat0 : 1.07 4914.46 19.20 0.00 0.00 25243.13 1617.03 30317.52
00:09:46.966 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:46.966 raid1 : 1.07 4904.65 19.16 0.00 0.00 25183.59 2592.95 27582.11
00:09:46.966 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:46.966 AIO0 : 1.07 4898.74 19.14 0.00 0.00 25092.07 1054.27 27126.21
00:09:46.966 ===================================================================================================================
00:09:46.966 Total : 78754.60 307.64 0.00 0.00 25542.05 676.73 43538.70
00:09:47.225
00:09:47.225 real 0m2.221s
00:09:47.225 user 0m1.827s
00:09:47.225 sys 0m0.346s
00:09:47.225 00:04:33 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:47.225 00:04:33 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:09:47.225 ************************************
00:09:47.225 END TEST bdev_write_zeroes
00:09:47.225 ************************************
00:09:47.225 00:04:34 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:09:47.225 00:04:34 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:47.225 00:04:34 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:09:47.225 00:04:34 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:47.225 00:04:34 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:47.225 ************************************
00:09:47.225 START TEST bdev_json_nonenclosed
00:09:47.225 ************************************
00:09:47.225 00:04:34 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:47.225 [2024-07-16 00:04:34.145951] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization...
00:09:47.225 [2024-07-16 00:04:34.146016] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3479042 ]
00:09:47.483 [2024-07-16 00:04:34.274433] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:47.483 [2024-07-16 00:04:34.374748] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:47.483 [2024-07-16 00:04:34.374821] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:09:47.483 [2024-07-16 00:04:34.374843] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:09:47.483 [2024-07-16 00:04:34.374856] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:09:47.742
00:09:47.742 real 0m0.392s
00:09:47.742 user 0m0.236s
00:09:47.742 sys 0m0.154s
00:09:47.742 00:04:34 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:09:47.742 00:04:34 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:47.742 00:04:34 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:09:47.742 ************************************
00:09:47.742 END TEST bdev_json_nonenclosed
00:09:47.742 ************************************
00:09:47.742 00:04:34 blockdev_general -- common/autotest_common.sh@1142 -- # return 234
00:09:47.742 00:04:34 blockdev_general -- bdev/blockdev.sh@782 -- # true
00:09:47.742 00:04:34 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:47.742 00:04:34 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:09:47.742 00:04:34 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:47.742 00:04:34 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:47.742 ************************************
00:09:47.742 START TEST bdev_json_nonarray
00:09:47.742 ************************************
00:09:47.742 00:04:34 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:47.742 [2024-07-16 00:04:34.628172] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization...
00:09:47.742 [2024-07-16 00:04:34.628233] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3479114 ]
00:09:48.001 [2024-07-16 00:04:34.754540] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:48.001 [2024-07-16 00:04:34.851138] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:48.001 [2024-07-16 00:04:34.851212] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:09:48.001 [2024-07-16 00:04:34.851234] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:09:48.001 [2024-07-16 00:04:34.851247] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:09:48.259
00:09:48.259 real 0m0.385s
00:09:48.259 user 0m0.234s
00:09:48.259 sys 0m0.149s
00:09:48.259 00:04:34 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234
00:09:48.259 00:04:34 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:48.259 00:04:34 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:09:48.259 ************************************
00:09:48.259 END TEST bdev_json_nonarray
00:09:48.259 ************************************
00:09:48.259 00:04:34 blockdev_general -- common/autotest_common.sh@1142 -- # return 234
00:09:48.259 00:04:34 blockdev_general -- bdev/blockdev.sh@785 -- # true
00:09:48.259 00:04:34 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]]
00:09:48.259 00:04:34 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite ''
00:09:48.259 00:04:34 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']'
00:09:48.259 00:04:34 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:48.259 00:04:34 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:48.259 ************************************
00:09:48.259 START TEST bdev_qos
00:09:48.259 ************************************
00:09:48.259 00:04:35 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite ''
00:09:48.259 00:04:35 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=3479142
00:09:48.259 00:04:35 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 3479142'
00:09:48.259 Process qos testing pid: 3479142
00:09:48.259 00:04:35 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 ''
00:09:48.259 00:04:35 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT
00:09:48.259 00:04:35 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 3479142
00:09:48.259 00:04:35 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 3479142 ']'
00:09:48.259 00:04:35 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:09:48.259 00:04:35 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100
00:09:48.259 00:04:35 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:09:48.260 00:04:35 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:48.260 00:04:35 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:48.260 [2024-07-16 00:04:35.104478] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:09:48.260 [2024-07-16 00:04:35.104548] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3479142 ] 00:09:48.518 [2024-07-16 00:04:35.240631] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:48.518 [2024-07-16 00:04:35.360280] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:49.455 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:49.455 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:09:49.455 00:04:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:09:49.455 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:49.455 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:49.455 Malloc_0 00:09:49.455 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:49.455 00:04:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:09:49.455 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:09:49.455 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:49.455 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:09:49.455 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:49.455 00:04:36 
blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:49.455 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:49.455 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:49.455 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:49.455 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:49.455 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:09:49.455 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:49.455 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:49.455 [ 00:09:49.455 { 00:09:49.455 "name": "Malloc_0", 00:09:49.455 "aliases": [ 00:09:49.455 "29e8f8c6-9ba2-4a86-9b83-b72ac8b53506" 00:09:49.455 ], 00:09:49.455 "product_name": "Malloc disk", 00:09:49.455 "block_size": 512, 00:09:49.455 "num_blocks": 262144, 00:09:49.455 "uuid": "29e8f8c6-9ba2-4a86-9b83-b72ac8b53506", 00:09:49.455 "assigned_rate_limits": { 00:09:49.455 "rw_ios_per_sec": 0, 00:09:49.455 "rw_mbytes_per_sec": 0, 00:09:49.455 "r_mbytes_per_sec": 0, 00:09:49.455 "w_mbytes_per_sec": 0 00:09:49.455 }, 00:09:49.455 "claimed": false, 00:09:49.455 "zoned": false, 00:09:49.455 "supported_io_types": { 00:09:49.455 "read": true, 00:09:49.455 "write": true, 00:09:49.455 "unmap": true, 00:09:49.456 "flush": true, 00:09:49.456 "reset": true, 00:09:49.456 "nvme_admin": false, 00:09:49.456 "nvme_io": false, 00:09:49.456 "nvme_io_md": false, 00:09:49.456 "write_zeroes": true, 00:09:49.456 "zcopy": true, 00:09:49.456 "get_zone_info": false, 00:09:49.456 "zone_management": false, 00:09:49.456 "zone_append": false, 00:09:49.456 "compare": false, 00:09:49.456 "compare_and_write": false, 00:09:49.456 "abort": true, 00:09:49.456 "seek_hole": false, 00:09:49.456 
"seek_data": false, 00:09:49.456 "copy": true, 00:09:49.456 "nvme_iov_md": false 00:09:49.456 }, 00:09:49.456 "memory_domains": [ 00:09:49.456 { 00:09:49.456 "dma_device_id": "system", 00:09:49.456 "dma_device_type": 1 00:09:49.456 }, 00:09:49.456 { 00:09:49.456 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:49.456 "dma_device_type": 2 00:09:49.456 } 00:09:49.456 ], 00:09:49.456 "driver_specific": {} 00:09:49.456 } 00:09:49.456 ] 00:09:49.456 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:49.456 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:09:49.456 00:04:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:09:49.456 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:49.456 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:49.456 Null_1 00:09:49.456 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:49.456 00:04:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:09:49.456 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:09:49.456 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:49.456 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:09:49.456 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:49.456 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:49.456 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:49.456 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:49.456 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:49.715 00:04:36 
blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:49.715 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:09:49.715 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:49.715 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:49.715 [ 00:09:49.715 { 00:09:49.715 "name": "Null_1", 00:09:49.715 "aliases": [ 00:09:49.715 "2ee86719-b2cf-4fe8-b6af-41d627b064c5" 00:09:49.715 ], 00:09:49.715 "product_name": "Null disk", 00:09:49.715 "block_size": 512, 00:09:49.715 "num_blocks": 262144, 00:09:49.715 "uuid": "2ee86719-b2cf-4fe8-b6af-41d627b064c5", 00:09:49.715 "assigned_rate_limits": { 00:09:49.715 "rw_ios_per_sec": 0, 00:09:49.715 "rw_mbytes_per_sec": 0, 00:09:49.715 "r_mbytes_per_sec": 0, 00:09:49.715 "w_mbytes_per_sec": 0 00:09:49.715 }, 00:09:49.715 "claimed": false, 00:09:49.715 "zoned": false, 00:09:49.715 "supported_io_types": { 00:09:49.715 "read": true, 00:09:49.715 "write": true, 00:09:49.715 "unmap": false, 00:09:49.715 "flush": false, 00:09:49.715 "reset": true, 00:09:49.715 "nvme_admin": false, 00:09:49.715 "nvme_io": false, 00:09:49.715 "nvme_io_md": false, 00:09:49.715 "write_zeroes": true, 00:09:49.715 "zcopy": false, 00:09:49.715 "get_zone_info": false, 00:09:49.715 "zone_management": false, 00:09:49.715 "zone_append": false, 00:09:49.715 "compare": false, 00:09:49.715 "compare_and_write": false, 00:09:49.715 "abort": true, 00:09:49.715 "seek_hole": false, 00:09:49.715 "seek_data": false, 00:09:49.715 "copy": false, 00:09:49.715 "nvme_iov_md": false 00:09:49.715 }, 00:09:49.715 "driver_specific": {} 00:09:49.715 } 00:09:49.715 ] 00:09:49.715 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:49.715 00:04:36 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:09:49.715 00:04:36 blockdev_general.bdev_qos -- 
bdev/blockdev.sh@457 -- # qos_function_test 00:09:49.715 00:04:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:09:49.715 00:04:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:49.715 00:04:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:09:49.715 00:04:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:09:49.715 00:04:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:09:49.715 00:04:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:09:49.715 00:04:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:09:49.715 00:04:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:09:49.715 00:04:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:49.715 00:04:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:49.715 00:04:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:49.715 00:04:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:49.715 00:04:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:09:49.715 Running I/O for 60 seconds... 
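The `get_io_result` steps traced above measure a device's actual rate by piping `iostat.py` output through `grep <bdev> | tail -1` and pulling a column with awk: `$2` for IOPS, `$6` for throughput. A self-contained sketch of that parsing, with the sample line copied from the iostat result shown later in this log (the live pipeline reads from `iostat.py -d -i 1 -t 5`, not from a string):

```shell
# Parse one iostat-style line the way the QoS tests do.
# Sample mirrors the 'Malloc_0 48600.76 ...' result recorded in the log.
iostat_output='Malloc_0 48600.76 194403.02 0.00 0.00 195584.00 0.00 0.00 '

iops=$(echo "$iostat_output" | grep Malloc_0 | tail -1 | awk '{print $2}')
bw=$(echo "$iostat_output" | grep Malloc_0 | tail -1 | awk '{print $6}')
echo "$iops $bw"   # prints: 48600.76 195584.00
```

`tail -1` matters in the live run because `iostat.py -i 1 -t 5` emits several samples and only the last settled one is compared against the limit.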
00:09:54.983 00:04:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 48600.76 194403.02 0.00 0.00 195584.00 0.00 0.00 ' 00:09:54.984 00:04:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:09:54.984 00:04:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:09:54.984 00:04:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=48600.76 00:09:54.984 00:04:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 48600 00:09:54.984 00:04:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=48600 00:09:54.984 00:04:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=12000 00:09:54.984 00:04:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 12000 -gt 1000 ']' 00:09:54.984 00:04:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 12000 Malloc_0 00:09:54.984 00:04:41 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:54.984 00:04:41 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:54.984 00:04:41 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:54.984 00:04:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 12000 IOPS Malloc_0 00:09:54.984 00:04:41 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:54.984 00:04:41 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:54.984 00:04:41 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:54.984 ************************************ 00:09:54.984 START TEST bdev_qos_iops 00:09:54.984 ************************************ 00:09:54.984 00:04:41 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 12000 IOPS Malloc_0 00:09:54.984 00:04:41 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=12000 00:09:54.984 00:04:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:54.984 00:04:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:09:54.984 00:04:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:09:54.984 00:04:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:54.984 00:04:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:54.984 00:04:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:54.984 00:04:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:54.984 00:04:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:10:00.254 00:04:46 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 11998.48 47993.92 0.00 0.00 49440.00 0.00 0.00 ' 00:10:00.254 00:04:46 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:10:00.254 00:04:46 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:10:00.254 00:04:46 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=11998.48 00:10:00.254 00:04:46 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 11998 00:10:00.254 00:04:46 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=11998 00:10:00.254 00:04:46 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:10:00.254 00:04:46 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=10800 00:10:00.254 00:04:46 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=13200 00:10:00.254 00:04:46 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 11998 -lt 10800 ']' 00:10:00.254 00:04:46 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 11998 -gt 13200 ']' 00:10:00.254 00:10:00.254 real 0m5.299s 00:10:00.254 user 0m0.114s 00:10:00.254 sys 0m0.050s 00:10:00.254 00:04:46 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:00.255 00:04:46 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:10:00.255 ************************************ 00:10:00.255 END TEST bdev_qos_iops 00:10:00.255 ************************************ 00:10:00.255 00:04:47 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:00.255 00:04:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:10:00.255 00:04:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:10:00.255 00:04:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:10:00.255 00:04:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:00.255 00:04:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:00.255 00:04:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:10:00.255 00:04:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:10:05.584 00:04:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 15383.16 61532.65 0.00 0.00 63488.00 0.00 0.00 ' 00:10:05.584 00:04:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:05.584 00:04:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:05.584 00:04:52 
blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:05.584 00:04:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=63488.00 00:10:05.584 00:04:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 63488 00:10:05.584 00:04:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=63488 00:10:05.584 00:04:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=6 00:10:05.584 00:04:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 6 -lt 2 ']' 00:10:05.584 00:04:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 6 Null_1 00:10:05.584 00:04:52 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:05.584 00:04:52 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:05.584 00:04:52 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:05.584 00:04:52 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 6 BANDWIDTH Null_1 00:10:05.584 00:04:52 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:05.584 00:04:52 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:05.584 00:04:52 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:05.584 ************************************ 00:10:05.584 START TEST bdev_qos_bw 00:10:05.584 ************************************ 00:10:05.584 00:04:52 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 6 BANDWIDTH Null_1 00:10:05.584 00:04:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=6 00:10:05.584 00:04:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:10:05.584 00:04:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:10:05.584 
00:04:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:10:05.584 00:04:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:10:05.584 00:04:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:05.584 00:04:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:05.584 00:04:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:10:05.584 00:04:52 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:10:10.859 00:04:57 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 1536.93 6147.71 0.00 0.00 6344.00 0.00 0.00 ' 00:10:10.859 00:04:57 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:10.859 00:04:57 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:10.859 00:04:57 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:10.859 00:04:57 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=6344.00 00:10:10.859 00:04:57 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 6344 00:10:10.859 00:04:57 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=6344 00:10:10.859 00:04:57 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:10.859 00:04:57 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=6144 00:10:10.859 00:04:57 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=5529 00:10:10.859 00:04:57 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=6758 00:10:10.859 00:04:57 blockdev_general.bdev_qos.bdev_qos_bw -- 
bdev/blockdev.sh@400 -- # '[' 6344 -lt 5529 ']' 00:10:10.859 00:04:57 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 6344 -gt 6758 ']' 00:10:10.859 00:10:10.859 real 0m5.323s 00:10:10.859 user 0m0.118s 00:10:10.859 sys 0m0.048s 00:10:10.859 00:04:57 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:10.859 00:04:57 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:10:10.859 ************************************ 00:10:10.859 END TEST bdev_qos_bw 00:10:10.859 ************************************ 00:10:10.859 00:04:57 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:10.859 00:04:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:10:10.859 00:04:57 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:10.859 00:04:57 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:11.117 00:04:57 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:11.117 00:04:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:10:11.117 00:04:57 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:11.117 00:04:57 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:11.117 00:04:57 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:11.117 ************************************ 00:10:11.117 START TEST bdev_qos_ro_bw 00:10:11.117 ************************************ 00:10:11.117 00:04:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:10:11.117 00:04:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:10:11.117 00:04:57 
blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:10:11.117 00:04:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:10:11.117 00:04:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:10:11.117 00:04:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:10:11.117 00:04:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:11.117 00:04:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:11.117 00:04:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:10:11.117 00:04:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:10:16.386 00:05:03 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 511.30 2045.19 0.00 0.00 2052.00 0.00 0.00 ' 00:10:16.386 00:05:03 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:16.386 00:05:03 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:16.386 00:05:03 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:16.386 00:05:03 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2052.00 00:10:16.386 00:05:03 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2052 00:10:16.386 00:05:03 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2052 00:10:16.386 00:05:03 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:16.386 00:05:03 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 
00:10:16.386 00:05:03 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:10:16.386 00:05:03 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:10:16.386 00:05:03 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2052 -lt 1843 ']' 00:10:16.386 00:05:03 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2052 -gt 2252 ']' 00:10:16.386 00:10:16.386 real 0m5.192s 00:10:16.386 user 0m0.121s 00:10:16.386 sys 0m0.046s 00:10:16.386 00:05:03 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:16.386 00:05:03 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:10:16.386 ************************************ 00:10:16.386 END TEST bdev_qos_ro_bw 00:10:16.386 ************************************ 00:10:16.386 00:05:03 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:16.386 00:05:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:10:16.386 00:05:03 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:16.386 00:05:03 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:16.954 00:05:03 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:16.954 00:05:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:10:16.954 00:05:03 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:16.954 00:05:03 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:16.954 00:10:16.954 Latency(us) 00:10:16.954 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:16.954 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:16.954 Malloc_0 : 27.03 16409.21 64.10 0.00 0.00 15455.68 2550.21 503316.48 
00:10:16.954 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:16.954 Null_1 : 27.23 15763.38 61.58 0.00 0.00 16185.55 1011.53 197861.73 00:10:16.954 =================================================================================================================== 00:10:16.954 Total : 32172.60 125.67 0.00 0.00 15814.63 1011.53 503316.48 00:10:16.954 0 00:10:16.954 00:05:03 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:16.954 00:05:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 3479142 00:10:16.954 00:05:03 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 3479142 ']' 00:10:16.954 00:05:03 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 3479142 00:10:16.954 00:05:03 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname 00:10:16.954 00:05:03 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:16.954 00:05:03 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3479142 00:10:17.213 00:05:03 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:10:17.213 00:05:03 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:10:17.213 00:05:03 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3479142' 00:10:17.213 killing process with pid 3479142 00:10:17.213 00:05:03 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 3479142 00:10:17.213 Received shutdown signal, test time was about 27.295134 seconds 00:10:17.213 00:10:17.213 Latency(us) 00:10:17.213 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:17.213 =================================================================================================================== 00:10:17.213 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:17.213 
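Each `run_qos_test` pass/fail decision traced above accepts any measured rate within roughly ±10% of the configured limit, with bash integer truncation: the log shows 12000 IOPS giving bounds 10800..13200, a 6144 KB/s limit giving 5529..6758, and 2048 giving 1843..2252. A minimal sketch of that bounds arithmetic (the exact scaling expression in blockdev.sh may differ; this reproduces the values recorded here):

```shell
# Compute the +/-10% acceptance window used by the QoS checks.
# Bash arithmetic truncates, matching the 5529/6758 bounds in the log.
qos_limit=12000
lower_limit=$((qos_limit * 9 / 10))
upper_limit=$((qos_limit * 11 / 10))
echo "$lower_limit $upper_limit"   # prints: 10800 13200
```

A measured result of 11998 IOPS therefore passes both `-lt 10800` and `-gt 13200` guards, which is why the `bdev_qos_iops` test above ends in `END TEST ... return 0`.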
00:05:03 blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 3479142 00:10:17.472 00:05:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:10:17.472 00:10:17.472 real 0m29.156s 00:10:17.472 user 0m30.085s 00:10:17.472 sys 0m1.001s 00:10:17.472 00:05:04 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:17.473 00:05:04 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:17.473 ************************************ 00:10:17.473 END TEST bdev_qos 00:10:17.473 ************************************ 00:10:17.473 00:05:04 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:17.473 00:05:04 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:10:17.473 00:05:04 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:17.473 00:05:04 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:17.473 00:05:04 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:17.473 ************************************ 00:10:17.473 START TEST bdev_qd_sampling 00:10:17.473 ************************************ 00:10:17.473 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite '' 00:10:17.473 00:05:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:10:17.473 00:05:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=3483082 00:10:17.473 00:05:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 3483082' 00:10:17.473 Process bdev QD sampling period testing pid: 3483082 00:10:17.473 00:05:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:10:17.473 00:05:04 
blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:10:17.473 00:05:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 3483082 00:10:17.473 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 3483082 ']' 00:10:17.473 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:17.473 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:17.473 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:17.473 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:17.473 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:17.473 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:17.473 [2024-07-16 00:05:04.349082] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
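The qd_sampling suite sets a 10 ms polling period on Malloc_QD and then reads `bdev_get_iostat`; the counters it returns are enough to reconstruct both the average queue depth and the mean read latency. A minimal sketch using the figures this run's iostat dump reports, under the usual SPDK accounting assumption that `io_time` accumulates polling periods with I/O outstanding while `weighted_io_time` additionally scales each period by the measured depth:

```python
# Derive average queue depth and mean read latency for Malloc_QD from this
# run's bdev_get_iostat counters (figures copied from the JSON dump in the log).
tick_rate = 2_300_000_000            # ticks per second
read_latency_ticks = 2_238_067_236_302
num_read_ops = 171_012
io_time = 30                         # ms with I/O outstanding during sampling
weighted_io_time = 15_360            # ms, weighted by measured queue depth

avg_qd = weighted_io_time / io_time  # 512.0: depth pinned at 2 jobs x 256
avg_lat_us = read_latency_ticks / num_read_ops / tick_rate * 1e6
print(avg_qd, round(avg_lat_us, 2))
```

The derived average depth matches the instantaneous `"queue_depth": 512` field (the depth was constant), and the ~5690 us mean latency agrees with the table's 5681.92 us overall average to within a fraction of a percent.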
00:10:17.473 [2024-07-16 00:05:04.349155] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3483082 ] 00:10:17.731 [2024-07-16 00:05:04.469690] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:17.731 [2024-07-16 00:05:04.579126] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:17.731 [2024-07-16 00:05:04.579132] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:17.991 Malloc_QD 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:17.991 00:05:04 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:17.991 [ 00:10:17.991 { 00:10:17.991 "name": "Malloc_QD", 00:10:17.991 "aliases": [ 00:10:17.991 "6a8c76cc-c5b4-43b0-a04e-019d15bc3168" 00:10:17.991 ], 00:10:17.991 "product_name": "Malloc disk", 00:10:17.991 "block_size": 512, 00:10:17.991 "num_blocks": 262144, 00:10:17.991 "uuid": "6a8c76cc-c5b4-43b0-a04e-019d15bc3168", 00:10:17.991 "assigned_rate_limits": { 00:10:17.991 "rw_ios_per_sec": 0, 00:10:17.991 "rw_mbytes_per_sec": 0, 00:10:17.991 "r_mbytes_per_sec": 0, 00:10:17.991 "w_mbytes_per_sec": 0 00:10:17.991 }, 00:10:17.991 "claimed": false, 00:10:17.991 "zoned": false, 00:10:17.991 "supported_io_types": { 00:10:17.991 "read": true, 00:10:17.991 "write": true, 00:10:17.991 "unmap": true, 00:10:17.991 "flush": true, 00:10:17.991 "reset": true, 00:10:17.991 "nvme_admin": false, 00:10:17.991 "nvme_io": false, 00:10:17.991 "nvme_io_md": false, 00:10:17.991 "write_zeroes": true, 00:10:17.991 "zcopy": true, 00:10:17.991 "get_zone_info": false, 00:10:17.991 "zone_management": false, 00:10:17.991 "zone_append": false, 00:10:17.991 "compare": false, 00:10:17.991 "compare_and_write": false, 00:10:17.991 "abort": true, 00:10:17.991 "seek_hole": false, 00:10:17.991 "seek_data": false, 00:10:17.991 "copy": true, 
00:10:17.991 "nvme_iov_md": false 00:10:17.991 }, 00:10:17.991 "memory_domains": [ 00:10:17.991 { 00:10:17.991 "dma_device_id": "system", 00:10:17.991 "dma_device_type": 1 00:10:17.991 }, 00:10:17.991 { 00:10:17.991 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:17.991 "dma_device_type": 2 00:10:17.991 } 00:10:17.991 ], 00:10:17.991 "driver_specific": {} 00:10:17.991 } 00:10:17.991 ] 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:10:17.991 00:05:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:18.250 Running I/O for 5 seconds... 00:10:20.154 00:05:06 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:10:20.154 00:05:06 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:10:20.154 00:05:06 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:10:20.154 00:05:06 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:10:20.154 00:05:06 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:10:20.154 00:05:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:20.154 00:05:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:20.154 00:05:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:20.154 00:05:06 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:10:20.154 00:05:06 blockdev_general.bdev_qd_sampling -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:10:20.154 00:05:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:20.154 00:05:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:20.154 00:05:06 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{ 00:10:20.154 "tick_rate": 2300000000, 00:10:20.154 "ticks": 5447219009179648, 00:10:20.154 "bdevs": [ 00:10:20.154 { 00:10:20.154 "name": "Malloc_QD", 00:10:20.154 "bytes_read": 700494336, 00:10:20.154 "num_read_ops": 171012, 00:10:20.154 "bytes_written": 0, 00:10:20.154 "num_write_ops": 0, 00:10:20.154 "bytes_unmapped": 0, 00:10:20.154 "num_unmap_ops": 0, 00:10:20.154 "bytes_copied": 0, 00:10:20.154 "num_copy_ops": 0, 00:10:20.154 "read_latency_ticks": 2238067236302, 00:10:20.154 "max_read_latency_ticks": 17604646, 00:10:20.154 "min_read_latency_ticks": 269058, 00:10:20.154 "write_latency_ticks": 0, 00:10:20.154 "max_write_latency_ticks": 0, 00:10:20.154 "min_write_latency_ticks": 0, 00:10:20.154 "unmap_latency_ticks": 0, 00:10:20.154 "max_unmap_latency_ticks": 0, 00:10:20.154 "min_unmap_latency_ticks": 0, 00:10:20.154 "copy_latency_ticks": 0, 00:10:20.154 "max_copy_latency_ticks": 0, 00:10:20.154 "min_copy_latency_ticks": 0, 00:10:20.154 "io_error": {}, 00:10:20.154 "queue_depth_polling_period": 10, 00:10:20.154 "queue_depth": 512, 00:10:20.154 "io_time": 30, 00:10:20.154 "weighted_io_time": 15360 00:10:20.154 } 00:10:20.154 ] 00:10:20.154 }' 00:10:20.154 00:05:06 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:10:20.154 00:05:06 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:10:20.154 00:05:06 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:10:20.154 00:05:06 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:10:20.154 00:05:06 
blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:10:20.154 00:05:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:20.154 00:05:06 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:20.154 00:10:20.154 Latency(us) 00:10:20.154 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:20.154 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:20.154 Malloc_QD : 1.98 49501.57 193.37 0.00 0.00 5158.85 1396.20 5499.33 00:10:20.154 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:20.154 Malloc_QD : 1.98 40387.24 157.76 0.00 0.00 6321.97 1246.61 7693.36 00:10:20.154 =================================================================================================================== 00:10:20.154 Total : 89888.81 351.13 0.00 0.00 5681.92 1246.61 7693.36 00:10:20.154 0 00:10:20.154 00:05:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:20.154 00:05:07 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 3483082 00:10:20.154 00:05:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 3483082 ']' 00:10:20.154 00:05:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 3483082 00:10:20.154 00:05:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname 00:10:20.154 00:05:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:20.154 00:05:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3483082 00:10:20.154 00:05:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:20.154 00:05:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo 
']' 00:10:20.154 00:05:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3483082' 00:10:20.154 killing process with pid 3483082 00:10:20.154 00:05:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 3483082 00:10:20.154 Received shutdown signal, test time was about 2.062613 seconds 00:10:20.154 00:10:20.154 Latency(us) 00:10:20.154 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:20.154 =================================================================================================================== 00:10:20.154 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:20.154 00:05:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@972 -- # wait 3483082 00:10:20.412 00:05:07 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:10:20.413 00:10:20.413 real 0m2.992s 00:10:20.413 user 0m5.883s 00:10:20.413 sys 0m0.434s 00:10:20.413 00:05:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:20.413 00:05:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:20.413 ************************************ 00:10:20.413 END TEST bdev_qd_sampling 00:10:20.413 ************************************ 00:10:20.413 00:05:07 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:20.413 00:05:07 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:10:20.413 00:05:07 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:20.413 00:05:07 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:20.413 00:05:07 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:20.671 ************************************ 00:10:20.671 START TEST bdev_error 00:10:20.671 ************************************ 00:10:20.671 00:05:07 blockdev_general.bdev_error -- 
common/autotest_common.sh@1123 -- # error_test_suite '' 00:10:20.671 00:05:07 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:10:20.671 00:05:07 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:10:20.671 00:05:07 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:10:20.671 00:05:07 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=3483469 00:10:20.671 00:05:07 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 3483469' 00:10:20.671 Process error testing pid: 3483469 00:10:20.671 00:05:07 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:10:20.671 00:05:07 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 3483469 00:10:20.671 00:05:07 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 3483469 ']' 00:10:20.671 00:05:07 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:20.671 00:05:07 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:20.671 00:05:07 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:20.671 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:20.671 00:05:07 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:20.671 00:05:07 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:20.671 [2024-07-16 00:05:07.426034] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
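The error suite stacks an error-injection bdev (EE_Dev_1) on top of Dev_1 and arms it with `bdev_error_inject_error EE_Dev_1 all failure -n 5`, so exactly 5 I/Os fail before traffic falls through cleanly to the base device. The Fail/s column of the run's latency table recovers that count; a quick sanity check (figures copied from the EE_Dev_1 row):

```python
# Recover the injected error count from bdevperf's summary table:
# EE_Dev_1 ran for 0.89 s at 5.59 Fail/s, and the injection was armed
# with `bdev_error_inject_error EE_Dev_1 all failure -n 5`.
runtime_s = 0.89
fails_per_s = 5.59
injected = round(runtime_s * fails_per_s)
print(injected)  # 5
```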
00:10:20.671 [2024-07-16 00:05:07.426104] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3483469 ] 00:10:20.671 [2024-07-16 00:05:07.565846] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:20.929 [2024-07-16 00:05:07.702163] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:10:21.494 00:05:08 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:21.494 Dev_1 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:21.494 00:05:08 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:21.494 
00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:21.494 [ 00:10:21.494 { 00:10:21.494 "name": "Dev_1", 00:10:21.494 "aliases": [ 00:10:21.494 "b1fb5af8-8be3-4e2b-87b3-b2d93f20bbca" 00:10:21.494 ], 00:10:21.494 "product_name": "Malloc disk", 00:10:21.494 "block_size": 512, 00:10:21.494 "num_blocks": 262144, 00:10:21.494 "uuid": "b1fb5af8-8be3-4e2b-87b3-b2d93f20bbca", 00:10:21.494 "assigned_rate_limits": { 00:10:21.494 "rw_ios_per_sec": 0, 00:10:21.494 "rw_mbytes_per_sec": 0, 00:10:21.494 "r_mbytes_per_sec": 0, 00:10:21.494 "w_mbytes_per_sec": 0 00:10:21.494 }, 00:10:21.494 "claimed": false, 00:10:21.494 "zoned": false, 00:10:21.494 "supported_io_types": { 00:10:21.494 "read": true, 00:10:21.494 "write": true, 00:10:21.494 "unmap": true, 00:10:21.494 "flush": true, 00:10:21.494 "reset": true, 00:10:21.494 "nvme_admin": false, 00:10:21.494 "nvme_io": false, 00:10:21.494 "nvme_io_md": false, 00:10:21.494 "write_zeroes": true, 00:10:21.494 "zcopy": true, 00:10:21.494 "get_zone_info": false, 00:10:21.494 "zone_management": false, 00:10:21.494 "zone_append": false, 00:10:21.494 "compare": false, 00:10:21.494 "compare_and_write": false, 00:10:21.494 "abort": true, 00:10:21.494 "seek_hole": false, 00:10:21.494 "seek_data": false, 00:10:21.494 "copy": true, 00:10:21.494 "nvme_iov_md": false 00:10:21.494 }, 00:10:21.494 "memory_domains": [ 00:10:21.494 { 00:10:21.494 "dma_device_id": "system", 00:10:21.494 "dma_device_type": 1 00:10:21.494 }, 00:10:21.494 { 00:10:21.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:10:21.494 "dma_device_type": 2 00:10:21.494 } 00:10:21.494 ], 00:10:21.494 "driver_specific": {} 00:10:21.494 } 00:10:21.494 ] 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:21.494 00:05:08 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:21.494 true 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:21.494 00:05:08 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:21.494 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:21.752 Dev_2 00:10:21.752 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:21.752 00:05:08 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:10:21.752 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:10:21.752 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:21.752 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:21.752 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:21.752 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:21.752 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:21.752 00:05:08 blockdev_general.bdev_error -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:10:21.752 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:21.752 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:21.752 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:21.752 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:21.752 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:21.752 [ 00:10:21.752 { 00:10:21.752 "name": "Dev_2", 00:10:21.752 "aliases": [ 00:10:21.752 "23ed4230-341c-4a3c-a56a-50a82673f4c9" 00:10:21.752 ], 00:10:21.752 "product_name": "Malloc disk", 00:10:21.752 "block_size": 512, 00:10:21.752 "num_blocks": 262144, 00:10:21.752 "uuid": "23ed4230-341c-4a3c-a56a-50a82673f4c9", 00:10:21.752 "assigned_rate_limits": { 00:10:21.752 "rw_ios_per_sec": 0, 00:10:21.752 "rw_mbytes_per_sec": 0, 00:10:21.752 "r_mbytes_per_sec": 0, 00:10:21.752 "w_mbytes_per_sec": 0 00:10:21.752 }, 00:10:21.752 "claimed": false, 00:10:21.752 "zoned": false, 00:10:21.752 "supported_io_types": { 00:10:21.752 "read": true, 00:10:21.752 "write": true, 00:10:21.752 "unmap": true, 00:10:21.752 "flush": true, 00:10:21.752 "reset": true, 00:10:21.752 "nvme_admin": false, 00:10:21.752 "nvme_io": false, 00:10:21.752 "nvme_io_md": false, 00:10:21.752 "write_zeroes": true, 00:10:21.752 "zcopy": true, 00:10:21.752 "get_zone_info": false, 00:10:21.752 "zone_management": false, 00:10:21.752 "zone_append": false, 00:10:21.752 "compare": false, 00:10:21.752 "compare_and_write": false, 00:10:21.752 "abort": true, 00:10:21.752 "seek_hole": false, 00:10:21.752 "seek_data": false, 00:10:21.752 "copy": true, 00:10:21.752 "nvme_iov_md": false 00:10:21.752 }, 00:10:21.752 "memory_domains": [ 00:10:21.752 { 00:10:21.752 "dma_device_id": "system", 00:10:21.752 "dma_device_type": 1 00:10:21.752 }, 00:10:21.752 { 
00:10:21.752 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:21.752 "dma_device_type": 2 00:10:21.752 } 00:10:21.753 ], 00:10:21.753 "driver_specific": {} 00:10:21.753 } 00:10:21.753 ] 00:10:21.753 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:21.753 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:21.753 00:05:08 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:21.753 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:21.753 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:21.753 00:05:08 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:21.753 00:05:08 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:10:21.753 00:05:08 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:21.753 Running I/O for 5 seconds... 00:10:22.688 00:05:09 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 3483469 00:10:22.688 00:05:09 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 3483469' 00:10:22.688 Process is existed as continue on error is set. 
Pid: 3483469 00:10:22.688 00:05:09 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:10:22.688 00:05:09 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:22.688 00:05:09 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:22.688 00:05:09 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:22.688 00:05:09 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:10:22.688 00:05:09 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:22.688 00:05:09 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:22.688 00:05:09 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:22.688 00:05:09 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:10:22.688 Timeout while waiting for response: 00:10:22.688 00:10:22.688 00:10:26.881 00:10:26.881 Latency(us) 00:10:26.881 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:26.881 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:26.881 EE_Dev_1 : 0.89 29034.73 113.42 5.59 0.00 546.30 165.62 865.50 00:10:26.881 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:26.881 Dev_2 : 5.00 62649.61 244.73 0.00 0.00 250.87 85.04 29633.67 00:10:26.881 =================================================================================================================== 00:10:26.881 Total : 91684.35 358.14 5.59 0.00 273.49 85.04 29633.67 00:10:27.890 00:05:14 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 3483469 00:10:27.890 00:05:14 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 3483469 ']' 00:10:27.890 00:05:14 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 3483469 00:10:27.890 00:05:14 
blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # uname 00:10:27.890 00:05:14 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:27.890 00:05:14 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3483469 00:10:27.890 00:05:14 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:10:27.890 00:05:14 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:10:27.890 00:05:14 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3483469' 00:10:27.890 killing process with pid 3483469 00:10:27.890 00:05:14 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 3483469 00:10:27.890 Received shutdown signal, test time was about 5.000000 seconds 00:10:27.890 00:10:27.890 Latency(us) 00:10:27.890 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:27.890 =================================================================================================================== 00:10:27.890 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:27.890 00:05:14 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 3483469 00:10:28.150 00:05:14 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=3484472 00:10:28.150 00:05:14 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 3484472' 00:10:28.150 Process error testing pid: 3484472 00:10:28.150 00:05:14 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:10:28.150 00:05:14 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 3484472 00:10:28.150 00:05:14 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 3484472 ']' 00:10:28.150 00:05:14 
blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:28.150 00:05:14 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:28.150 00:05:14 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:28.150 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:28.150 00:05:14 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:28.150 00:05:14 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:28.150 [2024-07-16 00:05:15.025831] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:10:28.150 [2024-07-16 00:05:15.025906] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3484472 ] 00:10:28.410 [2024-07-16 00:05:15.163699] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:28.410 [2024-07-16 00:05:15.296189] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:29.350 00:05:15 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:29.350 00:05:15 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:10:29.350 00:05:15 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:29.350 00:05:15 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:29.350 00:05:15 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:29.350 Dev_1 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:29.350 00:05:16 
blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:29.350 [ 00:10:29.350 { 00:10:29.350 "name": "Dev_1", 00:10:29.350 "aliases": [ 00:10:29.350 "c3650df9-9ea5-4097-acd6-b489b443c3a1" 00:10:29.350 ], 00:10:29.350 "product_name": "Malloc disk", 00:10:29.350 "block_size": 512, 00:10:29.350 "num_blocks": 262144, 00:10:29.350 "uuid": "c3650df9-9ea5-4097-acd6-b489b443c3a1", 00:10:29.350 "assigned_rate_limits": { 00:10:29.350 "rw_ios_per_sec": 0, 00:10:29.350 "rw_mbytes_per_sec": 0, 00:10:29.350 "r_mbytes_per_sec": 0, 00:10:29.350 "w_mbytes_per_sec": 0 00:10:29.350 }, 00:10:29.350 "claimed": false, 00:10:29.350 "zoned": false, 00:10:29.350 "supported_io_types": { 00:10:29.350 "read": true, 00:10:29.350 
"write": true, 00:10:29.350 "unmap": true, 00:10:29.350 "flush": true, 00:10:29.350 "reset": true, 00:10:29.350 "nvme_admin": false, 00:10:29.350 "nvme_io": false, 00:10:29.350 "nvme_io_md": false, 00:10:29.350 "write_zeroes": true, 00:10:29.350 "zcopy": true, 00:10:29.350 "get_zone_info": false, 00:10:29.350 "zone_management": false, 00:10:29.350 "zone_append": false, 00:10:29.350 "compare": false, 00:10:29.350 "compare_and_write": false, 00:10:29.350 "abort": true, 00:10:29.350 "seek_hole": false, 00:10:29.350 "seek_data": false, 00:10:29.350 "copy": true, 00:10:29.350 "nvme_iov_md": false 00:10:29.350 }, 00:10:29.350 "memory_domains": [ 00:10:29.350 { 00:10:29.350 "dma_device_id": "system", 00:10:29.350 "dma_device_type": 1 00:10:29.350 }, 00:10:29.350 { 00:10:29.350 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:29.350 "dma_device_type": 2 00:10:29.350 } 00:10:29.350 ], 00:10:29.350 "driver_specific": {} 00:10:29.350 } 00:10:29.350 ] 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:29.350 00:05:16 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:29.350 true 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:29.350 00:05:16 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:29.350 Dev_2 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@587 
-- # [[ 0 == 0 ]] 00:10:29.350 00:05:16 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:29.350 [ 00:10:29.350 { 00:10:29.350 "name": "Dev_2", 00:10:29.350 "aliases": [ 00:10:29.350 "43fc34a6-4f3d-4b0f-9139-35feb71ea174" 00:10:29.350 ], 00:10:29.350 "product_name": "Malloc disk", 00:10:29.350 "block_size": 512, 00:10:29.350 "num_blocks": 262144, 00:10:29.350 "uuid": "43fc34a6-4f3d-4b0f-9139-35feb71ea174", 00:10:29.350 "assigned_rate_limits": { 00:10:29.350 "rw_ios_per_sec": 0, 00:10:29.350 "rw_mbytes_per_sec": 0, 00:10:29.350 "r_mbytes_per_sec": 0, 00:10:29.350 "w_mbytes_per_sec": 0 00:10:29.350 }, 00:10:29.350 "claimed": false, 00:10:29.350 "zoned": false, 00:10:29.350 "supported_io_types": { 
00:10:29.350 "read": true, 00:10:29.350 "write": true, 00:10:29.350 "unmap": true, 00:10:29.350 "flush": true, 00:10:29.350 "reset": true, 00:10:29.350 "nvme_admin": false, 00:10:29.350 "nvme_io": false, 00:10:29.350 "nvme_io_md": false, 00:10:29.350 "write_zeroes": true, 00:10:29.350 "zcopy": true, 00:10:29.350 "get_zone_info": false, 00:10:29.350 "zone_management": false, 00:10:29.350 "zone_append": false, 00:10:29.350 "compare": false, 00:10:29.350 "compare_and_write": false, 00:10:29.350 "abort": true, 00:10:29.350 "seek_hole": false, 00:10:29.350 "seek_data": false, 00:10:29.350 "copy": true, 00:10:29.350 "nvme_iov_md": false 00:10:29.350 }, 00:10:29.350 "memory_domains": [ 00:10:29.350 { 00:10:29.350 "dma_device_id": "system", 00:10:29.350 "dma_device_type": 1 00:10:29.350 }, 00:10:29.350 { 00:10:29.350 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:29.350 "dma_device_type": 2 00:10:29.350 } 00:10:29.350 ], 00:10:29.350 "driver_specific": {} 00:10:29.350 } 00:10:29.350 ] 00:10:29.350 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:29.351 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:29.351 00:05:16 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:29.351 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:29.351 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:29.351 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:29.351 00:05:16 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 3484472 00:10:29.351 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:10:29.351 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 3484472 00:10:29.351 00:05:16 blockdev_general.bdev_error -- 
bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:29.351 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:10:29.351 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:29.351 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:10:29.351 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:29.351 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 3484472 00:10:29.351 Running I/O for 5 seconds... 00:10:29.351 task offset: 168896 on job bdev=EE_Dev_1 fails 00:10:29.351 00:10:29.351 Latency(us) 00:10:29.351 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:29.351 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:29.351 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:10:29.351 EE_Dev_1 : 0.00 23255.81 90.84 5285.41 0.00 466.51 166.51 829.89 00:10:29.351 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:29.351 Dev_2 : 0.00 14304.87 55.88 0.00 0.00 835.44 161.17 1552.92 00:10:29.351 =================================================================================================================== 00:10:29.351 Total : 37560.69 146.72 5285.41 0.00 666.60 161.17 1552.92 00:10:29.351 [2024-07-16 00:05:16.286897] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:29.351 request: 00:10:29.351 { 00:10:29.351 "method": "perform_tests", 00:10:29.351 "req_id": 1 00:10:29.351 } 00:10:29.351 Got JSON-RPC error response 00:10:29.351 response: 00:10:29.351 { 00:10:29.351 "code": -32603, 00:10:29.351 "message": "bdevperf failed with error Operation not permitted" 00:10:29.351 } 00:10:29.919 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # 
es=255 00:10:29.919 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:29.919 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:10:29.919 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:10:29.919 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:10:29.919 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:29.919 00:10:29.919 real 0m9.248s 00:10:29.919 user 0m9.478s 00:10:29.919 sys 0m1.032s 00:10:29.919 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:29.919 00:05:16 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:29.919 ************************************ 00:10:29.919 END TEST bdev_error 00:10:29.919 ************************************ 00:10:29.919 00:05:16 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:29.919 00:05:16 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:10:29.919 00:05:16 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:29.919 00:05:16 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:29.919 00:05:16 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:29.919 ************************************ 00:10:29.919 START TEST bdev_stat 00:10:29.919 ************************************ 00:10:29.919 00:05:16 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:10:29.919 00:05:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:10:29.919 00:05:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=3484736 00:10:29.919 00:05:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 3484736' 00:10:29.919 Process Bdev IO statistics 
testing pid: 3484736 00:10:29.919 00:05:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:10:29.920 00:05:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:10:29.920 00:05:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 3484736 00:10:29.920 00:05:16 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 3484736 ']' 00:10:29.920 00:05:16 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:29.920 00:05:16 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:29.920 00:05:16 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:29.920 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:29.920 00:05:16 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:29.920 00:05:16 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:29.920 [2024-07-16 00:05:16.806040] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:10:29.920 [2024-07-16 00:05:16.806177] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3484736 ] 00:10:30.179 [2024-07-16 00:05:17.001268] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:30.179 [2024-07-16 00:05:17.099469] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:30.179 [2024-07-16 00:05:17.099474] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:30.747 00:05:17 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:30.747 00:05:17 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:10:30.747 00:05:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:10:30.747 00:05:17 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:30.747 00:05:17 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:31.006 Malloc_STAT 00:10:31.006 00:05:17 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:31.006 00:05:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:10:31.006 00:05:17 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:10:31.006 00:05:17 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:31.006 00:05:17 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:10:31.006 00:05:17 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:31.006 00:05:17 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:31.006 00:05:17 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 
00:10:31.006 00:05:17 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:31.006 00:05:17 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:31.006 00:05:17 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:31.006 00:05:17 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:10:31.006 00:05:17 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:31.006 00:05:17 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:31.006 [ 00:10:31.006 { 00:10:31.006 "name": "Malloc_STAT", 00:10:31.006 "aliases": [ 00:10:31.006 "d9f81a80-402b-4762-bae3-99362ee0406e" 00:10:31.006 ], 00:10:31.006 "product_name": "Malloc disk", 00:10:31.006 "block_size": 512, 00:10:31.006 "num_blocks": 262144, 00:10:31.006 "uuid": "d9f81a80-402b-4762-bae3-99362ee0406e", 00:10:31.006 "assigned_rate_limits": { 00:10:31.006 "rw_ios_per_sec": 0, 00:10:31.006 "rw_mbytes_per_sec": 0, 00:10:31.006 "r_mbytes_per_sec": 0, 00:10:31.006 "w_mbytes_per_sec": 0 00:10:31.006 }, 00:10:31.006 "claimed": false, 00:10:31.006 "zoned": false, 00:10:31.006 "supported_io_types": { 00:10:31.006 "read": true, 00:10:31.006 "write": true, 00:10:31.006 "unmap": true, 00:10:31.006 "flush": true, 00:10:31.006 "reset": true, 00:10:31.006 "nvme_admin": false, 00:10:31.006 "nvme_io": false, 00:10:31.006 "nvme_io_md": false, 00:10:31.006 "write_zeroes": true, 00:10:31.006 "zcopy": true, 00:10:31.006 "get_zone_info": false, 00:10:31.006 "zone_management": false, 00:10:31.006 "zone_append": false, 00:10:31.006 "compare": false, 00:10:31.006 "compare_and_write": false, 00:10:31.006 "abort": true, 00:10:31.006 "seek_hole": false, 00:10:31.006 "seek_data": false, 00:10:31.006 "copy": true, 00:10:31.006 "nvme_iov_md": false 00:10:31.006 }, 00:10:31.006 "memory_domains": [ 00:10:31.006 { 00:10:31.006 "dma_device_id": "system", 
00:10:31.006 "dma_device_type": 1 00:10:31.006 }, 00:10:31.006 { 00:10:31.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:31.006 "dma_device_type": 2 00:10:31.006 } 00:10:31.006 ], 00:10:31.006 "driver_specific": {} 00:10:31.006 } 00:10:31.006 ] 00:10:31.006 00:05:17 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:31.006 00:05:17 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:10:31.006 00:05:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:10:31.006 00:05:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:31.006 Running I/O for 10 seconds... 00:10:32.915 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:10:32.915 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:10:32.915 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:10:32.915 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:10:32.915 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:10:32.915 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:10:32.915 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:10:32.915 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:10:32.915 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:10:32.915 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:32.915 00:05:19 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:32.915 00:05:19 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:32.915 
00:05:19 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:32.915 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:10:32.915 "tick_rate": 2300000000, 00:10:32.915 "ticks": 5447248539317678, 00:10:32.915 "bdevs": [ 00:10:32.915 { 00:10:32.915 "name": "Malloc_STAT", 00:10:32.915 "bytes_read": 698397184, 00:10:32.915 "num_read_ops": 170500, 00:10:32.915 "bytes_written": 0, 00:10:32.915 "num_write_ops": 0, 00:10:32.915 "bytes_unmapped": 0, 00:10:32.915 "num_unmap_ops": 0, 00:10:32.915 "bytes_copied": 0, 00:10:32.915 "num_copy_ops": 0, 00:10:32.915 "read_latency_ticks": 2228906004968, 00:10:32.915 "max_read_latency_ticks": 17623934, 00:10:32.915 "min_read_latency_ticks": 268500, 00:10:32.915 "write_latency_ticks": 0, 00:10:32.915 "max_write_latency_ticks": 0, 00:10:32.915 "min_write_latency_ticks": 0, 00:10:32.915 "unmap_latency_ticks": 0, 00:10:32.915 "max_unmap_latency_ticks": 0, 00:10:32.915 "min_unmap_latency_ticks": 0, 00:10:32.915 "copy_latency_ticks": 0, 00:10:32.915 "max_copy_latency_ticks": 0, 00:10:32.915 "min_copy_latency_ticks": 0, 00:10:32.915 "io_error": {} 00:10:32.915 } 00:10:32.915 ] 00:10:32.915 }' 00:10:32.915 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:10:32.915 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=170500 00:10:32.915 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:10:32.915 00:05:19 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:32.915 00:05:19 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:33.174 00:05:19 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:33.174 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:10:33.174 "tick_rate": 2300000000, 00:10:33.174 "ticks": 5447248695863660, 
00:10:33.174 "name": "Malloc_STAT", 00:10:33.174 "channels": [ 00:10:33.174 { 00:10:33.174 "thread_id": 2, 00:10:33.174 "bytes_read": 399507456, 00:10:33.174 "num_read_ops": 97536, 00:10:33.174 "bytes_written": 0, 00:10:33.174 "num_write_ops": 0, 00:10:33.174 "bytes_unmapped": 0, 00:10:33.174 "num_unmap_ops": 0, 00:10:33.174 "bytes_copied": 0, 00:10:33.174 "num_copy_ops": 0, 00:10:33.174 "read_latency_ticks": 1153776723230, 00:10:33.174 "max_read_latency_ticks": 12652582, 00:10:33.174 "min_read_latency_ticks": 8256012, 00:10:33.174 "write_latency_ticks": 0, 00:10:33.174 "max_write_latency_ticks": 0, 00:10:33.174 "min_write_latency_ticks": 0, 00:10:33.174 "unmap_latency_ticks": 0, 00:10:33.174 "max_unmap_latency_ticks": 0, 00:10:33.174 "min_unmap_latency_ticks": 0, 00:10:33.174 "copy_latency_ticks": 0, 00:10:33.174 "max_copy_latency_ticks": 0, 00:10:33.174 "min_copy_latency_ticks": 0 00:10:33.174 }, 00:10:33.174 { 00:10:33.175 "thread_id": 3, 00:10:33.175 "bytes_read": 324009984, 00:10:33.175 "num_read_ops": 79104, 00:10:33.175 "bytes_written": 0, 00:10:33.175 "num_write_ops": 0, 00:10:33.175 "bytes_unmapped": 0, 00:10:33.175 "num_unmap_ops": 0, 00:10:33.175 "bytes_copied": 0, 00:10:33.175 "num_copy_ops": 0, 00:10:33.175 "read_latency_ticks": 1156069905644, 00:10:33.175 "max_read_latency_ticks": 17623934, 00:10:33.175 "min_read_latency_ticks": 9643926, 00:10:33.175 "write_latency_ticks": 0, 00:10:33.175 "max_write_latency_ticks": 0, 00:10:33.175 "min_write_latency_ticks": 0, 00:10:33.175 "unmap_latency_ticks": 0, 00:10:33.175 "max_unmap_latency_ticks": 0, 00:10:33.175 "min_unmap_latency_ticks": 0, 00:10:33.175 "copy_latency_ticks": 0, 00:10:33.175 "max_copy_latency_ticks": 0, 00:10:33.175 "min_copy_latency_ticks": 0 00:10:33.175 } 00:10:33.175 ] 00:10:33.175 }' 00:10:33.175 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:10:33.175 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # 
io_count_per_channel1=97536 00:10:33.175 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=97536 00:10:33.175 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:10:33.175 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=79104 00:10:33.175 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=176640 00:10:33.175 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:33.175 00:05:19 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:33.175 00:05:19 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:33.175 00:05:19 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:33.175 00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:10:33.175 "tick_rate": 2300000000, 00:10:33.175 "ticks": 5447248980253952, 00:10:33.175 "bdevs": [ 00:10:33.175 { 00:10:33.175 "name": "Malloc_STAT", 00:10:33.175 "bytes_read": 768651776, 00:10:33.175 "num_read_ops": 187652, 00:10:33.175 "bytes_written": 0, 00:10:33.175 "num_write_ops": 0, 00:10:33.175 "bytes_unmapped": 0, 00:10:33.175 "num_unmap_ops": 0, 00:10:33.175 "bytes_copied": 0, 00:10:33.175 "num_copy_ops": 0, 00:10:33.175 "read_latency_ticks": 2453548380250, 00:10:33.175 "max_read_latency_ticks": 17623934, 00:10:33.175 "min_read_latency_ticks": 268500, 00:10:33.175 "write_latency_ticks": 0, 00:10:33.175 "max_write_latency_ticks": 0, 00:10:33.175 "min_write_latency_ticks": 0, 00:10:33.175 "unmap_latency_ticks": 0, 00:10:33.175 "max_unmap_latency_ticks": 0, 00:10:33.175 "min_unmap_latency_ticks": 0, 00:10:33.175 "copy_latency_ticks": 0, 00:10:33.175 "max_copy_latency_ticks": 0, 00:10:33.175 "min_copy_latency_ticks": 0, 00:10:33.175 "io_error": {} 00:10:33.175 } 00:10:33.175 ] 00:10:33.175 }' 00:10:33.175 
00:05:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:10:33.175 00:05:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=187652 00:10:33.175 00:05:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 176640 -lt 170500 ']' 00:10:33.175 00:05:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 176640 -gt 187652 ']' 00:10:33.175 00:05:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:10:33.175 00:05:20 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:33.175 00:05:20 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:33.175 00:10:33.175 Latency(us) 00:10:33.175 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:33.175 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:33.175 Malloc_STAT : 2.17 49657.06 193.97 0.00 0.00 5141.83 2322.25 5584.81 00:10:33.175 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:33.175 Malloc_STAT : 2.17 40248.81 157.22 0.00 0.00 6344.42 1239.49 7693.36 00:10:33.175 =================================================================================================================== 00:10:33.175 Total : 89905.87 351.19 0.00 0.00 5680.70 1239.49 7693.36 00:10:33.175 0 00:10:33.175 00:05:20 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:33.175 00:05:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 3484736 00:10:33.175 00:05:20 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 3484736 ']' 00:10:33.175 00:05:20 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 3484736 00:10:33.175 00:05:20 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:10:33.175 00:05:20 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = 
Linux ']' 00:10:33.175 00:05:20 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3484736 00:10:33.434 00:05:20 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:33.434 00:05:20 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:33.434 00:05:20 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3484736' 00:10:33.434 killing process with pid 3484736 00:10:33.434 00:05:20 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 3484736 00:10:33.434 Received shutdown signal, test time was about 2.245532 seconds 00:10:33.434 00:10:33.434 Latency(us) 00:10:33.434 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:33.434 =================================================================================================================== 00:10:33.434 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:33.434 00:05:20 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 3484736 00:10:33.434 00:05:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:10:33.434 00:10:33.434 real 0m3.660s 00:10:33.434 user 0m7.144s 00:10:33.434 sys 0m0.534s 00:10:33.434 00:05:20 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:33.434 00:05:20 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:33.434 ************************************ 00:10:33.434 END TEST bdev_stat 00:10:33.434 ************************************ 00:10:33.693 00:05:20 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:33.693 00:05:20 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:10:33.693 00:05:20 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:10:33.693 00:05:20 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 
00:10:33.693 00:05:20 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:10:33.693 00:05:20 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:10:33.693 00:05:20 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:10:33.693 00:05:20 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:10:33.693 00:05:20 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:10:33.693 00:05:20 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:10:33.693 00:05:20 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:10:33.693 00:10:33.693 real 1m59.289s 00:10:33.693 user 7m14.561s 00:10:33.693 sys 0m24.551s 00:10:33.693 00:05:20 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:33.693 00:05:20 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:33.693 ************************************ 00:10:33.693 END TEST blockdev_general 00:10:33.693 ************************************ 00:10:33.693 00:05:20 -- common/autotest_common.sh@1142 -- # return 0 00:10:33.693 00:05:20 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:33.693 00:05:20 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:33.693 00:05:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:33.693 00:05:20 -- common/autotest_common.sh@10 -- # set +x 00:10:33.693 ************************************ 00:10:33.693 START TEST bdev_raid 00:10:33.693 ************************************ 00:10:33.693 00:05:20 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:33.694 * Looking for test storage... 
00:10:33.694 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:10:33.694 00:05:20 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:10:33.694 00:05:20 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:10:33.694 00:05:20 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:10:33.694 00:05:20 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:10:33.694 00:05:20 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:10:33.694 00:05:20 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:10:33.694 00:05:20 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:10:33.694 00:05:20 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:10:33.694 00:05:20 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:10:33.694 00:05:20 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:10:33.694 00:05:20 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:10:33.953 00:05:20 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:10:33.953 00:05:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:33.953 00:05:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:33.953 00:05:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:33.953 ************************************ 00:10:33.953 START TEST raid_function_test_raid0 00:10:33.953 ************************************ 00:10:33.953 00:05:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:10:33.953 00:05:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:10:33.953 00:05:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:10:33.953 00:05:20 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:10:33.953 00:05:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=3485344 00:10:33.953 00:05:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 3485344' 00:10:33.953 Process raid pid: 3485344 00:10:33.953 00:05:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:33.953 00:05:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 3485344 /var/tmp/spdk-raid.sock 00:10:33.953 00:05:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 3485344 ']' 00:10:33.953 00:05:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:33.953 00:05:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:33.953 00:05:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:33.953 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:33.953 00:05:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:33.953 00:05:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:33.953 [2024-07-16 00:05:20.743428] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:10:33.953 [2024-07-16 00:05:20.743499] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:33.953 [2024-07-16 00:05:20.871837] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:34.212 [2024-07-16 00:05:20.976430] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:34.212 [2024-07-16 00:05:21.032181] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:34.212 [2024-07-16 00:05:21.032207] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:34.780 00:05:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:34.780 00:05:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:10:34.781 00:05:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:10:34.781 00:05:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:10:34.781 00:05:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:34.781 00:05:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:10:34.781 00:05:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:35.040 [2024-07-16 00:05:21.873147] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:35.040 [2024-07-16 00:05:21.874606] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:35.040 [2024-07-16 00:05:21.874665] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf2bbd0 00:10:35.040 [2024-07-16 00:05:21.874675] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:35.040 [2024-07-16 00:05:21.874866] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf2bb10 00:10:35.040 [2024-07-16 00:05:21.874997] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf2bbd0 00:10:35.040 [2024-07-16 00:05:21.875007] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0xf2bbd0 00:10:35.040 [2024-07-16 00:05:21.875107] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:35.040 Base_1 00:10:35.040 Base_2 00:10:35.040 00:05:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:35.040 00:05:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:35.040 00:05:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:10:35.300 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:10:35.300 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:10:35.300 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:35.300 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:35.300 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:35.300 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:35.300 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:35.300 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 
00:10:35.300 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:10:35.300 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:35.300 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:35.300 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:35.560 [2024-07-16 00:05:22.314321] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10df8e0 00:10:35.560 /dev/nbd0 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:35.560 1+0 records in 00:10:35.560 1+0 records out 
00:10:35.560 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000143338 s, 28.6 MB/s 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:35.560 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:35.819 { 00:10:35.819 "nbd_device": "/dev/nbd0", 00:10:35.819 "bdev_name": "raid" 00:10:35.819 } 00:10:35.819 ]' 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:35.819 { 00:10:35.819 "nbd_device": "/dev/nbd0", 00:10:35.819 "bdev_name": "raid" 00:10:35.819 } 00:10:35.819 ]' 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- 
bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:10:35.819 00:05:22 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:10:35.819 4096+0 records in 00:10:35.819 4096+0 records out 00:10:35.819 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0314998 s, 66.6 MB/s 00:10:35.819 00:05:22 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:36.387 4096+0 records in 00:10:36.387 4096+0 records out 00:10:36.387 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.317166 s, 6.6 MB/s 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:10:36.387 128+0 records in 00:10:36.387 128+0 records out 00:10:36.387 65536 
bytes (66 kB, 64 KiB) copied, 0.000864706 s, 75.8 MB/s 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:36.387 2035+0 records in 00:10:36.387 2035+0 records out 00:10:36.387 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0118385 s, 88.0 MB/s 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:36.387 456+0 records in 00:10:36.387 456+0 records out 00:10:36.387 233472 bytes (233 kB, 228 KiB) copied, 0.00276998 s, 84.3 MB/s 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:36.387 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:36.646 [2024-07-16 00:05:23.427814] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:36.646 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # 
basename /dev/nbd0 00:10:36.646 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:36.646 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:36.646 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:36.646 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:36.646 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:36.646 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:10:36.646 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:10:36.646 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:36.646 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:36.646 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 
00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 3485344 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 3485344 ']' 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 3485344 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3485344 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3485344' 00:10:36.906 killing process with pid 3485344 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 3485344 00:10:36.906 [2024-07-16 00:05:23.813804] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:36.906 [2024-07-16 00:05:23.813870] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:36.906 [2024-07-16 00:05:23.813910] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:36.906 [2024-07-16 00:05:23.813923] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf2bbd0 name 
raid, state offline 00:10:36.906 00:05:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 3485344 00:10:36.906 [2024-07-16 00:05:23.830122] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:37.165 00:05:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:10:37.165 00:10:37.165 real 0m3.352s 00:10:37.165 user 0m4.388s 00:10:37.165 sys 0m1.247s 00:10:37.165 00:05:24 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:37.165 00:05:24 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:37.165 ************************************ 00:10:37.165 END TEST raid_function_test_raid0 00:10:37.165 ************************************ 00:10:37.165 00:05:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:37.165 00:05:24 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:10:37.165 00:05:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:37.165 00:05:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:37.165 00:05:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:37.165 ************************************ 00:10:37.165 START TEST raid_function_test_concat 00:10:37.165 ************************************ 00:10:37.165 00:05:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:10:37.165 00:05:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:10:37.165 00:05:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:10:37.165 00:05:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:10:37.425 00:05:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=3485796 00:10:37.425 00:05:24 bdev_raid.raid_function_test_concat -- 
bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 3485796' 00:10:37.425 Process raid pid: 3485796 00:10:37.425 00:05:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:37.425 00:05:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 3485796 /var/tmp/spdk-raid.sock 00:10:37.425 00:05:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 3485796 ']' 00:10:37.425 00:05:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:37.425 00:05:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:37.425 00:05:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:37.425 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:37.425 00:05:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:37.425 00:05:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:37.425 [2024-07-16 00:05:24.168726] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:10:37.425 [2024-07-16 00:05:24.168780] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:37.425 [2024-07-16 00:05:24.283971] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:37.684 [2024-07-16 00:05:24.387673] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:37.684 [2024-07-16 00:05:24.449717] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:37.684 [2024-07-16 00:05:24.449755] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:38.252 00:05:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:38.252 00:05:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:10:38.252 00:05:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:10:38.252 00:05:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:10:38.252 00:05:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:38.252 00:05:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:10:38.252 00:05:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:38.512 [2024-07-16 00:05:25.375191] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:38.512 [2024-07-16 00:05:25.376654] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:38.512 [2024-07-16 00:05:25.376709] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x138cbd0 00:10:38.512 [2024-07-16 00:05:25.376720] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:38.512 [2024-07-16 00:05:25.376911] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x138cb10 00:10:38.512 [2024-07-16 00:05:25.377038] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x138cbd0 00:10:38.512 [2024-07-16 00:05:25.377049] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x138cbd0 00:10:38.512 [2024-07-16 00:05:25.377151] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:38.512 Base_1 00:10:38.512 Base_2 00:10:38.512 00:05:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:38.512 00:05:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:10:38.512 00:05:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:38.770 00:05:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:10:38.770 00:05:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:10:38.770 00:05:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:38.770 00:05:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:38.770 00:05:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:38.770 00:05:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:38.770 00:05:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:38.770 00:05:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # 
local nbd_list 00:10:38.770 00:05:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:10:38.770 00:05:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:38.770 00:05:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:38.770 00:05:25 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:39.337 [2024-07-16 00:05:26.141249] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15408e0 00:10:39.337 /dev/nbd0 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:39.337 1+0 records in 
00:10:39.337 1+0 records out 00:10:39.337 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000264204 s, 15.5 MB/s 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:39.337 00:05:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:39.595 00:05:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:39.595 { 00:10:39.595 "nbd_device": "/dev/nbd0", 00:10:39.595 "bdev_name": "raid" 00:10:39.595 } 00:10:39.595 ]' 00:10:39.595 00:05:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:39.595 { 00:10:39.595 "nbd_device": "/dev/nbd0", 00:10:39.595 "bdev_name": "raid" 00:10:39.595 } 00:10:39.595 ]' 00:10:39.595 00:05:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:39.596 00:05:26 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # 
unmap_blk_offs=('0' '1028' '321') 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:10:39.596 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:10:39.855 4096+0 records in 00:10:39.855 4096+0 records out 00:10:39.855 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0294887 s, 71.1 MB/s 00:10:39.855 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:40.113 4096+0 records in 00:10:40.113 4096+0 records out 00:10:40.113 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.295336 s, 7.1 MB/s 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:10:40.113 
128+0 records in 00:10:40.113 128+0 records out 00:10:40.113 65536 bytes (66 kB, 64 KiB) copied, 0.000856409 s, 76.5 MB/s 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:40.113 2035+0 records in 00:10:40.113 2035+0 records out 00:10:40.113 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0112837 s, 92.3 MB/s 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # 
unmap_len=233472 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:40.113 456+0 records in 00:10:40.113 456+0 records out 00:10:40.113 233472 bytes (233 kB, 228 KiB) copied, 0.00274609 s, 85.0 MB/s 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:40.113 00:05:26 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:40.371 [2024-07-16 00:05:27.241759] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:10:40.371 00:05:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:40.371 00:05:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:40.371 00:05:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:40.371 00:05:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:40.371 00:05:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:40.371 00:05:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:40.371 00:05:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:10:40.371 00:05:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:10:40.371 00:05:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:40.371 00:05:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:40.371 00:05:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:40.629 00:05:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:40.629 00:05:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:40.629 00:05:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:40.629 00:05:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:40.629 00:05:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:10:40.629 00:05:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:40.629 00:05:27 bdev_raid.raid_function_test_concat -- 
bdev/nbd_common.sh@65 -- # true 00:10:40.629 00:05:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:10:40.629 00:05:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:10:40.629 00:05:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:10:40.629 00:05:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:10:40.629 00:05:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 3485796 00:10:40.629 00:05:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 3485796 ']' 00:10:40.629 00:05:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 3485796 00:10:40.887 00:05:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname 00:10:40.887 00:05:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:40.887 00:05:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3485796 00:10:40.887 00:05:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:40.887 00:05:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:40.887 00:05:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3485796' 00:10:40.887 killing process with pid 3485796 00:10:40.887 00:05:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 3485796 00:10:40.887 [2024-07-16 00:05:27.627230] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:40.887 [2024-07-16 00:05:27.627296] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:40.887 [2024-07-16 00:05:27.627337] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to 
free all in destruct 00:10:40.887 [2024-07-16 00:05:27.627352] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x138cbd0 name raid, state offline 00:10:40.887 00:05:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 3485796 00:10:40.887 [2024-07-16 00:05:27.644257] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:41.145 00:05:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:10:41.145 00:10:41.145 real 0m3.751s 00:10:41.145 user 0m5.091s 00:10:41.145 sys 0m1.328s 00:10:41.145 00:05:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:41.145 00:05:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:41.145 ************************************ 00:10:41.145 END TEST raid_function_test_concat 00:10:41.145 ************************************ 00:10:41.145 00:05:27 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:41.145 00:05:27 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:10:41.145 00:05:27 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:41.145 00:05:27 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:41.145 00:05:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:41.145 ************************************ 00:10:41.145 START TEST raid0_resize_test 00:10:41.145 ************************************ 00:10:41.145 00:05:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test 00:10:41.145 00:05:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:10:41.145 00:05:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:10:41.145 00:05:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:10:41.145 00:05:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local 
blkcnt 00:10:41.145 00:05:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:10:41.145 00:05:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:10:41.145 00:05:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=3486407 00:10:41.145 00:05:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 3486407' 00:10:41.145 Process raid pid: 3486407 00:10:41.145 00:05:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 3486407 /var/tmp/spdk-raid.sock 00:10:41.145 00:05:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 3486407 ']' 00:10:41.145 00:05:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:41.145 00:05:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:41.145 00:05:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:41.145 00:05:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:41.145 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:41.145 00:05:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:41.145 00:05:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:41.145 [2024-07-16 00:05:28.010670] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:10:41.145 [2024-07-16 00:05:28.010732] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:41.403 [2024-07-16 00:05:28.140910] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:41.403 [2024-07-16 00:05:28.242900] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:41.403 [2024-07-16 00:05:28.301705] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:41.403 [2024-07-16 00:05:28.301730] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:42.396 00:05:29 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:42.396 00:05:29 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0 00:10:42.396 00:05:29 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:10:42.965 Base_1 00:10:42.965 00:05:29 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:10:43.224 Base_2 00:10:43.224 00:05:29 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:10:43.792 [2024-07-16 00:05:30.443140] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:43.792 [2024-07-16 00:05:30.444572] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:43.792 [2024-07-16 00:05:30.444627] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22b6780 00:10:43.792 [2024-07-16 00:05:30.444637] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:43.792 [2024-07-16 00:05:30.444848] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e02020 00:10:43.792 [2024-07-16 00:05:30.444949] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22b6780 00:10:43.792 [2024-07-16 00:05:30.444959] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x22b6780 00:10:43.792 [2024-07-16 00:05:30.445076] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:43.792 00:05:30 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:10:44.051 [2024-07-16 00:05:30.956481] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:44.051 [2024-07-16 00:05:30.956506] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:10:44.051 true 00:10:44.051 00:05:30 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:44.051 00:05:30 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:10:44.309 [2024-07-16 00:05:31.213317] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:44.309 00:05:31 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:10:44.309 00:05:31 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:10:44.309 00:05:31 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:10:44.309 00:05:31 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:10:44.877 
[2024-07-16 00:05:31.710440] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:44.877 [2024-07-16 00:05:31.710460] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:10:44.877 [2024-07-16 00:05:31.710484] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:10:44.877 true 00:10:44.877 00:05:31 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:44.877 00:05:31 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:10:45.136 [2024-07-16 00:05:31.967268] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:45.136 00:05:31 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:10:45.136 00:05:31 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:10:45.136 00:05:31 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:10:45.136 00:05:31 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 3486407 00:10:45.136 00:05:31 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 3486407 ']' 00:10:45.136 00:05:31 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 3486407 00:10:45.136 00:05:31 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname 00:10:45.136 00:05:31 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:45.136 00:05:31 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3486407 00:10:45.136 00:05:32 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:45.136 00:05:32 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:10:45.136 00:05:32 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3486407' 00:10:45.136 killing process with pid 3486407 00:10:45.136 00:05:32 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 3486407 00:10:45.136 [2024-07-16 00:05:32.038471] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:45.136 [2024-07-16 00:05:32.038524] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:45.136 00:05:32 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 3486407 00:10:45.136 [2024-07-16 00:05:32.038564] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:45.136 [2024-07-16 00:05:32.038575] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22b6780 name Raid, state offline 00:10:45.136 [2024-07-16 00:05:32.039932] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:45.395 00:05:32 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 00:10:45.395 00:10:45.395 real 0m4.294s 00:10:45.395 user 0m7.034s 00:10:45.395 sys 0m0.812s 00:10:45.395 00:05:32 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:45.395 00:05:32 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:45.395 ************************************ 00:10:45.395 END TEST raid0_resize_test 00:10:45.395 ************************************ 00:10:45.395 00:05:32 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:45.395 00:05:32 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:10:45.395 00:05:32 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:10:45.395 00:05:32 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:10:45.395 00:05:32 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 
-le 1 ']' 00:10:45.395 00:05:32 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:45.395 00:05:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:45.395 ************************************ 00:10:45.395 START TEST raid_state_function_test 00:10:45.395 ************************************ 00:10:45.395 00:05:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false 00:10:45.395 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:10:45.395 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:45.395 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:45.395 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:45.395 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:45.395 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:45.395 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:45.395 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:45.395 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:45.396 00:05:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=3486973 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3486973' 00:10:45.396 Process raid pid: 3486973 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 3486973 /var/tmp/spdk-raid.sock 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 3486973 ']' 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:45.396 00:05:32 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:45.396 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:45.396 00:05:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:45.655 [2024-07-16 00:05:32.394360] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:10:45.655 [2024-07-16 00:05:32.394428] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:45.655 [2024-07-16 00:05:32.526317] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:45.914 [2024-07-16 00:05:32.629729] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:45.914 [2024-07-16 00:05:32.689201] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:45.914 [2024-07-16 00:05:32.689237] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:46.858 00:05:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:46.858 00:05:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:10:46.858 00:05:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:47.426 [2024-07-16 00:05:34.070858] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:47.426 [2024-07-16 00:05:34.070900] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:47.426 [2024-07-16 00:05:34.070911] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:47.426 [2024-07-16 00:05:34.070923] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:47.426 00:05:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:47.426 00:05:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:47.426 00:05:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:47.426 00:05:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:47.426 00:05:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:47.426 00:05:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:47.426 00:05:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:47.426 00:05:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:47.426 00:05:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:47.426 00:05:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:47.426 00:05:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:47.426 00:05:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:47.685 00:05:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:47.685 "name": "Existed_Raid", 00:10:47.685 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:10:47.685 "strip_size_kb": 64, 00:10:47.685 "state": "configuring", 00:10:47.685 "raid_level": "raid0", 00:10:47.685 "superblock": false, 00:10:47.685 "num_base_bdevs": 2, 00:10:47.685 "num_base_bdevs_discovered": 0, 00:10:47.685 "num_base_bdevs_operational": 2, 00:10:47.685 "base_bdevs_list": [ 00:10:47.685 { 00:10:47.685 "name": "BaseBdev1", 00:10:47.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:47.685 "is_configured": false, 00:10:47.685 "data_offset": 0, 00:10:47.685 "data_size": 0 00:10:47.685 }, 00:10:47.685 { 00:10:47.685 "name": "BaseBdev2", 00:10:47.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:47.685 "is_configured": false, 00:10:47.685 "data_offset": 0, 00:10:47.685 "data_size": 0 00:10:47.685 } 00:10:47.685 ] 00:10:47.685 }' 00:10:47.685 00:05:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:47.685 00:05:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:48.622 00:05:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:48.881 [2024-07-16 00:05:35.707049] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:48.881 [2024-07-16 00:05:35.707082] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7b4a80 name Existed_Raid, state configuring 00:10:48.881 00:05:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:49.449 [2024-07-16 00:05:36.208378] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:49.449 [2024-07-16 00:05:36.208419] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't 
exist now 00:10:49.449 [2024-07-16 00:05:36.208429] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:49.449 [2024-07-16 00:05:36.208441] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:49.449 00:05:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:49.709 [2024-07-16 00:05:36.479052] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:49.709 BaseBdev1 00:10:49.709 00:05:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:49.709 00:05:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:49.709 00:05:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:49.709 00:05:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:49.709 00:05:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:49.709 00:05:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:49.709 00:05:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:50.277 00:05:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:50.846 [ 00:10:50.846 { 00:10:50.846 "name": "BaseBdev1", 00:10:50.846 "aliases": [ 00:10:50.846 "b9851e04-6d54-4c60-8437-090ee9e15d96" 00:10:50.846 ], 00:10:50.846 "product_name": "Malloc disk", 00:10:50.846 "block_size": 512, 00:10:50.846 "num_blocks": 65536, 
00:10:50.846 "uuid": "b9851e04-6d54-4c60-8437-090ee9e15d96", 00:10:50.846 "assigned_rate_limits": { 00:10:50.846 "rw_ios_per_sec": 0, 00:10:50.846 "rw_mbytes_per_sec": 0, 00:10:50.846 "r_mbytes_per_sec": 0, 00:10:50.846 "w_mbytes_per_sec": 0 00:10:50.846 }, 00:10:50.846 "claimed": true, 00:10:50.846 "claim_type": "exclusive_write", 00:10:50.846 "zoned": false, 00:10:50.846 "supported_io_types": { 00:10:50.846 "read": true, 00:10:50.846 "write": true, 00:10:50.846 "unmap": true, 00:10:50.846 "flush": true, 00:10:50.846 "reset": true, 00:10:50.846 "nvme_admin": false, 00:10:50.846 "nvme_io": false, 00:10:50.846 "nvme_io_md": false, 00:10:50.846 "write_zeroes": true, 00:10:50.846 "zcopy": true, 00:10:50.846 "get_zone_info": false, 00:10:50.846 "zone_management": false, 00:10:50.846 "zone_append": false, 00:10:50.846 "compare": false, 00:10:50.846 "compare_and_write": false, 00:10:50.846 "abort": true, 00:10:50.846 "seek_hole": false, 00:10:50.846 "seek_data": false, 00:10:50.846 "copy": true, 00:10:50.846 "nvme_iov_md": false 00:10:50.846 }, 00:10:50.846 "memory_domains": [ 00:10:50.846 { 00:10:50.846 "dma_device_id": "system", 00:10:50.846 "dma_device_type": 1 00:10:50.846 }, 00:10:50.846 { 00:10:50.846 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:50.846 "dma_device_type": 2 00:10:50.846 } 00:10:50.846 ], 00:10:50.846 "driver_specific": {} 00:10:50.846 } 00:10:50.846 ] 00:10:50.846 00:05:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:50.846 00:05:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:50.846 00:05:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:50.846 00:05:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:50.846 00:05:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:10:50.846 00:05:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:50.846 00:05:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:50.846 00:05:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:50.846 00:05:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:50.846 00:05:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:50.846 00:05:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:50.846 00:05:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:50.847 00:05:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:50.847 00:05:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:50.847 "name": "Existed_Raid", 00:10:50.847 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:50.847 "strip_size_kb": 64, 00:10:50.847 "state": "configuring", 00:10:50.847 "raid_level": "raid0", 00:10:50.847 "superblock": false, 00:10:50.847 "num_base_bdevs": 2, 00:10:50.847 "num_base_bdevs_discovered": 1, 00:10:50.847 "num_base_bdevs_operational": 2, 00:10:50.847 "base_bdevs_list": [ 00:10:50.847 { 00:10:50.847 "name": "BaseBdev1", 00:10:50.847 "uuid": "b9851e04-6d54-4c60-8437-090ee9e15d96", 00:10:50.847 "is_configured": true, 00:10:50.847 "data_offset": 0, 00:10:50.847 "data_size": 65536 00:10:50.847 }, 00:10:50.847 { 00:10:50.847 "name": "BaseBdev2", 00:10:50.847 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:50.847 "is_configured": false, 00:10:50.847 "data_offset": 0, 00:10:50.847 "data_size": 0 00:10:50.847 } 00:10:50.847 ] 00:10:50.847 }' 00:10:50.847 00:05:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:50.847 00:05:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:51.784 00:05:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:51.784 [2024-07-16 00:05:38.596683] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:51.784 [2024-07-16 00:05:38.596721] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7b4350 name Existed_Raid, state configuring 00:10:51.784 00:05:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:52.043 [2024-07-16 00:05:38.841357] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:52.043 [2024-07-16 00:05:38.842833] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:52.043 [2024-07-16 00:05:38.842865] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:52.043 00:05:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:52.043 00:05:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:52.043 00:05:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:52.043 00:05:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:52.043 00:05:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:52.043 00:05:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:10:52.043 00:05:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:52.043 00:05:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:52.043 00:05:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:52.043 00:05:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:52.043 00:05:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:52.043 00:05:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:52.043 00:05:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:52.043 00:05:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:52.303 00:05:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:52.303 "name": "Existed_Raid", 00:10:52.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:52.303 "strip_size_kb": 64, 00:10:52.303 "state": "configuring", 00:10:52.303 "raid_level": "raid0", 00:10:52.303 "superblock": false, 00:10:52.303 "num_base_bdevs": 2, 00:10:52.303 "num_base_bdevs_discovered": 1, 00:10:52.303 "num_base_bdevs_operational": 2, 00:10:52.303 "base_bdevs_list": [ 00:10:52.303 { 00:10:52.303 "name": "BaseBdev1", 00:10:52.303 "uuid": "b9851e04-6d54-4c60-8437-090ee9e15d96", 00:10:52.303 "is_configured": true, 00:10:52.303 "data_offset": 0, 00:10:52.303 "data_size": 65536 00:10:52.303 }, 00:10:52.303 { 00:10:52.303 "name": "BaseBdev2", 00:10:52.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:52.303 "is_configured": false, 00:10:52.303 "data_offset": 0, 00:10:52.303 "data_size": 0 00:10:52.303 } 00:10:52.303 ] 00:10:52.303 }' 
00:10:52.303 00:05:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:52.303 00:05:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:52.873 00:05:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:53.132 [2024-07-16 00:05:39.943694] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:53.132 [2024-07-16 00:05:39.943731] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x7b5000 00:10:53.132 [2024-07-16 00:05:39.943739] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:53.132 [2024-07-16 00:05:39.943923] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x6cf0c0 00:10:53.132 [2024-07-16 00:05:39.944053] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x7b5000 00:10:53.132 [2024-07-16 00:05:39.944063] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x7b5000 00:10:53.132 [2024-07-16 00:05:39.944226] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:53.132 BaseBdev2 00:10:53.132 00:05:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:53.132 00:05:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:53.132 00:05:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:53.132 00:05:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:53.132 00:05:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:53.132 00:05:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:10:53.132 00:05:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:53.391 00:05:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:53.650 [ 00:10:53.650 { 00:10:53.650 "name": "BaseBdev2", 00:10:53.650 "aliases": [ 00:10:53.650 "e0322339-5e75-4d7f-92ac-2eaefc46da67" 00:10:53.650 ], 00:10:53.650 "product_name": "Malloc disk", 00:10:53.650 "block_size": 512, 00:10:53.650 "num_blocks": 65536, 00:10:53.650 "uuid": "e0322339-5e75-4d7f-92ac-2eaefc46da67", 00:10:53.650 "assigned_rate_limits": { 00:10:53.650 "rw_ios_per_sec": 0, 00:10:53.650 "rw_mbytes_per_sec": 0, 00:10:53.650 "r_mbytes_per_sec": 0, 00:10:53.650 "w_mbytes_per_sec": 0 00:10:53.650 }, 00:10:53.650 "claimed": true, 00:10:53.650 "claim_type": "exclusive_write", 00:10:53.650 "zoned": false, 00:10:53.650 "supported_io_types": { 00:10:53.650 "read": true, 00:10:53.650 "write": true, 00:10:53.650 "unmap": true, 00:10:53.650 "flush": true, 00:10:53.650 "reset": true, 00:10:53.650 "nvme_admin": false, 00:10:53.650 "nvme_io": false, 00:10:53.650 "nvme_io_md": false, 00:10:53.650 "write_zeroes": true, 00:10:53.650 "zcopy": true, 00:10:53.650 "get_zone_info": false, 00:10:53.650 "zone_management": false, 00:10:53.650 "zone_append": false, 00:10:53.650 "compare": false, 00:10:53.650 "compare_and_write": false, 00:10:53.650 "abort": true, 00:10:53.650 "seek_hole": false, 00:10:53.650 "seek_data": false, 00:10:53.650 "copy": true, 00:10:53.650 "nvme_iov_md": false 00:10:53.650 }, 00:10:53.650 "memory_domains": [ 00:10:53.650 { 00:10:53.650 "dma_device_id": "system", 00:10:53.650 "dma_device_type": 1 00:10:53.650 }, 00:10:53.650 { 00:10:53.650 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:53.650 "dma_device_type": 2 
00:10:53.650 } 00:10:53.650 ], 00:10:53.650 "driver_specific": {} 00:10:53.650 } 00:10:53.650 ] 00:10:53.650 00:05:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:53.650 00:05:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:53.650 00:05:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:53.650 00:05:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:10:53.650 00:05:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:53.650 00:05:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:53.650 00:05:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:53.650 00:05:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:53.650 00:05:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:53.650 00:05:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:53.650 00:05:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:53.650 00:05:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:53.650 00:05:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:53.650 00:05:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:53.650 00:05:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:53.909 00:05:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:10:53.909 "name": "Existed_Raid", 00:10:53.909 "uuid": "d5686727-07de-496f-9e6e-43ffd934a111", 00:10:53.909 "strip_size_kb": 64, 00:10:53.909 "state": "online", 00:10:53.909 "raid_level": "raid0", 00:10:53.909 "superblock": false, 00:10:53.909 "num_base_bdevs": 2, 00:10:53.909 "num_base_bdevs_discovered": 2, 00:10:53.909 "num_base_bdevs_operational": 2, 00:10:53.909 "base_bdevs_list": [ 00:10:53.909 { 00:10:53.909 "name": "BaseBdev1", 00:10:53.909 "uuid": "b9851e04-6d54-4c60-8437-090ee9e15d96", 00:10:53.909 "is_configured": true, 00:10:53.909 "data_offset": 0, 00:10:53.909 "data_size": 65536 00:10:53.909 }, 00:10:53.909 { 00:10:53.909 "name": "BaseBdev2", 00:10:53.909 "uuid": "e0322339-5e75-4d7f-92ac-2eaefc46da67", 00:10:53.909 "is_configured": true, 00:10:53.909 "data_offset": 0, 00:10:53.909 "data_size": 65536 00:10:53.909 } 00:10:53.909 ] 00:10:53.909 }' 00:10:53.909 00:05:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:53.909 00:05:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:54.480 00:05:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:54.480 00:05:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:54.480 00:05:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:54.480 00:05:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:54.480 00:05:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:54.480 00:05:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:54.480 00:05:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:54.480 00:05:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:54.740 [2024-07-16 00:05:41.528174] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:54.740 00:05:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:54.740 "name": "Existed_Raid", 00:10:54.740 "aliases": [ 00:10:54.740 "d5686727-07de-496f-9e6e-43ffd934a111" 00:10:54.740 ], 00:10:54.740 "product_name": "Raid Volume", 00:10:54.740 "block_size": 512, 00:10:54.740 "num_blocks": 131072, 00:10:54.740 "uuid": "d5686727-07de-496f-9e6e-43ffd934a111", 00:10:54.740 "assigned_rate_limits": { 00:10:54.740 "rw_ios_per_sec": 0, 00:10:54.740 "rw_mbytes_per_sec": 0, 00:10:54.740 "r_mbytes_per_sec": 0, 00:10:54.740 "w_mbytes_per_sec": 0 00:10:54.740 }, 00:10:54.740 "claimed": false, 00:10:54.740 "zoned": false, 00:10:54.740 "supported_io_types": { 00:10:54.740 "read": true, 00:10:54.740 "write": true, 00:10:54.740 "unmap": true, 00:10:54.740 "flush": true, 00:10:54.740 "reset": true, 00:10:54.740 "nvme_admin": false, 00:10:54.740 "nvme_io": false, 00:10:54.740 "nvme_io_md": false, 00:10:54.740 "write_zeroes": true, 00:10:54.740 "zcopy": false, 00:10:54.740 "get_zone_info": false, 00:10:54.740 "zone_management": false, 00:10:54.740 "zone_append": false, 00:10:54.740 "compare": false, 00:10:54.740 "compare_and_write": false, 00:10:54.740 "abort": false, 00:10:54.740 "seek_hole": false, 00:10:54.740 "seek_data": false, 00:10:54.740 "copy": false, 00:10:54.740 "nvme_iov_md": false 00:10:54.740 }, 00:10:54.740 "memory_domains": [ 00:10:54.740 { 00:10:54.740 "dma_device_id": "system", 00:10:54.740 "dma_device_type": 1 00:10:54.740 }, 00:10:54.740 { 00:10:54.740 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:54.740 "dma_device_type": 2 00:10:54.740 }, 00:10:54.740 { 00:10:54.740 "dma_device_id": "system", 00:10:54.740 "dma_device_type": 1 00:10:54.740 }, 00:10:54.740 { 00:10:54.740 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:10:54.740 "dma_device_type": 2 00:10:54.740 } 00:10:54.740 ], 00:10:54.740 "driver_specific": { 00:10:54.740 "raid": { 00:10:54.740 "uuid": "d5686727-07de-496f-9e6e-43ffd934a111", 00:10:54.740 "strip_size_kb": 64, 00:10:54.740 "state": "online", 00:10:54.740 "raid_level": "raid0", 00:10:54.740 "superblock": false, 00:10:54.740 "num_base_bdevs": 2, 00:10:54.740 "num_base_bdevs_discovered": 2, 00:10:54.740 "num_base_bdevs_operational": 2, 00:10:54.740 "base_bdevs_list": [ 00:10:54.740 { 00:10:54.740 "name": "BaseBdev1", 00:10:54.740 "uuid": "b9851e04-6d54-4c60-8437-090ee9e15d96", 00:10:54.740 "is_configured": true, 00:10:54.740 "data_offset": 0, 00:10:54.740 "data_size": 65536 00:10:54.740 }, 00:10:54.740 { 00:10:54.740 "name": "BaseBdev2", 00:10:54.740 "uuid": "e0322339-5e75-4d7f-92ac-2eaefc46da67", 00:10:54.740 "is_configured": true, 00:10:54.740 "data_offset": 0, 00:10:54.740 "data_size": 65536 00:10:54.740 } 00:10:54.740 ] 00:10:54.740 } 00:10:54.740 } 00:10:54.740 }' 00:10:54.740 00:05:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:54.740 00:05:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:54.740 BaseBdev2' 00:10:54.740 00:05:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:54.740 00:05:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:54.740 00:05:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:55.000 00:05:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:55.000 "name": "BaseBdev1", 00:10:55.000 "aliases": [ 00:10:55.000 "b9851e04-6d54-4c60-8437-090ee9e15d96" 00:10:55.000 ], 00:10:55.000 "product_name": "Malloc disk", 
00:10:55.000 "block_size": 512, 00:10:55.000 "num_blocks": 65536, 00:10:55.000 "uuid": "b9851e04-6d54-4c60-8437-090ee9e15d96", 00:10:55.000 "assigned_rate_limits": { 00:10:55.000 "rw_ios_per_sec": 0, 00:10:55.000 "rw_mbytes_per_sec": 0, 00:10:55.000 "r_mbytes_per_sec": 0, 00:10:55.000 "w_mbytes_per_sec": 0 00:10:55.000 }, 00:10:55.000 "claimed": true, 00:10:55.000 "claim_type": "exclusive_write", 00:10:55.000 "zoned": false, 00:10:55.000 "supported_io_types": { 00:10:55.000 "read": true, 00:10:55.000 "write": true, 00:10:55.000 "unmap": true, 00:10:55.000 "flush": true, 00:10:55.000 "reset": true, 00:10:55.000 "nvme_admin": false, 00:10:55.000 "nvme_io": false, 00:10:55.000 "nvme_io_md": false, 00:10:55.000 "write_zeroes": true, 00:10:55.000 "zcopy": true, 00:10:55.000 "get_zone_info": false, 00:10:55.000 "zone_management": false, 00:10:55.000 "zone_append": false, 00:10:55.000 "compare": false, 00:10:55.000 "compare_and_write": false, 00:10:55.000 "abort": true, 00:10:55.000 "seek_hole": false, 00:10:55.000 "seek_data": false, 00:10:55.000 "copy": true, 00:10:55.000 "nvme_iov_md": false 00:10:55.000 }, 00:10:55.000 "memory_domains": [ 00:10:55.000 { 00:10:55.000 "dma_device_id": "system", 00:10:55.000 "dma_device_type": 1 00:10:55.000 }, 00:10:55.000 { 00:10:55.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:55.000 "dma_device_type": 2 00:10:55.000 } 00:10:55.000 ], 00:10:55.000 "driver_specific": {} 00:10:55.000 }' 00:10:55.000 00:05:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:55.000 00:05:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:55.000 00:05:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:55.000 00:05:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:55.258 00:05:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:55.258 00:05:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:55.258 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:55.258 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:55.258 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:55.258 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:55.258 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:55.258 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:55.258 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:55.258 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:55.258 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:55.516 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:55.516 "name": "BaseBdev2", 00:10:55.516 "aliases": [ 00:10:55.516 "e0322339-5e75-4d7f-92ac-2eaefc46da67" 00:10:55.516 ], 00:10:55.516 "product_name": "Malloc disk", 00:10:55.516 "block_size": 512, 00:10:55.516 "num_blocks": 65536, 00:10:55.516 "uuid": "e0322339-5e75-4d7f-92ac-2eaefc46da67", 00:10:55.516 "assigned_rate_limits": { 00:10:55.516 "rw_ios_per_sec": 0, 00:10:55.516 "rw_mbytes_per_sec": 0, 00:10:55.516 "r_mbytes_per_sec": 0, 00:10:55.516 "w_mbytes_per_sec": 0 00:10:55.516 }, 00:10:55.516 "claimed": true, 00:10:55.516 "claim_type": "exclusive_write", 00:10:55.516 "zoned": false, 00:10:55.516 "supported_io_types": { 00:10:55.516 "read": true, 00:10:55.516 "write": true, 00:10:55.516 "unmap": true, 00:10:55.516 "flush": true, 00:10:55.516 "reset": 
true, 00:10:55.516 "nvme_admin": false, 00:10:55.516 "nvme_io": false, 00:10:55.516 "nvme_io_md": false, 00:10:55.516 "write_zeroes": true, 00:10:55.516 "zcopy": true, 00:10:55.516 "get_zone_info": false, 00:10:55.516 "zone_management": false, 00:10:55.516 "zone_append": false, 00:10:55.516 "compare": false, 00:10:55.516 "compare_and_write": false, 00:10:55.516 "abort": true, 00:10:55.516 "seek_hole": false, 00:10:55.516 "seek_data": false, 00:10:55.516 "copy": true, 00:10:55.516 "nvme_iov_md": false 00:10:55.516 }, 00:10:55.516 "memory_domains": [ 00:10:55.516 { 00:10:55.516 "dma_device_id": "system", 00:10:55.516 "dma_device_type": 1 00:10:55.516 }, 00:10:55.516 { 00:10:55.516 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:55.516 "dma_device_type": 2 00:10:55.516 } 00:10:55.516 ], 00:10:55.516 "driver_specific": {} 00:10:55.516 }' 00:10:55.516 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:55.775 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:55.775 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:55.775 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:55.775 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:55.775 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:55.775 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:55.775 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:55.775 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:55.775 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:56.034 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:56.034 00:05:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:56.034 00:05:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:56.293 [2024-07-16 00:05:43.031958] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:56.293 [2024-07-16 00:05:43.031994] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:56.293 [2024-07-16 00:05:43.032034] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:56.293 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:56.293 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:10:56.293 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:56.293 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:56.293 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:56.293 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:10:56.293 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:56.293 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:56.293 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:56.293 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:56.293 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:56.293 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:10:56.293 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:56.293 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:56.293 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:56.293 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:56.293 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:56.561 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:56.561 "name": "Existed_Raid", 00:10:56.561 "uuid": "d5686727-07de-496f-9e6e-43ffd934a111", 00:10:56.561 "strip_size_kb": 64, 00:10:56.561 "state": "offline", 00:10:56.561 "raid_level": "raid0", 00:10:56.561 "superblock": false, 00:10:56.561 "num_base_bdevs": 2, 00:10:56.561 "num_base_bdevs_discovered": 1, 00:10:56.561 "num_base_bdevs_operational": 1, 00:10:56.561 "base_bdevs_list": [ 00:10:56.561 { 00:10:56.561 "name": null, 00:10:56.561 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:56.561 "is_configured": false, 00:10:56.561 "data_offset": 0, 00:10:56.561 "data_size": 65536 00:10:56.561 }, 00:10:56.561 { 00:10:56.561 "name": "BaseBdev2", 00:10:56.561 "uuid": "e0322339-5e75-4d7f-92ac-2eaefc46da67", 00:10:56.561 "is_configured": true, 00:10:56.561 "data_offset": 0, 00:10:56.561 "data_size": 65536 00:10:56.561 } 00:10:56.561 ] 00:10:56.561 }' 00:10:56.561 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:56.561 00:05:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:57.151 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:57.151 00:05:43 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:57.151 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:57.151 00:05:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:57.409 00:05:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:57.409 00:05:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:57.409 00:05:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:57.668 [2024-07-16 00:05:44.373419] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:57.668 [2024-07-16 00:05:44.373467] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7b5000 name Existed_Raid, state offline 00:10:57.668 00:05:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:57.668 00:05:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:57.668 00:05:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:57.668 00:05:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:57.926 00:05:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:57.926 00:05:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:57.926 00:05:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:57.926 00:05:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 3486973 
00:10:57.926 00:05:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 3486973 ']' 00:10:57.926 00:05:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 3486973 00:10:57.926 00:05:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:10:57.926 00:05:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:57.926 00:05:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3486973 00:10:57.926 00:05:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:57.926 00:05:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:57.926 00:05:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3486973' 00:10:57.926 killing process with pid 3486973 00:10:57.926 00:05:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 3486973 00:10:57.926 [2024-07-16 00:05:44.695328] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:57.926 00:05:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 3486973 00:10:57.926 [2024-07-16 00:05:44.696326] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:58.183 00:05:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:10:58.183 00:10:58.183 real 0m12.591s 00:10:58.183 user 0m22.553s 00:10:58.183 sys 0m2.223s 00:10:58.183 00:05:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:58.183 00:05:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:58.183 ************************************ 00:10:58.183 END TEST raid_state_function_test 00:10:58.183 ************************************ 00:10:58.183 00:05:44 bdev_raid 
-- common/autotest_common.sh@1142 -- # return 0 00:10:58.183 00:05:44 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:10:58.183 00:05:44 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:58.183 00:05:44 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:58.183 00:05:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:58.183 ************************************ 00:10:58.183 START TEST raid_state_function_test_sb 00:10:58.183 ************************************ 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- 
# (( i <= num_base_bdevs )) 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=3488942 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3488942' 00:10:58.184 Process raid pid: 3488942 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 3488942 /var/tmp/spdk-raid.sock 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@829 -- # '[' -z 3488942 ']' 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:58.184 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:58.184 00:05:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:58.184 [2024-07-16 00:05:45.076661] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:10:58.184 [2024-07-16 00:05:45.076729] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:58.441 [2024-07-16 00:05:45.208089] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:58.441 [2024-07-16 00:05:45.312777] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:58.441 [2024-07-16 00:05:45.373528] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:58.441 [2024-07-16 00:05:45.373560] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:58.700 00:05:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:58.700 00:05:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:10:58.700 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:58.959 [2024-07-16 00:05:45.756073] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:58.959 [2024-07-16 00:05:45.756112] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:58.959 [2024-07-16 00:05:45.756123] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:58.959 [2024-07-16 00:05:45.756136] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:58.959 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:58.959 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:58.959 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:58.959 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:58.959 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:58.959 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:58.959 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:58.959 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:58.959 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:58.959 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:58.959 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:58.959 00:05:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:59.217 00:05:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:59.217 "name": "Existed_Raid", 00:10:59.217 "uuid": "0372302e-fc4b-461b-81ac-4e640aefdcd2", 00:10:59.217 "strip_size_kb": 64, 00:10:59.217 "state": "configuring", 00:10:59.217 "raid_level": "raid0", 00:10:59.217 "superblock": true, 00:10:59.217 "num_base_bdevs": 2, 00:10:59.217 "num_base_bdevs_discovered": 0, 00:10:59.217 "num_base_bdevs_operational": 2, 00:10:59.217 "base_bdevs_list": [ 00:10:59.217 { 00:10:59.217 "name": "BaseBdev1", 00:10:59.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:59.217 "is_configured": false, 00:10:59.217 "data_offset": 0, 00:10:59.217 "data_size": 0 00:10:59.217 }, 00:10:59.217 { 00:10:59.217 "name": "BaseBdev2", 00:10:59.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:59.217 "is_configured": false, 00:10:59.217 "data_offset": 0, 00:10:59.217 "data_size": 0 00:10:59.217 } 00:10:59.217 ] 00:10:59.217 }' 00:10:59.217 00:05:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:59.217 00:05:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:59.783 00:05:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:00.041 [2024-07-16 00:05:46.766597] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:00.041 [2024-07-16 00:05:46.766623] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe90a80 name Existed_Raid, state configuring 00:11:00.041 00:05:46 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:00.041 [2024-07-16 00:05:46.935072] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:00.041 [2024-07-16 00:05:46.935096] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:00.041 [2024-07-16 00:05:46.935105] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:00.041 [2024-07-16 00:05:46.935116] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:00.041 00:05:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:00.298 [2024-07-16 00:05:47.113277] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:00.298 BaseBdev1 00:11:00.298 00:05:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:00.298 00:05:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:00.298 00:05:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:00.298 00:05:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:00.298 00:05:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:00.298 00:05:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:00.298 00:05:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:00.556 
00:05:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:00.556 [ 00:11:00.556 { 00:11:00.556 "name": "BaseBdev1", 00:11:00.556 "aliases": [ 00:11:00.556 "9b14e4e4-10ba-425a-8a09-1d55d8f966d4" 00:11:00.556 ], 00:11:00.556 "product_name": "Malloc disk", 00:11:00.556 "block_size": 512, 00:11:00.556 "num_blocks": 65536, 00:11:00.556 "uuid": "9b14e4e4-10ba-425a-8a09-1d55d8f966d4", 00:11:00.556 "assigned_rate_limits": { 00:11:00.556 "rw_ios_per_sec": 0, 00:11:00.556 "rw_mbytes_per_sec": 0, 00:11:00.556 "r_mbytes_per_sec": 0, 00:11:00.556 "w_mbytes_per_sec": 0 00:11:00.556 }, 00:11:00.556 "claimed": true, 00:11:00.556 "claim_type": "exclusive_write", 00:11:00.556 "zoned": false, 00:11:00.556 "supported_io_types": { 00:11:00.556 "read": true, 00:11:00.556 "write": true, 00:11:00.556 "unmap": true, 00:11:00.556 "flush": true, 00:11:00.556 "reset": true, 00:11:00.556 "nvme_admin": false, 00:11:00.556 "nvme_io": false, 00:11:00.556 "nvme_io_md": false, 00:11:00.556 "write_zeroes": true, 00:11:00.556 "zcopy": true, 00:11:00.556 "get_zone_info": false, 00:11:00.556 "zone_management": false, 00:11:00.556 "zone_append": false, 00:11:00.556 "compare": false, 00:11:00.556 "compare_and_write": false, 00:11:00.556 "abort": true, 00:11:00.556 "seek_hole": false, 00:11:00.556 "seek_data": false, 00:11:00.556 "copy": true, 00:11:00.556 "nvme_iov_md": false 00:11:00.556 }, 00:11:00.556 "memory_domains": [ 00:11:00.556 { 00:11:00.556 "dma_device_id": "system", 00:11:00.556 "dma_device_type": 1 00:11:00.556 }, 00:11:00.556 { 00:11:00.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:00.556 "dma_device_type": 2 00:11:00.556 } 00:11:00.556 ], 00:11:00.556 "driver_specific": {} 00:11:00.556 } 00:11:00.556 ] 00:11:00.556 00:05:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:00.556 
00:05:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:00.556 00:05:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:00.556 00:05:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:00.556 00:05:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:00.556 00:05:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:00.556 00:05:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:00.556 00:05:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:00.556 00:05:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:00.556 00:05:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:00.556 00:05:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:00.556 00:05:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:00.556 00:05:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:00.814 00:05:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:00.814 "name": "Existed_Raid", 00:11:00.814 "uuid": "5307f88e-3ab2-4708-abe0-1bfb5283a45f", 00:11:00.814 "strip_size_kb": 64, 00:11:00.814 "state": "configuring", 00:11:00.814 "raid_level": "raid0", 00:11:00.814 "superblock": true, 00:11:00.814 "num_base_bdevs": 2, 00:11:00.814 "num_base_bdevs_discovered": 1, 00:11:00.814 "num_base_bdevs_operational": 2, 00:11:00.814 
"base_bdevs_list": [ 00:11:00.814 { 00:11:00.814 "name": "BaseBdev1", 00:11:00.814 "uuid": "9b14e4e4-10ba-425a-8a09-1d55d8f966d4", 00:11:00.814 "is_configured": true, 00:11:00.814 "data_offset": 2048, 00:11:00.814 "data_size": 63488 00:11:00.814 }, 00:11:00.814 { 00:11:00.814 "name": "BaseBdev2", 00:11:00.814 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:00.814 "is_configured": false, 00:11:00.814 "data_offset": 0, 00:11:00.814 "data_size": 0 00:11:00.814 } 00:11:00.814 ] 00:11:00.814 }' 00:11:00.814 00:05:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:00.814 00:05:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:01.380 00:05:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:01.638 [2024-07-16 00:05:48.412735] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:01.638 [2024-07-16 00:05:48.412771] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe90350 name Existed_Raid, state configuring 00:11:01.638 00:05:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:01.638 [2024-07-16 00:05:48.589240] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:01.896 [2024-07-16 00:05:48.590793] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:01.896 [2024-07-16 00:05:48.590825] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:01.896 00:05:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:01.896 00:05:48 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:01.896 00:05:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:01.896 00:05:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:01.896 00:05:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:01.896 00:05:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:01.896 00:05:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:01.896 00:05:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:01.896 00:05:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:01.896 00:05:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:01.896 00:05:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:01.896 00:05:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:01.896 00:05:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:01.896 00:05:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:01.896 00:05:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:01.896 "name": "Existed_Raid", 00:11:01.896 "uuid": "8f15dba0-15b9-4359-adab-b91046e8dba7", 00:11:01.897 "strip_size_kb": 64, 00:11:01.897 "state": "configuring", 00:11:01.897 "raid_level": "raid0", 00:11:01.897 "superblock": true, 00:11:01.897 "num_base_bdevs": 2, 00:11:01.897 
"num_base_bdevs_discovered": 1, 00:11:01.897 "num_base_bdevs_operational": 2, 00:11:01.897 "base_bdevs_list": [ 00:11:01.897 { 00:11:01.897 "name": "BaseBdev1", 00:11:01.897 "uuid": "9b14e4e4-10ba-425a-8a09-1d55d8f966d4", 00:11:01.897 "is_configured": true, 00:11:01.897 "data_offset": 2048, 00:11:01.897 "data_size": 63488 00:11:01.897 }, 00:11:01.897 { 00:11:01.897 "name": "BaseBdev2", 00:11:01.897 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:01.897 "is_configured": false, 00:11:01.897 "data_offset": 0, 00:11:01.897 "data_size": 0 00:11:01.897 } 00:11:01.897 ] 00:11:01.897 }' 00:11:01.897 00:05:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:01.897 00:05:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:02.462 00:05:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:02.720 [2024-07-16 00:05:49.639342] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:02.720 [2024-07-16 00:05:49.639490] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe91000 00:11:02.720 [2024-07-16 00:05:49.639504] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:02.720 [2024-07-16 00:05:49.639674] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdab0c0 00:11:02.720 [2024-07-16 00:05:49.639788] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe91000 00:11:02.720 [2024-07-16 00:05:49.639798] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xe91000 00:11:02.720 [2024-07-16 00:05:49.639886] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:02.720 BaseBdev2 00:11:02.720 00:05:49 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:02.720 00:05:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:02.720 00:05:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:02.720 00:05:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:02.720 00:05:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:02.720 00:05:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:02.720 00:05:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:02.979 00:05:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:03.237 [ 00:11:03.237 { 00:11:03.237 "name": "BaseBdev2", 00:11:03.237 "aliases": [ 00:11:03.237 "b2df8824-3de1-476b-9d69-be398752a6fc" 00:11:03.237 ], 00:11:03.237 "product_name": "Malloc disk", 00:11:03.237 "block_size": 512, 00:11:03.237 "num_blocks": 65536, 00:11:03.237 "uuid": "b2df8824-3de1-476b-9d69-be398752a6fc", 00:11:03.237 "assigned_rate_limits": { 00:11:03.237 "rw_ios_per_sec": 0, 00:11:03.237 "rw_mbytes_per_sec": 0, 00:11:03.237 "r_mbytes_per_sec": 0, 00:11:03.237 "w_mbytes_per_sec": 0 00:11:03.237 }, 00:11:03.237 "claimed": true, 00:11:03.237 "claim_type": "exclusive_write", 00:11:03.237 "zoned": false, 00:11:03.237 "supported_io_types": { 00:11:03.237 "read": true, 00:11:03.237 "write": true, 00:11:03.237 "unmap": true, 00:11:03.237 "flush": true, 00:11:03.237 "reset": true, 00:11:03.237 "nvme_admin": false, 00:11:03.237 "nvme_io": false, 00:11:03.237 "nvme_io_md": false, 00:11:03.237 "write_zeroes": true, 
00:11:03.237 "zcopy": true, 00:11:03.237 "get_zone_info": false, 00:11:03.237 "zone_management": false, 00:11:03.237 "zone_append": false, 00:11:03.237 "compare": false, 00:11:03.237 "compare_and_write": false, 00:11:03.237 "abort": true, 00:11:03.237 "seek_hole": false, 00:11:03.237 "seek_data": false, 00:11:03.237 "copy": true, 00:11:03.237 "nvme_iov_md": false 00:11:03.237 }, 00:11:03.237 "memory_domains": [ 00:11:03.237 { 00:11:03.237 "dma_device_id": "system", 00:11:03.237 "dma_device_type": 1 00:11:03.237 }, 00:11:03.237 { 00:11:03.237 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:03.237 "dma_device_type": 2 00:11:03.237 } 00:11:03.237 ], 00:11:03.237 "driver_specific": {} 00:11:03.237 } 00:11:03.237 ] 00:11:03.237 00:05:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:03.237 00:05:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:03.237 00:05:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:03.237 00:05:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:11:03.237 00:05:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:03.237 00:05:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:03.237 00:05:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:03.237 00:05:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:03.237 00:05:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:03.237 00:05:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:03.237 00:05:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:11:03.237 00:05:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:03.237 00:05:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:03.237 00:05:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:03.237 00:05:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:03.496 00:05:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:03.496 "name": "Existed_Raid", 00:11:03.496 "uuid": "8f15dba0-15b9-4359-adab-b91046e8dba7", 00:11:03.496 "strip_size_kb": 64, 00:11:03.496 "state": "online", 00:11:03.496 "raid_level": "raid0", 00:11:03.496 "superblock": true, 00:11:03.496 "num_base_bdevs": 2, 00:11:03.496 "num_base_bdevs_discovered": 2, 00:11:03.496 "num_base_bdevs_operational": 2, 00:11:03.496 "base_bdevs_list": [ 00:11:03.496 { 00:11:03.496 "name": "BaseBdev1", 00:11:03.496 "uuid": "9b14e4e4-10ba-425a-8a09-1d55d8f966d4", 00:11:03.496 "is_configured": true, 00:11:03.496 "data_offset": 2048, 00:11:03.496 "data_size": 63488 00:11:03.496 }, 00:11:03.496 { 00:11:03.496 "name": "BaseBdev2", 00:11:03.496 "uuid": "b2df8824-3de1-476b-9d69-be398752a6fc", 00:11:03.496 "is_configured": true, 00:11:03.496 "data_offset": 2048, 00:11:03.496 "data_size": 63488 00:11:03.496 } 00:11:03.496 ] 00:11:03.496 }' 00:11:03.496 00:05:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:03.496 00:05:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:04.064 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:04.064 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=Existed_Raid 00:11:04.064 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:04.064 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:04.064 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:04.064 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:04.064 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:04.064 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:04.323 [2024-07-16 00:05:51.231833] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:04.323 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:04.323 "name": "Existed_Raid", 00:11:04.323 "aliases": [ 00:11:04.323 "8f15dba0-15b9-4359-adab-b91046e8dba7" 00:11:04.323 ], 00:11:04.323 "product_name": "Raid Volume", 00:11:04.323 "block_size": 512, 00:11:04.323 "num_blocks": 126976, 00:11:04.323 "uuid": "8f15dba0-15b9-4359-adab-b91046e8dba7", 00:11:04.323 "assigned_rate_limits": { 00:11:04.323 "rw_ios_per_sec": 0, 00:11:04.323 "rw_mbytes_per_sec": 0, 00:11:04.323 "r_mbytes_per_sec": 0, 00:11:04.323 "w_mbytes_per_sec": 0 00:11:04.323 }, 00:11:04.323 "claimed": false, 00:11:04.323 "zoned": false, 00:11:04.323 "supported_io_types": { 00:11:04.323 "read": true, 00:11:04.323 "write": true, 00:11:04.323 "unmap": true, 00:11:04.323 "flush": true, 00:11:04.323 "reset": true, 00:11:04.323 "nvme_admin": false, 00:11:04.323 "nvme_io": false, 00:11:04.323 "nvme_io_md": false, 00:11:04.323 "write_zeroes": true, 00:11:04.323 "zcopy": false, 00:11:04.323 "get_zone_info": false, 00:11:04.323 "zone_management": false, 00:11:04.323 
"zone_append": false, 00:11:04.323 "compare": false, 00:11:04.323 "compare_and_write": false, 00:11:04.323 "abort": false, 00:11:04.323 "seek_hole": false, 00:11:04.323 "seek_data": false, 00:11:04.323 "copy": false, 00:11:04.323 "nvme_iov_md": false 00:11:04.323 }, 00:11:04.323 "memory_domains": [ 00:11:04.323 { 00:11:04.323 "dma_device_id": "system", 00:11:04.323 "dma_device_type": 1 00:11:04.323 }, 00:11:04.323 { 00:11:04.323 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:04.323 "dma_device_type": 2 00:11:04.323 }, 00:11:04.323 { 00:11:04.323 "dma_device_id": "system", 00:11:04.323 "dma_device_type": 1 00:11:04.323 }, 00:11:04.323 { 00:11:04.323 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:04.323 "dma_device_type": 2 00:11:04.323 } 00:11:04.323 ], 00:11:04.323 "driver_specific": { 00:11:04.323 "raid": { 00:11:04.323 "uuid": "8f15dba0-15b9-4359-adab-b91046e8dba7", 00:11:04.323 "strip_size_kb": 64, 00:11:04.323 "state": "online", 00:11:04.323 "raid_level": "raid0", 00:11:04.323 "superblock": true, 00:11:04.323 "num_base_bdevs": 2, 00:11:04.323 "num_base_bdevs_discovered": 2, 00:11:04.323 "num_base_bdevs_operational": 2, 00:11:04.323 "base_bdevs_list": [ 00:11:04.323 { 00:11:04.323 "name": "BaseBdev1", 00:11:04.323 "uuid": "9b14e4e4-10ba-425a-8a09-1d55d8f966d4", 00:11:04.323 "is_configured": true, 00:11:04.323 "data_offset": 2048, 00:11:04.323 "data_size": 63488 00:11:04.323 }, 00:11:04.323 { 00:11:04.323 "name": "BaseBdev2", 00:11:04.323 "uuid": "b2df8824-3de1-476b-9d69-be398752a6fc", 00:11:04.323 "is_configured": true, 00:11:04.323 "data_offset": 2048, 00:11:04.323 "data_size": 63488 00:11:04.323 } 00:11:04.323 ] 00:11:04.323 } 00:11:04.323 } 00:11:04.323 }' 00:11:04.323 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:04.583 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:04.583 
BaseBdev2' 00:11:04.583 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:04.583 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:04.583 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:04.842 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:04.842 "name": "BaseBdev1", 00:11:04.842 "aliases": [ 00:11:04.842 "9b14e4e4-10ba-425a-8a09-1d55d8f966d4" 00:11:04.842 ], 00:11:04.843 "product_name": "Malloc disk", 00:11:04.843 "block_size": 512, 00:11:04.843 "num_blocks": 65536, 00:11:04.843 "uuid": "9b14e4e4-10ba-425a-8a09-1d55d8f966d4", 00:11:04.843 "assigned_rate_limits": { 00:11:04.843 "rw_ios_per_sec": 0, 00:11:04.843 "rw_mbytes_per_sec": 0, 00:11:04.843 "r_mbytes_per_sec": 0, 00:11:04.843 "w_mbytes_per_sec": 0 00:11:04.843 }, 00:11:04.843 "claimed": true, 00:11:04.843 "claim_type": "exclusive_write", 00:11:04.843 "zoned": false, 00:11:04.843 "supported_io_types": { 00:11:04.843 "read": true, 00:11:04.843 "write": true, 00:11:04.843 "unmap": true, 00:11:04.843 "flush": true, 00:11:04.843 "reset": true, 00:11:04.843 "nvme_admin": false, 00:11:04.843 "nvme_io": false, 00:11:04.843 "nvme_io_md": false, 00:11:04.843 "write_zeroes": true, 00:11:04.843 "zcopy": true, 00:11:04.843 "get_zone_info": false, 00:11:04.843 "zone_management": false, 00:11:04.843 "zone_append": false, 00:11:04.843 "compare": false, 00:11:04.843 "compare_and_write": false, 00:11:04.843 "abort": true, 00:11:04.843 "seek_hole": false, 00:11:04.843 "seek_data": false, 00:11:04.843 "copy": true, 00:11:04.843 "nvme_iov_md": false 00:11:04.843 }, 00:11:04.843 "memory_domains": [ 00:11:04.843 { 00:11:04.843 "dma_device_id": "system", 00:11:04.843 "dma_device_type": 1 00:11:04.843 }, 00:11:04.843 { 
00:11:04.843 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:04.843 "dma_device_type": 2 00:11:04.843 } 00:11:04.843 ], 00:11:04.843 "driver_specific": {} 00:11:04.843 }' 00:11:04.843 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:04.843 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:04.843 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:04.843 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:04.843 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:04.843 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:04.843 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:05.103 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:05.103 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:05.103 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:05.103 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:05.103 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:05.103 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:05.103 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:05.103 00:05:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:05.362 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:05.363 "name": 
"BaseBdev2", 00:11:05.363 "aliases": [ 00:11:05.363 "b2df8824-3de1-476b-9d69-be398752a6fc" 00:11:05.363 ], 00:11:05.363 "product_name": "Malloc disk", 00:11:05.363 "block_size": 512, 00:11:05.363 "num_blocks": 65536, 00:11:05.363 "uuid": "b2df8824-3de1-476b-9d69-be398752a6fc", 00:11:05.363 "assigned_rate_limits": { 00:11:05.363 "rw_ios_per_sec": 0, 00:11:05.363 "rw_mbytes_per_sec": 0, 00:11:05.363 "r_mbytes_per_sec": 0, 00:11:05.363 "w_mbytes_per_sec": 0 00:11:05.363 }, 00:11:05.363 "claimed": true, 00:11:05.363 "claim_type": "exclusive_write", 00:11:05.363 "zoned": false, 00:11:05.363 "supported_io_types": { 00:11:05.363 "read": true, 00:11:05.363 "write": true, 00:11:05.363 "unmap": true, 00:11:05.363 "flush": true, 00:11:05.363 "reset": true, 00:11:05.363 "nvme_admin": false, 00:11:05.363 "nvme_io": false, 00:11:05.363 "nvme_io_md": false, 00:11:05.363 "write_zeroes": true, 00:11:05.363 "zcopy": true, 00:11:05.363 "get_zone_info": false, 00:11:05.363 "zone_management": false, 00:11:05.363 "zone_append": false, 00:11:05.363 "compare": false, 00:11:05.363 "compare_and_write": false, 00:11:05.363 "abort": true, 00:11:05.363 "seek_hole": false, 00:11:05.363 "seek_data": false, 00:11:05.363 "copy": true, 00:11:05.363 "nvme_iov_md": false 00:11:05.363 }, 00:11:05.363 "memory_domains": [ 00:11:05.363 { 00:11:05.363 "dma_device_id": "system", 00:11:05.363 "dma_device_type": 1 00:11:05.363 }, 00:11:05.363 { 00:11:05.363 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.363 "dma_device_type": 2 00:11:05.363 } 00:11:05.363 ], 00:11:05.363 "driver_specific": {} 00:11:05.363 }' 00:11:05.363 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:05.363 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:05.363 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:05.363 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:11:05.363 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:05.622 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:05.622 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:05.622 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:05.622 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:05.622 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:05.622 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:05.622 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:05.622 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:05.882 [2024-07-16 00:05:52.779730] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:05.882 [2024-07-16 00:05:52.779756] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:05.882 [2024-07-16 00:05:52.779795] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:05.882 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:05.882 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:05.882 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:05.882 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:05.882 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:05.882 00:05:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:11:05.882 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:05.882 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:05.882 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:05.882 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:05.882 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:05.882 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:05.882 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:05.882 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:05.882 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:05.882 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:05.882 00:05:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:06.142 00:05:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:06.142 "name": "Existed_Raid", 00:11:06.142 "uuid": "8f15dba0-15b9-4359-adab-b91046e8dba7", 00:11:06.142 "strip_size_kb": 64, 00:11:06.142 "state": "offline", 00:11:06.142 "raid_level": "raid0", 00:11:06.142 "superblock": true, 00:11:06.142 "num_base_bdevs": 2, 00:11:06.142 "num_base_bdevs_discovered": 1, 00:11:06.142 "num_base_bdevs_operational": 1, 00:11:06.142 "base_bdevs_list": [ 
00:11:06.142 { 00:11:06.142 "name": null, 00:11:06.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:06.142 "is_configured": false, 00:11:06.142 "data_offset": 2048, 00:11:06.142 "data_size": 63488 00:11:06.142 }, 00:11:06.142 { 00:11:06.142 "name": "BaseBdev2", 00:11:06.142 "uuid": "b2df8824-3de1-476b-9d69-be398752a6fc", 00:11:06.142 "is_configured": true, 00:11:06.142 "data_offset": 2048, 00:11:06.142 "data_size": 63488 00:11:06.142 } 00:11:06.142 ] 00:11:06.142 }' 00:11:06.142 00:05:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:06.142 00:05:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:07.076 00:05:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:07.076 00:05:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:07.076 00:05:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:07.076 00:05:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:07.076 00:05:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:07.076 00:05:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:07.076 00:05:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:07.335 [2024-07-16 00:05:54.149282] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:07.335 [2024-07-16 00:05:54.149329] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe91000 name Existed_Raid, state offline 00:11:07.335 00:05:54 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:07.335 00:05:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:07.335 00:05:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:07.335 00:05:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:07.594 00:05:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:07.594 00:05:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:07.594 00:05:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:07.594 00:05:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 3488942 00:11:07.594 00:05:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3488942 ']' 00:11:07.594 00:05:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 3488942 00:11:07.594 00:05:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:11:07.594 00:05:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:07.594 00:05:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3488942 00:11:07.594 00:05:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:07.594 00:05:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:07.594 00:05:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3488942' 00:11:07.594 killing process with pid 3488942 00:11:07.594 00:05:54 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@967 -- # kill 3488942 00:11:07.594 [2024-07-16 00:05:54.479278] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:07.594 00:05:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 3488942 00:11:07.594 [2024-07-16 00:05:54.480430] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:07.854 00:05:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:07.854 00:11:07.854 real 0m9.695s 00:11:07.854 user 0m17.597s 00:11:07.854 sys 0m1.906s 00:11:07.854 00:05:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:07.854 00:05:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:07.854 ************************************ 00:11:07.854 END TEST raid_state_function_test_sb 00:11:07.854 ************************************ 00:11:07.854 00:05:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:07.854 00:05:54 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:11:07.854 00:05:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:07.854 00:05:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:07.854 00:05:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:07.854 ************************************ 00:11:07.854 START TEST raid_superblock_test 00:11:07.854 ************************************ 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:07.854 00:05:54 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=3490403 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 3490403 /var/tmp/spdk-raid.sock 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 3490403 ']' 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:07.854 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:07.854 00:05:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:08.113 [2024-07-16 00:05:54.851771] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:11:08.113 [2024-07-16 00:05:54.851845] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3490403 ] 00:11:08.113 [2024-07-16 00:05:54.984777] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:08.373 [2024-07-16 00:05:55.092162] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:08.373 [2024-07-16 00:05:55.150521] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:08.373 [2024-07-16 00:05:55.150553] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:08.940 00:05:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:08.940 00:05:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:11:08.940 00:05:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:08.940 00:05:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:08.940 00:05:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local 
bdev_malloc=malloc1 00:11:08.940 00:05:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:08.940 00:05:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:08.940 00:05:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:08.940 00:05:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:08.940 00:05:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:08.940 00:05:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:09.199 malloc1 00:11:09.199 00:05:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:09.458 [2024-07-16 00:05:56.271138] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:09.458 [2024-07-16 00:05:56.271187] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:09.458 [2024-07-16 00:05:56.271206] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1179570 00:11:09.458 [2024-07-16 00:05:56.271218] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:09.458 [2024-07-16 00:05:56.272763] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:09.458 [2024-07-16 00:05:56.272790] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:09.458 pt1 00:11:09.458 00:05:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:09.458 00:05:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # 
(( i <= num_base_bdevs )) 00:11:09.458 00:05:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:09.458 00:05:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:09.458 00:05:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:09.458 00:05:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:09.458 00:05:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:09.458 00:05:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:09.458 00:05:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:09.717 malloc2 00:11:09.717 00:05:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:09.976 [2024-07-16 00:05:56.769330] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:09.976 [2024-07-16 00:05:56.769376] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:09.976 [2024-07-16 00:05:56.769393] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x117a970 00:11:09.976 [2024-07-16 00:05:56.769405] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:09.976 [2024-07-16 00:05:56.770941] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:09.976 [2024-07-16 00:05:56.770971] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:09.976 pt2 00:11:09.976 00:05:56 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:09.976 00:05:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:09.976 00:05:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:11:10.236 [2024-07-16 00:05:56.969882] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:10.236 [2024-07-16 00:05:56.971123] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:10.236 [2024-07-16 00:05:56.971264] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x131d270 00:11:10.236 [2024-07-16 00:05:56.971276] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:10.236 [2024-07-16 00:05:56.971470] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1312c10 00:11:10.236 [2024-07-16 00:05:56.971611] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x131d270 00:11:10.236 [2024-07-16 00:05:56.971621] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x131d270 00:11:10.236 [2024-07-16 00:05:56.971717] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:10.236 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:10.236 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:10.236 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:10.236 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:10.236 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:10.236 00:05:57 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:10.236 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:10.236 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:10.236 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:10.236 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:10.236 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:10.236 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:10.495 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:10.495 "name": "raid_bdev1", 00:11:10.495 "uuid": "ab553135-779b-4aa7-97a0-777521dbeca9", 00:11:10.495 "strip_size_kb": 64, 00:11:10.495 "state": "online", 00:11:10.495 "raid_level": "raid0", 00:11:10.495 "superblock": true, 00:11:10.495 "num_base_bdevs": 2, 00:11:10.495 "num_base_bdevs_discovered": 2, 00:11:10.495 "num_base_bdevs_operational": 2, 00:11:10.495 "base_bdevs_list": [ 00:11:10.495 { 00:11:10.495 "name": "pt1", 00:11:10.495 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:10.495 "is_configured": true, 00:11:10.495 "data_offset": 2048, 00:11:10.495 "data_size": 63488 00:11:10.495 }, 00:11:10.495 { 00:11:10.495 "name": "pt2", 00:11:10.495 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:10.495 "is_configured": true, 00:11:10.495 "data_offset": 2048, 00:11:10.495 "data_size": 63488 00:11:10.495 } 00:11:10.495 ] 00:11:10.495 }' 00:11:10.495 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:10.495 00:05:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:11.102 00:05:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:11.102 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:11.102 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:11.102 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:11.102 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:11.102 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:11.102 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:11.102 00:05:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:11.378 [2024-07-16 00:05:58.060994] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:11.378 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:11.378 "name": "raid_bdev1", 00:11:11.378 "aliases": [ 00:11:11.378 "ab553135-779b-4aa7-97a0-777521dbeca9" 00:11:11.378 ], 00:11:11.378 "product_name": "Raid Volume", 00:11:11.378 "block_size": 512, 00:11:11.378 "num_blocks": 126976, 00:11:11.378 "uuid": "ab553135-779b-4aa7-97a0-777521dbeca9", 00:11:11.378 "assigned_rate_limits": { 00:11:11.378 "rw_ios_per_sec": 0, 00:11:11.378 "rw_mbytes_per_sec": 0, 00:11:11.378 "r_mbytes_per_sec": 0, 00:11:11.378 "w_mbytes_per_sec": 0 00:11:11.378 }, 00:11:11.378 "claimed": false, 00:11:11.378 "zoned": false, 00:11:11.378 "supported_io_types": { 00:11:11.378 "read": true, 00:11:11.379 "write": true, 00:11:11.379 "unmap": true, 00:11:11.379 "flush": true, 00:11:11.379 "reset": true, 00:11:11.379 "nvme_admin": false, 00:11:11.379 "nvme_io": false, 00:11:11.379 "nvme_io_md": false, 00:11:11.379 "write_zeroes": 
true, 00:11:11.379 "zcopy": false, 00:11:11.379 "get_zone_info": false, 00:11:11.379 "zone_management": false, 00:11:11.379 "zone_append": false, 00:11:11.379 "compare": false, 00:11:11.379 "compare_and_write": false, 00:11:11.379 "abort": false, 00:11:11.379 "seek_hole": false, 00:11:11.379 "seek_data": false, 00:11:11.379 "copy": false, 00:11:11.379 "nvme_iov_md": false 00:11:11.379 }, 00:11:11.379 "memory_domains": [ 00:11:11.379 { 00:11:11.379 "dma_device_id": "system", 00:11:11.379 "dma_device_type": 1 00:11:11.379 }, 00:11:11.379 { 00:11:11.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:11.379 "dma_device_type": 2 00:11:11.379 }, 00:11:11.379 { 00:11:11.379 "dma_device_id": "system", 00:11:11.379 "dma_device_type": 1 00:11:11.379 }, 00:11:11.379 { 00:11:11.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:11.379 "dma_device_type": 2 00:11:11.379 } 00:11:11.379 ], 00:11:11.379 "driver_specific": { 00:11:11.379 "raid": { 00:11:11.379 "uuid": "ab553135-779b-4aa7-97a0-777521dbeca9", 00:11:11.379 "strip_size_kb": 64, 00:11:11.379 "state": "online", 00:11:11.379 "raid_level": "raid0", 00:11:11.379 "superblock": true, 00:11:11.379 "num_base_bdevs": 2, 00:11:11.379 "num_base_bdevs_discovered": 2, 00:11:11.379 "num_base_bdevs_operational": 2, 00:11:11.379 "base_bdevs_list": [ 00:11:11.379 { 00:11:11.379 "name": "pt1", 00:11:11.379 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:11.379 "is_configured": true, 00:11:11.379 "data_offset": 2048, 00:11:11.379 "data_size": 63488 00:11:11.379 }, 00:11:11.379 { 00:11:11.379 "name": "pt2", 00:11:11.379 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:11.379 "is_configured": true, 00:11:11.379 "data_offset": 2048, 00:11:11.379 "data_size": 63488 00:11:11.379 } 00:11:11.379 ] 00:11:11.379 } 00:11:11.379 } 00:11:11.379 }' 00:11:11.379 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:11.379 00:05:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:11.379 pt2' 00:11:11.379 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:11.379 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:11.379 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:11.638 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:11.638 "name": "pt1", 00:11:11.638 "aliases": [ 00:11:11.638 "00000000-0000-0000-0000-000000000001" 00:11:11.638 ], 00:11:11.638 "product_name": "passthru", 00:11:11.638 "block_size": 512, 00:11:11.638 "num_blocks": 65536, 00:11:11.638 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:11.638 "assigned_rate_limits": { 00:11:11.638 "rw_ios_per_sec": 0, 00:11:11.638 "rw_mbytes_per_sec": 0, 00:11:11.638 "r_mbytes_per_sec": 0, 00:11:11.638 "w_mbytes_per_sec": 0 00:11:11.638 }, 00:11:11.638 "claimed": true, 00:11:11.638 "claim_type": "exclusive_write", 00:11:11.638 "zoned": false, 00:11:11.638 "supported_io_types": { 00:11:11.638 "read": true, 00:11:11.638 "write": true, 00:11:11.638 "unmap": true, 00:11:11.638 "flush": true, 00:11:11.638 "reset": true, 00:11:11.638 "nvme_admin": false, 00:11:11.638 "nvme_io": false, 00:11:11.638 "nvme_io_md": false, 00:11:11.638 "write_zeroes": true, 00:11:11.638 "zcopy": true, 00:11:11.638 "get_zone_info": false, 00:11:11.638 "zone_management": false, 00:11:11.638 "zone_append": false, 00:11:11.638 "compare": false, 00:11:11.638 "compare_and_write": false, 00:11:11.638 "abort": true, 00:11:11.638 "seek_hole": false, 00:11:11.638 "seek_data": false, 00:11:11.638 "copy": true, 00:11:11.638 "nvme_iov_md": false 00:11:11.638 }, 00:11:11.638 "memory_domains": [ 00:11:11.638 { 00:11:11.638 "dma_device_id": "system", 00:11:11.638 
"dma_device_type": 1 00:11:11.638 }, 00:11:11.638 { 00:11:11.638 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:11.638 "dma_device_type": 2 00:11:11.638 } 00:11:11.638 ], 00:11:11.638 "driver_specific": { 00:11:11.638 "passthru": { 00:11:11.638 "name": "pt1", 00:11:11.638 "base_bdev_name": "malloc1" 00:11:11.638 } 00:11:11.638 } 00:11:11.638 }' 00:11:11.638 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:11.638 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:11.638 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:11.638 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:11.638 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:11.897 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:11.897 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:11.897 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:11.897 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:11.897 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:11.897 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:11.897 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:11.897 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:11.897 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:11.897 00:05:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:12.156 00:05:58 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:12.156 "name": "pt2", 00:11:12.156 "aliases": [ 00:11:12.156 "00000000-0000-0000-0000-000000000002" 00:11:12.156 ], 00:11:12.156 "product_name": "passthru", 00:11:12.156 "block_size": 512, 00:11:12.156 "num_blocks": 65536, 00:11:12.156 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:12.156 "assigned_rate_limits": { 00:11:12.156 "rw_ios_per_sec": 0, 00:11:12.156 "rw_mbytes_per_sec": 0, 00:11:12.156 "r_mbytes_per_sec": 0, 00:11:12.156 "w_mbytes_per_sec": 0 00:11:12.156 }, 00:11:12.156 "claimed": true, 00:11:12.156 "claim_type": "exclusive_write", 00:11:12.156 "zoned": false, 00:11:12.156 "supported_io_types": { 00:11:12.156 "read": true, 00:11:12.156 "write": true, 00:11:12.156 "unmap": true, 00:11:12.156 "flush": true, 00:11:12.156 "reset": true, 00:11:12.156 "nvme_admin": false, 00:11:12.156 "nvme_io": false, 00:11:12.156 "nvme_io_md": false, 00:11:12.156 "write_zeroes": true, 00:11:12.156 "zcopy": true, 00:11:12.156 "get_zone_info": false, 00:11:12.156 "zone_management": false, 00:11:12.156 "zone_append": false, 00:11:12.156 "compare": false, 00:11:12.156 "compare_and_write": false, 00:11:12.156 "abort": true, 00:11:12.156 "seek_hole": false, 00:11:12.156 "seek_data": false, 00:11:12.156 "copy": true, 00:11:12.156 "nvme_iov_md": false 00:11:12.156 }, 00:11:12.156 "memory_domains": [ 00:11:12.156 { 00:11:12.156 "dma_device_id": "system", 00:11:12.156 "dma_device_type": 1 00:11:12.156 }, 00:11:12.156 { 00:11:12.156 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:12.156 "dma_device_type": 2 00:11:12.156 } 00:11:12.156 ], 00:11:12.156 "driver_specific": { 00:11:12.156 "passthru": { 00:11:12.156 "name": "pt2", 00:11:12.156 "base_bdev_name": "malloc2" 00:11:12.156 } 00:11:12.156 } 00:11:12.156 }' 00:11:12.156 00:05:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:12.156 00:05:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:12.156 00:05:59 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:12.156 00:05:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:12.415 00:05:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:12.415 00:05:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:12.415 00:05:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:12.415 00:05:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:12.415 00:05:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:12.415 00:05:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:12.415 00:05:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:12.416 00:05:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:12.416 00:05:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:12.416 00:05:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:12.674 [2024-07-16 00:05:59.577006] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:12.674 00:05:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=ab553135-779b-4aa7-97a0-777521dbeca9 00:11:12.675 00:05:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z ab553135-779b-4aa7-97a0-777521dbeca9 ']' 00:11:12.675 00:05:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:12.933 [2024-07-16 00:05:59.821403] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:12.933 
[2024-07-16 00:05:59.821424] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:12.933 [2024-07-16 00:05:59.821476] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:12.933 [2024-07-16 00:05:59.821518] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:12.933 [2024-07-16 00:05:59.821530] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x131d270 name raid_bdev1, state offline 00:11:12.933 00:05:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:12.933 00:05:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:13.193 00:06:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:13.193 00:06:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:13.193 00:06:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:13.193 00:06:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:13.451 00:06:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:13.451 00:06:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:13.710 00:06:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:13.710 00:06:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:13.969 00:06:00 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:11:13.969 00:06:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:13.969 00:06:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:11:13.969 00:06:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:13.969 00:06:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:13.969 00:06:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:13.969 00:06:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:13.969 00:06:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:13.969 00:06:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:13.969 00:06:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:13.969 00:06:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:13.969 00:06:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:13.969 00:06:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:14.228 [2024-07-16 00:06:01.088712] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:14.228 [2024-07-16 00:06:01.090119] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:14.228 [2024-07-16 00:06:01.090177] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:14.228 [2024-07-16 00:06:01.090219] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:14.228 [2024-07-16 00:06:01.090238] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:14.228 [2024-07-16 00:06:01.090247] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x131cff0 name raid_bdev1, state configuring 00:11:14.228 request: 00:11:14.228 { 00:11:14.228 "name": "raid_bdev1", 00:11:14.228 "raid_level": "raid0", 00:11:14.228 "base_bdevs": [ 00:11:14.228 "malloc1", 00:11:14.228 "malloc2" 00:11:14.228 ], 00:11:14.228 "strip_size_kb": 64, 00:11:14.228 "superblock": false, 00:11:14.228 "method": "bdev_raid_create", 00:11:14.228 "req_id": 1 00:11:14.228 } 00:11:14.228 Got JSON-RPC error response 00:11:14.228 response: 00:11:14.228 { 00:11:14.228 "code": -17, 00:11:14.228 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:14.228 } 00:11:14.228 00:06:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:11:14.228 00:06:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:14.228 00:06:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:11:14.228 00:06:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:14.228 00:06:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:14.228 00:06:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:11:14.487 00:06:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:11:14.487 00:06:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:11:14.487 00:06:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:14.746 [2024-07-16 00:06:01.577932] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:14.746 [2024-07-16 00:06:01.577974] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:14.746 [2024-07-16 00:06:01.577995] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11797a0 00:11:14.746 [2024-07-16 00:06:01.578007] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:14.746 [2024-07-16 00:06:01.579602] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:14.746 [2024-07-16 00:06:01.579632] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:14.746 [2024-07-16 00:06:01.579699] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:14.746 [2024-07-16 00:06:01.579725] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:14.746 pt1 00:11:14.746 00:06:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:11:14.746 00:06:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:14.746 00:06:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:14.746 00:06:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:14.746 00:06:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:14.746 00:06:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:14.746 00:06:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:14.746 00:06:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:14.746 00:06:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:14.746 00:06:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:14.746 00:06:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:14.746 00:06:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:15.005 00:06:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:15.005 "name": "raid_bdev1", 00:11:15.005 "uuid": "ab553135-779b-4aa7-97a0-777521dbeca9", 00:11:15.005 "strip_size_kb": 64, 00:11:15.005 "state": "configuring", 00:11:15.005 "raid_level": "raid0", 00:11:15.005 "superblock": true, 00:11:15.005 "num_base_bdevs": 2, 00:11:15.005 "num_base_bdevs_discovered": 1, 00:11:15.005 "num_base_bdevs_operational": 2, 00:11:15.005 "base_bdevs_list": [ 00:11:15.005 { 00:11:15.005 "name": "pt1", 00:11:15.005 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:15.005 "is_configured": true, 00:11:15.005 "data_offset": 2048, 00:11:15.005 "data_size": 63488 00:11:15.005 }, 00:11:15.005 { 00:11:15.005 "name": null, 00:11:15.005 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:15.005 
"is_configured": false, 00:11:15.005 "data_offset": 2048, 00:11:15.005 "data_size": 63488 00:11:15.005 } 00:11:15.005 ] 00:11:15.005 }' 00:11:15.005 00:06:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:15.005 00:06:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:15.572 00:06:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:11:15.572 00:06:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:11:15.572 00:06:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:15.572 00:06:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:15.831 [2024-07-16 00:06:02.708960] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:15.831 [2024-07-16 00:06:02.709009] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:15.831 [2024-07-16 00:06:02.709026] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1313820 00:11:15.831 [2024-07-16 00:06:02.709039] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:15.831 [2024-07-16 00:06:02.709388] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:15.831 [2024-07-16 00:06:02.709405] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:15.831 [2024-07-16 00:06:02.709469] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:15.831 [2024-07-16 00:06:02.709488] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:15.831 [2024-07-16 00:06:02.709582] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x116fec0 00:11:15.831 [2024-07-16 
00:06:02.709592] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:15.831 [2024-07-16 00:06:02.709760] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1172530 00:11:15.831 [2024-07-16 00:06:02.709878] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x116fec0 00:11:15.831 [2024-07-16 00:06:02.709888] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x116fec0 00:11:15.831 [2024-07-16 00:06:02.709996] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:15.831 pt2 00:11:15.831 00:06:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:11:15.831 00:06:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:15.831 00:06:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:15.831 00:06:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:15.831 00:06:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:15.831 00:06:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:15.831 00:06:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:15.831 00:06:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:15.831 00:06:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:15.831 00:06:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:15.831 00:06:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:15.831 00:06:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:15.831 00:06:02 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:15.831 00:06:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:16.090 00:06:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:16.090 "name": "raid_bdev1", 00:11:16.090 "uuid": "ab553135-779b-4aa7-97a0-777521dbeca9", 00:11:16.090 "strip_size_kb": 64, 00:11:16.090 "state": "online", 00:11:16.090 "raid_level": "raid0", 00:11:16.090 "superblock": true, 00:11:16.090 "num_base_bdevs": 2, 00:11:16.090 "num_base_bdevs_discovered": 2, 00:11:16.090 "num_base_bdevs_operational": 2, 00:11:16.090 "base_bdevs_list": [ 00:11:16.090 { 00:11:16.090 "name": "pt1", 00:11:16.090 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:16.090 "is_configured": true, 00:11:16.090 "data_offset": 2048, 00:11:16.090 "data_size": 63488 00:11:16.090 }, 00:11:16.090 { 00:11:16.090 "name": "pt2", 00:11:16.090 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:16.090 "is_configured": true, 00:11:16.090 "data_offset": 2048, 00:11:16.090 "data_size": 63488 00:11:16.090 } 00:11:16.090 ] 00:11:16.090 }' 00:11:16.090 00:06:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:16.090 00:06:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:17.024 00:06:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:11:17.024 00:06:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:17.024 00:06:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:17.024 00:06:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:17.024 00:06:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:17.024 00:06:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:17.025 00:06:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:17.025 00:06:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:17.025 [2024-07-16 00:06:03.832188] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:17.025 00:06:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:17.025 "name": "raid_bdev1", 00:11:17.025 "aliases": [ 00:11:17.025 "ab553135-779b-4aa7-97a0-777521dbeca9" 00:11:17.025 ], 00:11:17.025 "product_name": "Raid Volume", 00:11:17.025 "block_size": 512, 00:11:17.025 "num_blocks": 126976, 00:11:17.025 "uuid": "ab553135-779b-4aa7-97a0-777521dbeca9", 00:11:17.025 "assigned_rate_limits": { 00:11:17.025 "rw_ios_per_sec": 0, 00:11:17.025 "rw_mbytes_per_sec": 0, 00:11:17.025 "r_mbytes_per_sec": 0, 00:11:17.025 "w_mbytes_per_sec": 0 00:11:17.025 }, 00:11:17.025 "claimed": false, 00:11:17.025 "zoned": false, 00:11:17.025 "supported_io_types": { 00:11:17.025 "read": true, 00:11:17.025 "write": true, 00:11:17.025 "unmap": true, 00:11:17.025 "flush": true, 00:11:17.025 "reset": true, 00:11:17.025 "nvme_admin": false, 00:11:17.025 "nvme_io": false, 00:11:17.025 "nvme_io_md": false, 00:11:17.025 "write_zeroes": true, 00:11:17.025 "zcopy": false, 00:11:17.025 "get_zone_info": false, 00:11:17.025 "zone_management": false, 00:11:17.025 "zone_append": false, 00:11:17.025 "compare": false, 00:11:17.025 "compare_and_write": false, 00:11:17.025 "abort": false, 00:11:17.025 "seek_hole": false, 00:11:17.025 "seek_data": false, 00:11:17.025 "copy": false, 00:11:17.025 "nvme_iov_md": false 00:11:17.025 }, 00:11:17.025 "memory_domains": [ 00:11:17.025 { 00:11:17.025 "dma_device_id": "system", 00:11:17.025 "dma_device_type": 1 00:11:17.025 }, 00:11:17.025 { 
00:11:17.025 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:17.025 "dma_device_type": 2 00:11:17.025 }, 00:11:17.025 { 00:11:17.025 "dma_device_id": "system", 00:11:17.025 "dma_device_type": 1 00:11:17.025 }, 00:11:17.025 { 00:11:17.025 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:17.025 "dma_device_type": 2 00:11:17.025 } 00:11:17.025 ], 00:11:17.025 "driver_specific": { 00:11:17.025 "raid": { 00:11:17.025 "uuid": "ab553135-779b-4aa7-97a0-777521dbeca9", 00:11:17.025 "strip_size_kb": 64, 00:11:17.025 "state": "online", 00:11:17.025 "raid_level": "raid0", 00:11:17.025 "superblock": true, 00:11:17.025 "num_base_bdevs": 2, 00:11:17.025 "num_base_bdevs_discovered": 2, 00:11:17.025 "num_base_bdevs_operational": 2, 00:11:17.025 "base_bdevs_list": [ 00:11:17.025 { 00:11:17.025 "name": "pt1", 00:11:17.025 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:17.025 "is_configured": true, 00:11:17.025 "data_offset": 2048, 00:11:17.025 "data_size": 63488 00:11:17.025 }, 00:11:17.025 { 00:11:17.025 "name": "pt2", 00:11:17.025 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:17.025 "is_configured": true, 00:11:17.025 "data_offset": 2048, 00:11:17.025 "data_size": 63488 00:11:17.025 } 00:11:17.025 ] 00:11:17.025 } 00:11:17.025 } 00:11:17.025 }' 00:11:17.025 00:06:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:17.025 00:06:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:17.025 pt2' 00:11:17.025 00:06:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:17.025 00:06:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:17.025 00:06:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:17.284 00:06:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:17.284 "name": "pt1", 00:11:17.284 "aliases": [ 00:11:17.284 "00000000-0000-0000-0000-000000000001" 00:11:17.284 ], 00:11:17.284 "product_name": "passthru", 00:11:17.284 "block_size": 512, 00:11:17.284 "num_blocks": 65536, 00:11:17.284 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:17.284 "assigned_rate_limits": { 00:11:17.284 "rw_ios_per_sec": 0, 00:11:17.284 "rw_mbytes_per_sec": 0, 00:11:17.284 "r_mbytes_per_sec": 0, 00:11:17.284 "w_mbytes_per_sec": 0 00:11:17.284 }, 00:11:17.284 "claimed": true, 00:11:17.284 "claim_type": "exclusive_write", 00:11:17.284 "zoned": false, 00:11:17.284 "supported_io_types": { 00:11:17.284 "read": true, 00:11:17.284 "write": true, 00:11:17.284 "unmap": true, 00:11:17.284 "flush": true, 00:11:17.284 "reset": true, 00:11:17.284 "nvme_admin": false, 00:11:17.284 "nvme_io": false, 00:11:17.284 "nvme_io_md": false, 00:11:17.285 "write_zeroes": true, 00:11:17.285 "zcopy": true, 00:11:17.285 "get_zone_info": false, 00:11:17.285 "zone_management": false, 00:11:17.285 "zone_append": false, 00:11:17.285 "compare": false, 00:11:17.285 "compare_and_write": false, 00:11:17.285 "abort": true, 00:11:17.285 "seek_hole": false, 00:11:17.285 "seek_data": false, 00:11:17.285 "copy": true, 00:11:17.285 "nvme_iov_md": false 00:11:17.285 }, 00:11:17.285 "memory_domains": [ 00:11:17.285 { 00:11:17.285 "dma_device_id": "system", 00:11:17.285 "dma_device_type": 1 00:11:17.285 }, 00:11:17.285 { 00:11:17.285 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:17.285 "dma_device_type": 2 00:11:17.285 } 00:11:17.285 ], 00:11:17.285 "driver_specific": { 00:11:17.285 "passthru": { 00:11:17.285 "name": "pt1", 00:11:17.285 "base_bdev_name": "malloc1" 00:11:17.285 } 00:11:17.285 } 00:11:17.285 }' 00:11:17.285 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:17.285 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:11:17.544 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:17.544 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:17.544 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:17.544 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:17.544 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:17.544 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:17.544 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:17.544 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:17.544 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:17.803 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:17.803 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:17.803 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:17.803 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:18.062 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:18.062 "name": "pt2", 00:11:18.062 "aliases": [ 00:11:18.062 "00000000-0000-0000-0000-000000000002" 00:11:18.062 ], 00:11:18.062 "product_name": "passthru", 00:11:18.062 "block_size": 512, 00:11:18.062 "num_blocks": 65536, 00:11:18.062 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:18.062 "assigned_rate_limits": { 00:11:18.062 "rw_ios_per_sec": 0, 00:11:18.062 "rw_mbytes_per_sec": 0, 00:11:18.062 "r_mbytes_per_sec": 0, 00:11:18.062 "w_mbytes_per_sec": 0 00:11:18.062 }, 
00:11:18.062 "claimed": true, 00:11:18.062 "claim_type": "exclusive_write", 00:11:18.062 "zoned": false, 00:11:18.062 "supported_io_types": { 00:11:18.062 "read": true, 00:11:18.062 "write": true, 00:11:18.062 "unmap": true, 00:11:18.062 "flush": true, 00:11:18.062 "reset": true, 00:11:18.062 "nvme_admin": false, 00:11:18.062 "nvme_io": false, 00:11:18.062 "nvme_io_md": false, 00:11:18.062 "write_zeroes": true, 00:11:18.062 "zcopy": true, 00:11:18.062 "get_zone_info": false, 00:11:18.062 "zone_management": false, 00:11:18.062 "zone_append": false, 00:11:18.062 "compare": false, 00:11:18.062 "compare_and_write": false, 00:11:18.062 "abort": true, 00:11:18.062 "seek_hole": false, 00:11:18.062 "seek_data": false, 00:11:18.062 "copy": true, 00:11:18.062 "nvme_iov_md": false 00:11:18.062 }, 00:11:18.062 "memory_domains": [ 00:11:18.062 { 00:11:18.062 "dma_device_id": "system", 00:11:18.062 "dma_device_type": 1 00:11:18.062 }, 00:11:18.062 { 00:11:18.062 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:18.062 "dma_device_type": 2 00:11:18.062 } 00:11:18.062 ], 00:11:18.062 "driver_specific": { 00:11:18.062 "passthru": { 00:11:18.062 "name": "pt2", 00:11:18.062 "base_bdev_name": "malloc2" 00:11:18.063 } 00:11:18.063 } 00:11:18.063 }' 00:11:18.063 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:18.063 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:18.063 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:18.063 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:18.063 00:06:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:18.063 00:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:18.063 00:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:18.321 00:06:05 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:18.321 00:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:18.321 00:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:18.321 00:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:18.321 00:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:18.321 00:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:18.321 00:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:11:18.889 [2024-07-16 00:06:05.657013] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:18.889 00:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' ab553135-779b-4aa7-97a0-777521dbeca9 '!=' ab553135-779b-4aa7-97a0-777521dbeca9 ']' 00:11:18.889 00:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:11:18.889 00:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:18.889 00:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:18.889 00:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 3490403 00:11:18.889 00:06:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 3490403 ']' 00:11:18.889 00:06:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 3490403 00:11:18.889 00:06:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:11:18.889 00:06:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:18.889 00:06:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3490403 00:11:18.889 
00:06:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:18.889 00:06:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:18.890 00:06:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3490403' 00:11:18.890 killing process with pid 3490403 00:11:18.890 00:06:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 3490403 00:11:18.890 [2024-07-16 00:06:05.743561] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:18.890 [2024-07-16 00:06:05.743615] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:18.890 [2024-07-16 00:06:05.743659] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:18.890 [2024-07-16 00:06:05.743671] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x116fec0 name raid_bdev1, state offline 00:11:18.890 00:06:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 3490403 00:11:18.890 [2024-07-16 00:06:05.762354] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:19.148 00:06:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:11:19.148 00:11:19.148 real 0m11.204s 00:11:19.148 user 0m20.071s 00:11:19.148 sys 0m2.025s 00:11:19.148 00:06:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:19.148 00:06:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:19.148 ************************************ 00:11:19.148 END TEST raid_superblock_test 00:11:19.148 ************************************ 00:11:19.148 00:06:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:19.148 00:06:06 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:11:19.148 00:06:06 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:19.148 00:06:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:19.148 00:06:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:19.148 ************************************ 00:11:19.148 START TEST raid_read_error_test 00:11:19.148 ************************************ 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@793 -- # local strip_size 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.F2N0N3OhDi 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3492503 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3492503 /var/tmp/spdk-raid.sock 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 3492503 ']' 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:19.148 00:06:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:11:19.149 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:19.149 00:06:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:19.149 00:06:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:19.407 [2024-07-16 00:06:06.151938] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:11:19.407 [2024-07-16 00:06:06.152007] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3492503 ] 00:11:19.407 [2024-07-16 00:06:06.281584] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:19.666 [2024-07-16 00:06:06.392392] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:19.666 [2024-07-16 00:06:06.454931] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:19.666 [2024-07-16 00:06:06.454960] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:20.234 00:06:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:20.234 00:06:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:20.234 00:06:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:20.234 00:06:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:20.493 BaseBdev1_malloc 00:11:20.493 00:06:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:20.752 true 00:11:20.752 00:06:07 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:21.321 [2024-07-16 00:06:08.104749] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:21.321 [2024-07-16 00:06:08.104797] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:21.321 [2024-07-16 00:06:08.104818] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26660d0 00:11:21.321 [2024-07-16 00:06:08.104831] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:21.321 [2024-07-16 00:06:08.106741] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:21.321 [2024-07-16 00:06:08.106770] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:21.321 BaseBdev1 00:11:21.321 00:06:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:21.321 00:06:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:21.887 BaseBdev2_malloc 00:11:21.887 00:06:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:22.146 true 00:11:22.146 00:06:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:22.722 [2024-07-16 00:06:09.430070] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:22.722 [2024-07-16 00:06:09.430115] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:22.722 [2024-07-16 00:06:09.430137] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x266a910 00:11:22.722 [2024-07-16 00:06:09.430149] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:22.722 [2024-07-16 00:06:09.431756] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:22.722 [2024-07-16 00:06:09.431784] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:22.722 BaseBdev2 00:11:22.722 00:06:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:23.035 [2024-07-16 00:06:09.943445] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:23.035 [2024-07-16 00:06:09.944828] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:23.035 [2024-07-16 00:06:09.945029] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x266c320 00:11:23.035 [2024-07-16 00:06:09.945043] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:23.035 [2024-07-16 00:06:09.945241] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x266b270 00:11:23.035 [2024-07-16 00:06:09.945385] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x266c320 00:11:23.035 [2024-07-16 00:06:09.945395] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x266c320 00:11:23.035 [2024-07-16 00:06:09.945502] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:23.035 00:06:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:23.036 00:06:09 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:23.036 00:06:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:23.036 00:06:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:23.036 00:06:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:23.036 00:06:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:23.036 00:06:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:23.036 00:06:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:23.036 00:06:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:23.036 00:06:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:23.036 00:06:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:23.036 00:06:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:23.601 00:06:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:23.601 "name": "raid_bdev1", 00:11:23.601 "uuid": "2b15e6a3-b963-4d3c-9a43-cdbe59448807", 00:11:23.601 "strip_size_kb": 64, 00:11:23.601 "state": "online", 00:11:23.601 "raid_level": "raid0", 00:11:23.601 "superblock": true, 00:11:23.601 "num_base_bdevs": 2, 00:11:23.601 "num_base_bdevs_discovered": 2, 00:11:23.601 "num_base_bdevs_operational": 2, 00:11:23.601 "base_bdevs_list": [ 00:11:23.601 { 00:11:23.601 "name": "BaseBdev1", 00:11:23.601 "uuid": "660e33a2-1e61-5613-9740-0558e9887e95", 00:11:23.601 "is_configured": true, 00:11:23.601 "data_offset": 2048, 00:11:23.601 "data_size": 63488 00:11:23.601 }, 
00:11:23.601 { 00:11:23.601 "name": "BaseBdev2", 00:11:23.601 "uuid": "be056001-e943-59ad-81bd-564e9233e63d", 00:11:23.601 "is_configured": true, 00:11:23.601 "data_offset": 2048, 00:11:23.601 "data_size": 63488 00:11:23.601 } 00:11:23.601 ] 00:11:23.601 }' 00:11:23.601 00:06:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:23.601 00:06:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:24.536 00:06:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:24.536 00:06:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:24.536 [2024-07-16 00:06:11.251196] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26679b0 00:11:25.472 00:06:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:25.472 00:06:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:25.472 00:06:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:25.472 00:06:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:25.472 00:06:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:25.472 00:06:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:25.472 00:06:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:25.472 00:06:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:25.472 00:06:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:11:25.472 00:06:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:25.472 00:06:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:25.472 00:06:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:25.472 00:06:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:25.472 00:06:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:25.472 00:06:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:25.472 00:06:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:25.731 00:06:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:25.731 "name": "raid_bdev1", 00:11:25.731 "uuid": "2b15e6a3-b963-4d3c-9a43-cdbe59448807", 00:11:25.731 "strip_size_kb": 64, 00:11:25.731 "state": "online", 00:11:25.731 "raid_level": "raid0", 00:11:25.731 "superblock": true, 00:11:25.731 "num_base_bdevs": 2, 00:11:25.731 "num_base_bdevs_discovered": 2, 00:11:25.731 "num_base_bdevs_operational": 2, 00:11:25.731 "base_bdevs_list": [ 00:11:25.731 { 00:11:25.731 "name": "BaseBdev1", 00:11:25.731 "uuid": "660e33a2-1e61-5613-9740-0558e9887e95", 00:11:25.731 "is_configured": true, 00:11:25.731 "data_offset": 2048, 00:11:25.731 "data_size": 63488 00:11:25.731 }, 00:11:25.731 { 00:11:25.731 "name": "BaseBdev2", 00:11:25.731 "uuid": "be056001-e943-59ad-81bd-564e9233e63d", 00:11:25.731 "is_configured": true, 00:11:25.731 "data_offset": 2048, 00:11:25.731 "data_size": 63488 00:11:25.731 } 00:11:25.731 ] 00:11:25.731 }' 00:11:25.731 00:06:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:25.731 00:06:12 bdev_raid.raid_read_error_test 
-- common/autotest_common.sh@10 -- # set +x 00:11:26.699 00:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:26.958 [2024-07-16 00:06:13.760449] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:26.958 [2024-07-16 00:06:13.760482] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:26.958 [2024-07-16 00:06:13.763651] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:26.958 [2024-07-16 00:06:13.763680] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:26.958 [2024-07-16 00:06:13.763707] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:26.958 [2024-07-16 00:06:13.763718] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x266c320 name raid_bdev1, state offline 00:11:26.958 0 00:11:26.958 00:06:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3492503 00:11:26.958 00:06:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 3492503 ']' 00:11:26.958 00:06:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 3492503 00:11:26.958 00:06:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:11:26.958 00:06:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:26.958 00:06:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3492503 00:11:26.958 00:06:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:26.958 00:06:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:26.958 00:06:13 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 3492503' 00:11:26.958 killing process with pid 3492503 00:11:26.958 00:06:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 3492503 00:11:26.958 [2024-07-16 00:06:13.844399] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:26.958 00:06:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 3492503 00:11:26.958 [2024-07-16 00:06:13.855233] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:27.217 00:06:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.F2N0N3OhDi 00:11:27.217 00:06:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:27.217 00:06:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:27.217 00:06:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.40 00:11:27.217 00:06:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:27.217 00:06:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:27.217 00:06:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:27.217 00:06:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.40 != \0\.\0\0 ]] 00:11:27.217 00:11:27.217 real 0m8.021s 00:11:27.217 user 0m13.038s 00:11:27.217 sys 0m1.357s 00:11:27.217 00:06:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:27.217 00:06:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:27.217 ************************************ 00:11:27.217 END TEST raid_read_error_test 00:11:27.217 ************************************ 00:11:27.217 00:06:14 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:27.217 00:06:14 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 
00:11:27.217 00:06:14 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:27.217 00:06:14 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:27.217 00:06:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:27.476 ************************************ 00:11:27.476 START TEST raid_write_error_test 00:11:27.476 ************************************ 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 
00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.t1EhCPPCSz 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3493696 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3493696 /var/tmp/spdk-raid.sock 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 3493696 ']' 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:11:27.476 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:27.476 00:06:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:27.476 [2024-07-16 00:06:14.257703] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:11:27.476 [2024-07-16 00:06:14.257768] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3493696 ] 00:11:27.476 [2024-07-16 00:06:14.386474] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:27.735 [2024-07-16 00:06:14.488063] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:27.735 [2024-07-16 00:06:14.550836] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:27.735 [2024-07-16 00:06:14.550874] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:28.301 00:06:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:28.301 00:06:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:28.301 00:06:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:28.301 00:06:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:28.559 BaseBdev1_malloc 00:11:28.559 00:06:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:28.817 true 00:11:28.817 00:06:15 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:29.076 [2024-07-16 00:06:15.910737] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:29.076 [2024-07-16 00:06:15.910781] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:29.076 [2024-07-16 00:06:15.910802] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a350d0 00:11:29.076 [2024-07-16 00:06:15.910820] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:29.076 [2024-07-16 00:06:15.912683] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:29.076 [2024-07-16 00:06:15.912713] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:29.076 BaseBdev1 00:11:29.076 00:06:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:29.076 00:06:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:29.335 BaseBdev2_malloc 00:11:29.335 00:06:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:29.594 true 00:11:29.594 00:06:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:29.853 [2024-07-16 00:06:16.649488] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:29.853 [2024-07-16 00:06:16.649531] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:29.853 [2024-07-16 00:06:16.649553] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a39910 00:11:29.853 [2024-07-16 00:06:16.649566] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:29.853 [2024-07-16 00:06:16.651167] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:29.853 [2024-07-16 00:06:16.651195] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:29.853 BaseBdev2 00:11:29.853 00:06:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:30.112 [2024-07-16 00:06:16.882136] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:30.112 [2024-07-16 00:06:16.883478] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:30.112 [2024-07-16 00:06:16.883673] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a3b320 00:11:30.112 [2024-07-16 00:06:16.883686] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:30.112 [2024-07-16 00:06:16.883880] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a3a270 00:11:30.112 [2024-07-16 00:06:16.884032] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a3b320 00:11:30.112 [2024-07-16 00:06:16.884043] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a3b320 00:11:30.112 [2024-07-16 00:06:16.884151] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:30.112 00:06:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:30.112 00:06:16 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:30.112 00:06:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:30.112 00:06:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:30.112 00:06:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:30.112 00:06:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:30.112 00:06:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:30.112 00:06:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:30.112 00:06:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:30.112 00:06:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:30.112 00:06:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:30.112 00:06:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:30.370 00:06:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:30.370 "name": "raid_bdev1", 00:11:30.370 "uuid": "79af4094-cf11-4aa7-b2b5-8bb259a6717c", 00:11:30.370 "strip_size_kb": 64, 00:11:30.370 "state": "online", 00:11:30.370 "raid_level": "raid0", 00:11:30.370 "superblock": true, 00:11:30.370 "num_base_bdevs": 2, 00:11:30.370 "num_base_bdevs_discovered": 2, 00:11:30.370 "num_base_bdevs_operational": 2, 00:11:30.370 "base_bdevs_list": [ 00:11:30.370 { 00:11:30.370 "name": "BaseBdev1", 00:11:30.370 "uuid": "1464d847-1274-566a-814e-8236c8ab2342", 00:11:30.370 "is_configured": true, 00:11:30.370 "data_offset": 2048, 00:11:30.370 "data_size": 63488 00:11:30.370 
}, 00:11:30.370 { 00:11:30.370 "name": "BaseBdev2", 00:11:30.370 "uuid": "2df38f78-a966-5dd0-99a9-353696a05b56", 00:11:30.370 "is_configured": true, 00:11:30.370 "data_offset": 2048, 00:11:30.370 "data_size": 63488 00:11:30.370 } 00:11:30.370 ] 00:11:30.370 }' 00:11:30.370 00:06:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:30.370 00:06:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:30.938 00:06:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:30.938 00:06:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:30.938 [2024-07-16 00:06:17.853177] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a369b0 00:11:31.875 00:06:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:32.134 00:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:32.134 00:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:32.134 00:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:32.134 00:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:32.134 00:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:32.134 00:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:32.134 00:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:32.134 00:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=64 00:11:32.134 00:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:32.134 00:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:32.134 00:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:32.134 00:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:32.134 00:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:32.134 00:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:32.134 00:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:32.393 00:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:32.393 "name": "raid_bdev1", 00:11:32.393 "uuid": "79af4094-cf11-4aa7-b2b5-8bb259a6717c", 00:11:32.393 "strip_size_kb": 64, 00:11:32.393 "state": "online", 00:11:32.393 "raid_level": "raid0", 00:11:32.393 "superblock": true, 00:11:32.393 "num_base_bdevs": 2, 00:11:32.393 "num_base_bdevs_discovered": 2, 00:11:32.393 "num_base_bdevs_operational": 2, 00:11:32.393 "base_bdevs_list": [ 00:11:32.393 { 00:11:32.393 "name": "BaseBdev1", 00:11:32.393 "uuid": "1464d847-1274-566a-814e-8236c8ab2342", 00:11:32.393 "is_configured": true, 00:11:32.393 "data_offset": 2048, 00:11:32.393 "data_size": 63488 00:11:32.393 }, 00:11:32.393 { 00:11:32.393 "name": "BaseBdev2", 00:11:32.393 "uuid": "2df38f78-a966-5dd0-99a9-353696a05b56", 00:11:32.393 "is_configured": true, 00:11:32.393 "data_offset": 2048, 00:11:32.393 "data_size": 63488 00:11:32.393 } 00:11:32.393 ] 00:11:32.393 }' 00:11:32.393 00:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:32.393 00:06:19 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:32.960 00:06:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:33.218 [2024-07-16 00:06:20.081497] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:33.218 [2024-07-16 00:06:20.081531] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:33.218 [2024-07-16 00:06:20.084694] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:33.218 [2024-07-16 00:06:20.084725] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:33.218 [2024-07-16 00:06:20.084752] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:33.218 [2024-07-16 00:06:20.084764] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a3b320 name raid_bdev1, state offline 00:11:33.218 0 00:11:33.218 00:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3493696 00:11:33.218 00:06:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 3493696 ']' 00:11:33.218 00:06:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 3493696 00:11:33.218 00:06:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:11:33.218 00:06:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:33.218 00:06:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3493696 00:11:33.218 00:06:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:33.218 00:06:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:33.218 00:06:20 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3493696' 00:11:33.218 killing process with pid 3493696 00:11:33.218 00:06:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 3493696 00:11:33.218 [2024-07-16 00:06:20.167386] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:33.218 00:06:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 3493696 00:11:33.477 [2024-07-16 00:06:20.178133] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:33.477 00:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:33.477 00:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.t1EhCPPCSz 00:11:33.477 00:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:33.477 00:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:11:33.477 00:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:33.477 00:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:33.477 00:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:33.477 00:06:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:11:33.477 00:11:33.477 real 0m6.232s 00:11:33.477 user 0m9.743s 00:11:33.477 sys 0m1.093s 00:11:33.477 00:06:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:33.477 00:06:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:33.477 ************************************ 00:11:33.477 END TEST raid_write_error_test 00:11:33.477 ************************************ 00:11:33.737 00:06:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:33.737 00:06:20 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in 
raid0 concat raid1 00:11:33.737 00:06:20 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:11:33.737 00:06:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:33.737 00:06:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:33.737 00:06:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:33.737 ************************************ 00:11:33.737 START TEST raid_state_function_test 00:11:33.737 ************************************ 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:33.737 00:06:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=3494657 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3494657' 00:11:33.737 Process raid pid: 3494657 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 3494657 /var/tmp/spdk-raid.sock 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 3494657 ']' 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:33.737 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:33.737 00:06:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:33.737 [2024-07-16 00:06:20.572985] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:11:33.737 [2024-07-16 00:06:20.573055] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:33.996 [2024-07-16 00:06:20.703000] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:33.996 [2024-07-16 00:06:20.807271] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:33.996 [2024-07-16 00:06:20.863304] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:33.996 [2024-07-16 00:06:20.863338] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:34.255 00:06:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:34.255 00:06:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:11:34.255 00:06:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n 
Existed_Raid 00:11:34.515 [2024-07-16 00:06:21.257759] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:34.515 [2024-07-16 00:06:21.257800] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:34.515 [2024-07-16 00:06:21.257811] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:34.515 [2024-07-16 00:06:21.257822] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:34.515 00:06:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:34.515 00:06:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:34.515 00:06:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:34.515 00:06:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:34.515 00:06:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:34.515 00:06:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:34.515 00:06:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:34.515 00:06:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:34.515 00:06:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:34.515 00:06:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:34.515 00:06:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:34.515 00:06:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:11:34.774 00:06:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:34.774 "name": "Existed_Raid", 00:11:34.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:34.774 "strip_size_kb": 64, 00:11:34.774 "state": "configuring", 00:11:34.774 "raid_level": "concat", 00:11:34.774 "superblock": false, 00:11:34.774 "num_base_bdevs": 2, 00:11:34.774 "num_base_bdevs_discovered": 0, 00:11:34.774 "num_base_bdevs_operational": 2, 00:11:34.774 "base_bdevs_list": [ 00:11:34.774 { 00:11:34.774 "name": "BaseBdev1", 00:11:34.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:34.774 "is_configured": false, 00:11:34.774 "data_offset": 0, 00:11:34.774 "data_size": 0 00:11:34.774 }, 00:11:34.774 { 00:11:34.774 "name": "BaseBdev2", 00:11:34.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:34.774 "is_configured": false, 00:11:34.774 "data_offset": 0, 00:11:34.774 "data_size": 0 00:11:34.774 } 00:11:34.774 ] 00:11:34.774 }' 00:11:34.774 00:06:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:34.774 00:06:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:35.341 00:06:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:35.600 [2024-07-16 00:06:22.340473] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:35.600 [2024-07-16 00:06:22.340500] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14efa80 name Existed_Raid, state configuring 00:11:35.600 00:06:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:35.859 [2024-07-16 00:06:22.589146] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:35.859 [2024-07-16 00:06:22.589182] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:35.859 [2024-07-16 00:06:22.589192] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:35.859 [2024-07-16 00:06:22.589203] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:35.859 00:06:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:36.118 [2024-07-16 00:06:22.847699] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:36.118 BaseBdev1 00:11:36.118 00:06:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:36.118 00:06:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:36.118 00:06:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:36.118 00:06:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:36.118 00:06:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:36.118 00:06:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:36.118 00:06:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:36.377 00:06:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:36.636 [ 00:11:36.636 { 00:11:36.636 "name": 
"BaseBdev1", 00:11:36.636 "aliases": [ 00:11:36.636 "06202aab-37b6-4563-af35-f4135efaf9dd" 00:11:36.636 ], 00:11:36.636 "product_name": "Malloc disk", 00:11:36.636 "block_size": 512, 00:11:36.636 "num_blocks": 65536, 00:11:36.636 "uuid": "06202aab-37b6-4563-af35-f4135efaf9dd", 00:11:36.636 "assigned_rate_limits": { 00:11:36.636 "rw_ios_per_sec": 0, 00:11:36.636 "rw_mbytes_per_sec": 0, 00:11:36.636 "r_mbytes_per_sec": 0, 00:11:36.636 "w_mbytes_per_sec": 0 00:11:36.636 }, 00:11:36.636 "claimed": true, 00:11:36.636 "claim_type": "exclusive_write", 00:11:36.636 "zoned": false, 00:11:36.636 "supported_io_types": { 00:11:36.636 "read": true, 00:11:36.636 "write": true, 00:11:36.636 "unmap": true, 00:11:36.636 "flush": true, 00:11:36.636 "reset": true, 00:11:36.636 "nvme_admin": false, 00:11:36.636 "nvme_io": false, 00:11:36.636 "nvme_io_md": false, 00:11:36.636 "write_zeroes": true, 00:11:36.636 "zcopy": true, 00:11:36.636 "get_zone_info": false, 00:11:36.636 "zone_management": false, 00:11:36.636 "zone_append": false, 00:11:36.636 "compare": false, 00:11:36.636 "compare_and_write": false, 00:11:36.636 "abort": true, 00:11:36.636 "seek_hole": false, 00:11:36.636 "seek_data": false, 00:11:36.636 "copy": true, 00:11:36.636 "nvme_iov_md": false 00:11:36.636 }, 00:11:36.636 "memory_domains": [ 00:11:36.636 { 00:11:36.636 "dma_device_id": "system", 00:11:36.636 "dma_device_type": 1 00:11:36.636 }, 00:11:36.636 { 00:11:36.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:36.636 "dma_device_type": 2 00:11:36.636 } 00:11:36.636 ], 00:11:36.636 "driver_specific": {} 00:11:36.636 } 00:11:36.636 ] 00:11:36.636 00:06:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:36.636 00:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:36.636 00:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:36.636 
00:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:36.636 00:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:36.636 00:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:36.636 00:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:36.636 00:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:36.636 00:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:36.636 00:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:36.636 00:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:36.636 00:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:36.636 00:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:36.896 00:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:36.896 "name": "Existed_Raid", 00:11:36.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:36.896 "strip_size_kb": 64, 00:11:36.896 "state": "configuring", 00:11:36.896 "raid_level": "concat", 00:11:36.896 "superblock": false, 00:11:36.896 "num_base_bdevs": 2, 00:11:36.896 "num_base_bdevs_discovered": 1, 00:11:36.896 "num_base_bdevs_operational": 2, 00:11:36.896 "base_bdevs_list": [ 00:11:36.896 { 00:11:36.896 "name": "BaseBdev1", 00:11:36.896 "uuid": "06202aab-37b6-4563-af35-f4135efaf9dd", 00:11:36.896 "is_configured": true, 00:11:36.896 "data_offset": 0, 00:11:36.896 "data_size": 65536 00:11:36.896 }, 00:11:36.896 { 00:11:36.896 "name": "BaseBdev2", 
00:11:36.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:36.896 "is_configured": false, 00:11:36.896 "data_offset": 0, 00:11:36.896 "data_size": 0 00:11:36.896 } 00:11:36.896 ] 00:11:36.896 }' 00:11:36.896 00:06:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:36.896 00:06:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:37.463 00:06:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:37.463 [2024-07-16 00:06:24.379784] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:37.463 [2024-07-16 00:06:24.379821] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14ef350 name Existed_Raid, state configuring 00:11:37.463 00:06:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:37.722 [2024-07-16 00:06:24.556359] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:37.722 [2024-07-16 00:06:24.557840] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:37.722 [2024-07-16 00:06:24.557872] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:37.722 00:06:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:37.722 00:06:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:37.722 00:06:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:37.722 00:06:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:11:37.722 00:06:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:37.722 00:06:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:37.722 00:06:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:37.722 00:06:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:37.722 00:06:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:37.722 00:06:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:37.722 00:06:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:37.722 00:06:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:37.722 00:06:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.722 00:06:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:37.981 00:06:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:37.981 "name": "Existed_Raid", 00:11:37.981 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:37.981 "strip_size_kb": 64, 00:11:37.981 "state": "configuring", 00:11:37.981 "raid_level": "concat", 00:11:37.981 "superblock": false, 00:11:37.981 "num_base_bdevs": 2, 00:11:37.981 "num_base_bdevs_discovered": 1, 00:11:37.981 "num_base_bdevs_operational": 2, 00:11:37.981 "base_bdevs_list": [ 00:11:37.981 { 00:11:37.981 "name": "BaseBdev1", 00:11:37.981 "uuid": "06202aab-37b6-4563-af35-f4135efaf9dd", 00:11:37.981 "is_configured": true, 00:11:37.981 "data_offset": 0, 00:11:37.981 "data_size": 65536 00:11:37.981 }, 00:11:37.981 { 00:11:37.981 "name": 
"BaseBdev2", 00:11:37.981 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:37.981 "is_configured": false, 00:11:37.981 "data_offset": 0, 00:11:37.981 "data_size": 0 00:11:37.981 } 00:11:37.981 ] 00:11:37.981 }' 00:11:37.981 00:06:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:37.981 00:06:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:38.548 00:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:38.548 [2024-07-16 00:06:25.494221] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:38.548 [2024-07-16 00:06:25.494254] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14f0000 00:11:38.548 [2024-07-16 00:06:25.494262] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:38.548 [2024-07-16 00:06:25.494448] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x140a0c0 00:11:38.548 [2024-07-16 00:06:25.494572] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14f0000 00:11:38.548 [2024-07-16 00:06:25.494582] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x14f0000 00:11:38.548 [2024-07-16 00:06:25.494738] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:38.548 BaseBdev2 00:11:38.807 00:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:38.807 00:06:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:38.807 00:06:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:38.807 00:06:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 
00:11:38.807 00:06:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:38.807 00:06:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:38.807 00:06:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:38.807 00:06:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:39.066 [ 00:11:39.066 { 00:11:39.066 "name": "BaseBdev2", 00:11:39.066 "aliases": [ 00:11:39.066 "1838743b-e4ea-452f-9752-82f29fd4ce60" 00:11:39.066 ], 00:11:39.066 "product_name": "Malloc disk", 00:11:39.066 "block_size": 512, 00:11:39.066 "num_blocks": 65536, 00:11:39.066 "uuid": "1838743b-e4ea-452f-9752-82f29fd4ce60", 00:11:39.066 "assigned_rate_limits": { 00:11:39.066 "rw_ios_per_sec": 0, 00:11:39.066 "rw_mbytes_per_sec": 0, 00:11:39.066 "r_mbytes_per_sec": 0, 00:11:39.066 "w_mbytes_per_sec": 0 00:11:39.066 }, 00:11:39.066 "claimed": true, 00:11:39.066 "claim_type": "exclusive_write", 00:11:39.066 "zoned": false, 00:11:39.066 "supported_io_types": { 00:11:39.066 "read": true, 00:11:39.066 "write": true, 00:11:39.066 "unmap": true, 00:11:39.066 "flush": true, 00:11:39.066 "reset": true, 00:11:39.066 "nvme_admin": false, 00:11:39.066 "nvme_io": false, 00:11:39.066 "nvme_io_md": false, 00:11:39.066 "write_zeroes": true, 00:11:39.066 "zcopy": true, 00:11:39.066 "get_zone_info": false, 00:11:39.066 "zone_management": false, 00:11:39.066 "zone_append": false, 00:11:39.066 "compare": false, 00:11:39.066 "compare_and_write": false, 00:11:39.066 "abort": true, 00:11:39.066 "seek_hole": false, 00:11:39.066 "seek_data": false, 00:11:39.066 "copy": true, 00:11:39.066 "nvme_iov_md": false 00:11:39.066 }, 00:11:39.066 
"memory_domains": [ 00:11:39.066 { 00:11:39.066 "dma_device_id": "system", 00:11:39.066 "dma_device_type": 1 00:11:39.066 }, 00:11:39.066 { 00:11:39.066 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:39.066 "dma_device_type": 2 00:11:39.066 } 00:11:39.066 ], 00:11:39.066 "driver_specific": {} 00:11:39.066 } 00:11:39.066 ] 00:11:39.066 00:06:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:39.066 00:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:39.066 00:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:39.066 00:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:11:39.066 00:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:39.066 00:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:39.066 00:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:39.066 00:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:39.066 00:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:39.066 00:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:39.066 00:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:39.066 00:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:39.066 00:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:39.066 00:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:11:39.066 00:06:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:39.325 00:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:39.325 "name": "Existed_Raid", 00:11:39.325 "uuid": "c8b52286-37b9-4642-bf6e-124342a58680", 00:11:39.325 "strip_size_kb": 64, 00:11:39.325 "state": "online", 00:11:39.325 "raid_level": "concat", 00:11:39.325 "superblock": false, 00:11:39.325 "num_base_bdevs": 2, 00:11:39.325 "num_base_bdevs_discovered": 2, 00:11:39.325 "num_base_bdevs_operational": 2, 00:11:39.325 "base_bdevs_list": [ 00:11:39.325 { 00:11:39.325 "name": "BaseBdev1", 00:11:39.325 "uuid": "06202aab-37b6-4563-af35-f4135efaf9dd", 00:11:39.325 "is_configured": true, 00:11:39.325 "data_offset": 0, 00:11:39.325 "data_size": 65536 00:11:39.325 }, 00:11:39.325 { 00:11:39.325 "name": "BaseBdev2", 00:11:39.325 "uuid": "1838743b-e4ea-452f-9752-82f29fd4ce60", 00:11:39.325 "is_configured": true, 00:11:39.325 "data_offset": 0, 00:11:39.325 "data_size": 65536 00:11:39.325 } 00:11:39.325 ] 00:11:39.325 }' 00:11:39.325 00:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:39.325 00:06:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:39.892 00:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:39.892 00:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:39.892 00:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:39.892 00:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:39.892 00:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:39.892 00:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:39.892 
00:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:39.892 00:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:40.150 [2024-07-16 00:06:26.886186] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:40.150 00:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:40.150 "name": "Existed_Raid", 00:11:40.151 "aliases": [ 00:11:40.151 "c8b52286-37b9-4642-bf6e-124342a58680" 00:11:40.151 ], 00:11:40.151 "product_name": "Raid Volume", 00:11:40.151 "block_size": 512, 00:11:40.151 "num_blocks": 131072, 00:11:40.151 "uuid": "c8b52286-37b9-4642-bf6e-124342a58680", 00:11:40.151 "assigned_rate_limits": { 00:11:40.151 "rw_ios_per_sec": 0, 00:11:40.151 "rw_mbytes_per_sec": 0, 00:11:40.151 "r_mbytes_per_sec": 0, 00:11:40.151 "w_mbytes_per_sec": 0 00:11:40.151 }, 00:11:40.151 "claimed": false, 00:11:40.151 "zoned": false, 00:11:40.151 "supported_io_types": { 00:11:40.151 "read": true, 00:11:40.151 "write": true, 00:11:40.151 "unmap": true, 00:11:40.151 "flush": true, 00:11:40.151 "reset": true, 00:11:40.151 "nvme_admin": false, 00:11:40.151 "nvme_io": false, 00:11:40.151 "nvme_io_md": false, 00:11:40.151 "write_zeroes": true, 00:11:40.151 "zcopy": false, 00:11:40.151 "get_zone_info": false, 00:11:40.151 "zone_management": false, 00:11:40.151 "zone_append": false, 00:11:40.151 "compare": false, 00:11:40.151 "compare_and_write": false, 00:11:40.151 "abort": false, 00:11:40.151 "seek_hole": false, 00:11:40.151 "seek_data": false, 00:11:40.151 "copy": false, 00:11:40.151 "nvme_iov_md": false 00:11:40.151 }, 00:11:40.151 "memory_domains": [ 00:11:40.151 { 00:11:40.151 "dma_device_id": "system", 00:11:40.151 "dma_device_type": 1 00:11:40.151 }, 00:11:40.151 { 00:11:40.151 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:40.151 
"dma_device_type": 2 00:11:40.151 }, 00:11:40.151 { 00:11:40.151 "dma_device_id": "system", 00:11:40.151 "dma_device_type": 1 00:11:40.151 }, 00:11:40.151 { 00:11:40.151 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:40.151 "dma_device_type": 2 00:11:40.151 } 00:11:40.151 ], 00:11:40.151 "driver_specific": { 00:11:40.151 "raid": { 00:11:40.151 "uuid": "c8b52286-37b9-4642-bf6e-124342a58680", 00:11:40.151 "strip_size_kb": 64, 00:11:40.151 "state": "online", 00:11:40.151 "raid_level": "concat", 00:11:40.151 "superblock": false, 00:11:40.151 "num_base_bdevs": 2, 00:11:40.151 "num_base_bdevs_discovered": 2, 00:11:40.151 "num_base_bdevs_operational": 2, 00:11:40.151 "base_bdevs_list": [ 00:11:40.151 { 00:11:40.151 "name": "BaseBdev1", 00:11:40.151 "uuid": "06202aab-37b6-4563-af35-f4135efaf9dd", 00:11:40.151 "is_configured": true, 00:11:40.151 "data_offset": 0, 00:11:40.151 "data_size": 65536 00:11:40.151 }, 00:11:40.151 { 00:11:40.151 "name": "BaseBdev2", 00:11:40.151 "uuid": "1838743b-e4ea-452f-9752-82f29fd4ce60", 00:11:40.151 "is_configured": true, 00:11:40.151 "data_offset": 0, 00:11:40.151 "data_size": 65536 00:11:40.151 } 00:11:40.151 ] 00:11:40.151 } 00:11:40.151 } 00:11:40.151 }' 00:11:40.151 00:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:40.151 00:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:40.151 BaseBdev2' 00:11:40.151 00:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:40.151 00:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:40.151 00:06:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:40.754 00:06:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:40.754 "name": "BaseBdev1", 00:11:40.754 "aliases": [ 00:11:40.754 "06202aab-37b6-4563-af35-f4135efaf9dd" 00:11:40.754 ], 00:11:40.754 "product_name": "Malloc disk", 00:11:40.754 "block_size": 512, 00:11:40.754 "num_blocks": 65536, 00:11:40.754 "uuid": "06202aab-37b6-4563-af35-f4135efaf9dd", 00:11:40.754 "assigned_rate_limits": { 00:11:40.754 "rw_ios_per_sec": 0, 00:11:40.754 "rw_mbytes_per_sec": 0, 00:11:40.754 "r_mbytes_per_sec": 0, 00:11:40.754 "w_mbytes_per_sec": 0 00:11:40.754 }, 00:11:40.754 "claimed": true, 00:11:40.754 "claim_type": "exclusive_write", 00:11:40.754 "zoned": false, 00:11:40.754 "supported_io_types": { 00:11:40.754 "read": true, 00:11:40.754 "write": true, 00:11:40.754 "unmap": true, 00:11:40.754 "flush": true, 00:11:40.754 "reset": true, 00:11:40.754 "nvme_admin": false, 00:11:40.754 "nvme_io": false, 00:11:40.754 "nvme_io_md": false, 00:11:40.754 "write_zeroes": true, 00:11:40.754 "zcopy": true, 00:11:40.754 "get_zone_info": false, 00:11:40.754 "zone_management": false, 00:11:40.754 "zone_append": false, 00:11:40.754 "compare": false, 00:11:40.754 "compare_and_write": false, 00:11:40.754 "abort": true, 00:11:40.754 "seek_hole": false, 00:11:40.754 "seek_data": false, 00:11:40.754 "copy": true, 00:11:40.754 "nvme_iov_md": false 00:11:40.754 }, 00:11:40.754 "memory_domains": [ 00:11:40.754 { 00:11:40.754 "dma_device_id": "system", 00:11:40.754 "dma_device_type": 1 00:11:40.754 }, 00:11:40.754 { 00:11:40.754 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:40.754 "dma_device_type": 2 00:11:40.754 } 00:11:40.754 ], 00:11:40.754 "driver_specific": {} 00:11:40.754 }' 00:11:40.754 00:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:40.754 00:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:40.754 00:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:40.754 00:06:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:40.754 00:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:40.754 00:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:40.754 00:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:40.754 00:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:41.013 00:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:41.013 00:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:41.013 00:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:41.013 00:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:41.013 00:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:41.013 00:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:41.013 00:06:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:41.272 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:41.272 "name": "BaseBdev2", 00:11:41.272 "aliases": [ 00:11:41.272 "1838743b-e4ea-452f-9752-82f29fd4ce60" 00:11:41.272 ], 00:11:41.272 "product_name": "Malloc disk", 00:11:41.272 "block_size": 512, 00:11:41.272 "num_blocks": 65536, 00:11:41.272 "uuid": "1838743b-e4ea-452f-9752-82f29fd4ce60", 00:11:41.272 "assigned_rate_limits": { 00:11:41.272 "rw_ios_per_sec": 0, 00:11:41.272 "rw_mbytes_per_sec": 0, 00:11:41.272 "r_mbytes_per_sec": 0, 00:11:41.272 "w_mbytes_per_sec": 0 00:11:41.272 }, 00:11:41.272 "claimed": true, 00:11:41.272 "claim_type": "exclusive_write", 
00:11:41.272 "zoned": false, 00:11:41.272 "supported_io_types": { 00:11:41.272 "read": true, 00:11:41.272 "write": true, 00:11:41.272 "unmap": true, 00:11:41.272 "flush": true, 00:11:41.272 "reset": true, 00:11:41.272 "nvme_admin": false, 00:11:41.272 "nvme_io": false, 00:11:41.272 "nvme_io_md": false, 00:11:41.272 "write_zeroes": true, 00:11:41.272 "zcopy": true, 00:11:41.272 "get_zone_info": false, 00:11:41.272 "zone_management": false, 00:11:41.272 "zone_append": false, 00:11:41.272 "compare": false, 00:11:41.272 "compare_and_write": false, 00:11:41.272 "abort": true, 00:11:41.272 "seek_hole": false, 00:11:41.272 "seek_data": false, 00:11:41.272 "copy": true, 00:11:41.272 "nvme_iov_md": false 00:11:41.272 }, 00:11:41.272 "memory_domains": [ 00:11:41.272 { 00:11:41.272 "dma_device_id": "system", 00:11:41.272 "dma_device_type": 1 00:11:41.272 }, 00:11:41.272 { 00:11:41.272 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:41.272 "dma_device_type": 2 00:11:41.272 } 00:11:41.272 ], 00:11:41.272 "driver_specific": {} 00:11:41.272 }' 00:11:41.272 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:41.272 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:41.272 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:41.272 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:41.272 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:41.530 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:41.530 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:41.530 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:41.530 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:41.530 00:06:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:41.530 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:41.530 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:41.530 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:41.788 [2024-07-16 00:06:28.642612] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:41.788 [2024-07-16 00:06:28.642637] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:41.788 [2024-07-16 00:06:28.642676] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:41.788 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:41.788 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:11:41.788 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:41.788 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:41.788 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:41.788 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:41.788 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:41.788 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:41.788 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:41.788 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:41.788 00:06:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:41.788 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:41.788 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:41.788 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:41.788 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:41.788 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:41.788 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:42.046 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:42.046 "name": "Existed_Raid", 00:11:42.046 "uuid": "c8b52286-37b9-4642-bf6e-124342a58680", 00:11:42.046 "strip_size_kb": 64, 00:11:42.046 "state": "offline", 00:11:42.046 "raid_level": "concat", 00:11:42.046 "superblock": false, 00:11:42.046 "num_base_bdevs": 2, 00:11:42.046 "num_base_bdevs_discovered": 1, 00:11:42.046 "num_base_bdevs_operational": 1, 00:11:42.046 "base_bdevs_list": [ 00:11:42.046 { 00:11:42.046 "name": null, 00:11:42.046 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:42.046 "is_configured": false, 00:11:42.046 "data_offset": 0, 00:11:42.046 "data_size": 65536 00:11:42.046 }, 00:11:42.046 { 00:11:42.046 "name": "BaseBdev2", 00:11:42.046 "uuid": "1838743b-e4ea-452f-9752-82f29fd4ce60", 00:11:42.046 "is_configured": true, 00:11:42.046 "data_offset": 0, 00:11:42.046 "data_size": 65536 00:11:42.046 } 00:11:42.047 ] 00:11:42.047 }' 00:11:42.047 00:06:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:42.047 00:06:28 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@10 -- # set +x 00:11:42.619 00:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:42.619 00:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:42.619 00:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:42.619 00:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:42.878 00:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:42.878 00:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:42.878 00:06:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:43.136 [2024-07-16 00:06:29.991168] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:43.136 [2024-07-16 00:06:29.991216] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14f0000 name Existed_Raid, state offline 00:11:43.136 00:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:43.136 00:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:43.136 00:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:43.136 00:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:43.395 00:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:43.395 00:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 
00:11:43.395 00:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:43.395 00:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 3494657 00:11:43.395 00:06:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 3494657 ']' 00:11:43.395 00:06:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 3494657 00:11:43.395 00:06:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:11:43.395 00:06:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:43.395 00:06:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3494657 00:11:43.395 00:06:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:43.395 00:06:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:43.395 00:06:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3494657' 00:11:43.395 killing process with pid 3494657 00:11:43.395 00:06:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 3494657 00:11:43.395 [2024-07-16 00:06:30.329114] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:43.395 00:06:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 3494657 00:11:43.395 [2024-07-16 00:06:30.330031] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:43.654 00:06:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:43.654 00:11:43.654 real 0m10.048s 00:11:43.654 user 0m18.226s 00:11:43.654 sys 0m1.961s 00:11:43.654 00:06:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:43.654 00:06:30 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:11:43.654 ************************************ 00:11:43.654 END TEST raid_state_function_test 00:11:43.654 ************************************ 00:11:43.655 00:06:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:43.655 00:06:30 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:11:43.655 00:06:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:43.655 00:06:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:43.655 00:06:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:43.914 ************************************ 00:11:43.914 START TEST raid_state_function_test_sb 00:11:43.914 ************************************ 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:43.914 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:43.915 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:43.915 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:43.915 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:43.915 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=3496130 00:11:43.915 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3496130' 00:11:43.915 Process raid pid: 3496130 00:11:43.915 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L 
bdev_raid 00:11:43.915 00:06:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 3496130 /var/tmp/spdk-raid.sock 00:11:43.915 00:06:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3496130 ']' 00:11:43.915 00:06:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:43.915 00:06:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:43.915 00:06:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:43.915 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:43.915 00:06:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:43.915 00:06:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:43.915 [2024-07-16 00:06:30.715052] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:11:43.915 [2024-07-16 00:06:30.715124] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:43.915 [2024-07-16 00:06:30.839949] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:44.174 [2024-07-16 00:06:30.944513] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:44.174 [2024-07-16 00:06:30.997695] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:44.174 [2024-07-16 00:06:30.997720] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:44.743 00:06:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:44.743 00:06:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:11:44.743 00:06:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:45.002 [2024-07-16 00:06:31.860987] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:45.002 [2024-07-16 00:06:31.861030] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:45.002 [2024-07-16 00:06:31.861041] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:45.002 [2024-07-16 00:06:31.861053] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:45.002 00:06:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:45.002 00:06:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:11:45.002 00:06:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:45.002 00:06:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:45.002 00:06:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:45.002 00:06:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:45.002 00:06:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:45.002 00:06:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:45.002 00:06:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:45.002 00:06:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:45.002 00:06:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:45.002 00:06:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:45.273 00:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:45.273 "name": "Existed_Raid", 00:11:45.273 "uuid": "e1f67171-55c7-4e47-b3df-130cc4fd3c2f", 00:11:45.273 "strip_size_kb": 64, 00:11:45.273 "state": "configuring", 00:11:45.273 "raid_level": "concat", 00:11:45.273 "superblock": true, 00:11:45.273 "num_base_bdevs": 2, 00:11:45.273 "num_base_bdevs_discovered": 0, 00:11:45.273 "num_base_bdevs_operational": 2, 00:11:45.273 "base_bdevs_list": [ 00:11:45.273 { 00:11:45.273 "name": "BaseBdev1", 00:11:45.273 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:45.273 "is_configured": false, 00:11:45.273 "data_offset": 0, 00:11:45.273 "data_size": 0 00:11:45.273 }, 00:11:45.273 { 
00:11:45.273 "name": "BaseBdev2", 00:11:45.273 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:45.273 "is_configured": false, 00:11:45.273 "data_offset": 0, 00:11:45.273 "data_size": 0 00:11:45.273 } 00:11:45.273 ] 00:11:45.273 }' 00:11:45.273 00:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:45.273 00:06:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:45.841 00:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:46.101 [2024-07-16 00:06:32.931668] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:46.101 [2024-07-16 00:06:32.931699] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18e0a80 name Existed_Raid, state configuring 00:11:46.101 00:06:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:46.360 [2024-07-16 00:06:33.180348] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:46.360 [2024-07-16 00:06:33.180373] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:46.360 [2024-07-16 00:06:33.180383] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:46.360 [2024-07-16 00:06:33.180394] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:46.360 00:06:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:46.618 [2024-07-16 00:06:33.434800] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:46.618 BaseBdev1 00:11:46.618 00:06:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:46.618 00:06:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:46.618 00:06:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:46.618 00:06:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:46.618 00:06:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:46.618 00:06:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:46.618 00:06:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:46.876 00:06:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:47.444 [ 00:11:47.444 { 00:11:47.444 "name": "BaseBdev1", 00:11:47.444 "aliases": [ 00:11:47.444 "deab9383-72e2-4347-8855-7d3ae4f29aca" 00:11:47.444 ], 00:11:47.444 "product_name": "Malloc disk", 00:11:47.444 "block_size": 512, 00:11:47.444 "num_blocks": 65536, 00:11:47.444 "uuid": "deab9383-72e2-4347-8855-7d3ae4f29aca", 00:11:47.444 "assigned_rate_limits": { 00:11:47.444 "rw_ios_per_sec": 0, 00:11:47.444 "rw_mbytes_per_sec": 0, 00:11:47.444 "r_mbytes_per_sec": 0, 00:11:47.444 "w_mbytes_per_sec": 0 00:11:47.444 }, 00:11:47.444 "claimed": true, 00:11:47.444 "claim_type": "exclusive_write", 00:11:47.444 "zoned": false, 00:11:47.444 "supported_io_types": { 00:11:47.444 "read": true, 00:11:47.444 "write": true, 00:11:47.444 "unmap": true, 00:11:47.444 "flush": 
true, 00:11:47.444 "reset": true, 00:11:47.444 "nvme_admin": false, 00:11:47.444 "nvme_io": false, 00:11:47.444 "nvme_io_md": false, 00:11:47.444 "write_zeroes": true, 00:11:47.444 "zcopy": true, 00:11:47.444 "get_zone_info": false, 00:11:47.444 "zone_management": false, 00:11:47.444 "zone_append": false, 00:11:47.444 "compare": false, 00:11:47.444 "compare_and_write": false, 00:11:47.444 "abort": true, 00:11:47.444 "seek_hole": false, 00:11:47.444 "seek_data": false, 00:11:47.444 "copy": true, 00:11:47.444 "nvme_iov_md": false 00:11:47.444 }, 00:11:47.444 "memory_domains": [ 00:11:47.444 { 00:11:47.444 "dma_device_id": "system", 00:11:47.444 "dma_device_type": 1 00:11:47.444 }, 00:11:47.444 { 00:11:47.444 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:47.444 "dma_device_type": 2 00:11:47.444 } 00:11:47.444 ], 00:11:47.444 "driver_specific": {} 00:11:47.444 } 00:11:47.444 ] 00:11:47.444 00:06:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:47.444 00:06:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:47.444 00:06:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:47.444 00:06:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:47.444 00:06:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:47.444 00:06:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:47.444 00:06:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:47.444 00:06:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:47.444 00:06:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:47.444 00:06:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:47.444 00:06:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:47.444 00:06:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:47.444 00:06:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:47.704 00:06:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:47.704 "name": "Existed_Raid", 00:11:47.704 "uuid": "bb78d983-470e-4438-a156-07e0a55c5743", 00:11:47.704 "strip_size_kb": 64, 00:11:47.704 "state": "configuring", 00:11:47.704 "raid_level": "concat", 00:11:47.704 "superblock": true, 00:11:47.704 "num_base_bdevs": 2, 00:11:47.704 "num_base_bdevs_discovered": 1, 00:11:47.704 "num_base_bdevs_operational": 2, 00:11:47.704 "base_bdevs_list": [ 00:11:47.704 { 00:11:47.704 "name": "BaseBdev1", 00:11:47.704 "uuid": "deab9383-72e2-4347-8855-7d3ae4f29aca", 00:11:47.704 "is_configured": true, 00:11:47.704 "data_offset": 2048, 00:11:47.704 "data_size": 63488 00:11:47.704 }, 00:11:47.704 { 00:11:47.704 "name": "BaseBdev2", 00:11:47.704 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:47.704 "is_configured": false, 00:11:47.704 "data_offset": 0, 00:11:47.704 "data_size": 0 00:11:47.704 } 00:11:47.704 ] 00:11:47.704 }' 00:11:47.704 00:06:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:47.704 00:06:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:48.271 00:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:48.530 [2024-07-16 00:06:35.247637] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:48.530 [2024-07-16 00:06:35.247689] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18e0350 name Existed_Raid, state configuring 00:11:48.530 00:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:48.789 [2024-07-16 00:06:35.500339] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:48.789 [2024-07-16 00:06:35.502073] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:48.789 [2024-07-16 00:06:35.502108] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:48.789 00:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:48.789 00:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:48.789 00:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:48.789 00:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:48.789 00:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:48.789 00:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:48.789 00:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:48.789 00:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:48.789 00:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:48.789 00:06:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:48.789 00:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:48.789 00:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:48.789 00:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:48.790 00:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:49.048 00:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:49.048 "name": "Existed_Raid", 00:11:49.048 "uuid": "e5abce99-3671-4f94-9912-18a159c49e58", 00:11:49.048 "strip_size_kb": 64, 00:11:49.048 "state": "configuring", 00:11:49.048 "raid_level": "concat", 00:11:49.048 "superblock": true, 00:11:49.048 "num_base_bdevs": 2, 00:11:49.048 "num_base_bdevs_discovered": 1, 00:11:49.048 "num_base_bdevs_operational": 2, 00:11:49.048 "base_bdevs_list": [ 00:11:49.048 { 00:11:49.048 "name": "BaseBdev1", 00:11:49.048 "uuid": "deab9383-72e2-4347-8855-7d3ae4f29aca", 00:11:49.048 "is_configured": true, 00:11:49.048 "data_offset": 2048, 00:11:49.048 "data_size": 63488 00:11:49.048 }, 00:11:49.048 { 00:11:49.048 "name": "BaseBdev2", 00:11:49.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:49.048 "is_configured": false, 00:11:49.048 "data_offset": 0, 00:11:49.048 "data_size": 0 00:11:49.048 } 00:11:49.048 ] 00:11:49.049 }' 00:11:49.049 00:06:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:49.049 00:06:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:49.617 00:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:49.876 [2024-07-16 00:06:36.652683] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:49.876 [2024-07-16 00:06:36.652853] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18e1000 00:11:49.876 [2024-07-16 00:06:36.652868] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:49.876 [2024-07-16 00:06:36.653060] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17fb0c0 00:11:49.876 [2024-07-16 00:06:36.653189] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18e1000 00:11:49.876 [2024-07-16 00:06:36.653200] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x18e1000 00:11:49.876 [2024-07-16 00:06:36.653298] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:49.876 BaseBdev2 00:11:49.876 00:06:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:49.876 00:06:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:49.876 00:06:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:49.876 00:06:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:49.876 00:06:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:49.876 00:06:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:49.876 00:06:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:50.134 00:06:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:50.393 [ 00:11:50.393 { 00:11:50.393 "name": "BaseBdev2", 00:11:50.393 "aliases": [ 00:11:50.393 "07544f7b-a928-4522-ae3a-f47f57fa6bde" 00:11:50.393 ], 00:11:50.393 "product_name": "Malloc disk", 00:11:50.393 "block_size": 512, 00:11:50.393 "num_blocks": 65536, 00:11:50.393 "uuid": "07544f7b-a928-4522-ae3a-f47f57fa6bde", 00:11:50.393 "assigned_rate_limits": { 00:11:50.393 "rw_ios_per_sec": 0, 00:11:50.393 "rw_mbytes_per_sec": 0, 00:11:50.393 "r_mbytes_per_sec": 0, 00:11:50.393 "w_mbytes_per_sec": 0 00:11:50.393 }, 00:11:50.393 "claimed": true, 00:11:50.393 "claim_type": "exclusive_write", 00:11:50.393 "zoned": false, 00:11:50.393 "supported_io_types": { 00:11:50.393 "read": true, 00:11:50.393 "write": true, 00:11:50.393 "unmap": true, 00:11:50.393 "flush": true, 00:11:50.393 "reset": true, 00:11:50.393 "nvme_admin": false, 00:11:50.393 "nvme_io": false, 00:11:50.393 "nvme_io_md": false, 00:11:50.393 "write_zeroes": true, 00:11:50.393 "zcopy": true, 00:11:50.393 "get_zone_info": false, 00:11:50.393 "zone_management": false, 00:11:50.393 "zone_append": false, 00:11:50.393 "compare": false, 00:11:50.393 "compare_and_write": false, 00:11:50.393 "abort": true, 00:11:50.393 "seek_hole": false, 00:11:50.393 "seek_data": false, 00:11:50.393 "copy": true, 00:11:50.393 "nvme_iov_md": false 00:11:50.393 }, 00:11:50.393 "memory_domains": [ 00:11:50.393 { 00:11:50.393 "dma_device_id": "system", 00:11:50.393 "dma_device_type": 1 00:11:50.393 }, 00:11:50.393 { 00:11:50.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:50.393 "dma_device_type": 2 00:11:50.393 } 00:11:50.393 ], 00:11:50.393 "driver_specific": {} 00:11:50.393 } 00:11:50.393 ] 00:11:50.393 00:06:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:50.393 00:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:11:50.393 00:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:50.393 00:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:11:50.393 00:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:50.393 00:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:50.393 00:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:50.393 00:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:50.393 00:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:50.393 00:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:50.394 00:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:50.394 00:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:50.394 00:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:50.394 00:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:50.394 00:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:50.653 00:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:50.653 "name": "Existed_Raid", 00:11:50.653 "uuid": "e5abce99-3671-4f94-9912-18a159c49e58", 00:11:50.653 "strip_size_kb": 64, 00:11:50.653 "state": "online", 00:11:50.653 "raid_level": "concat", 00:11:50.653 "superblock": true, 00:11:50.653 
"num_base_bdevs": 2, 00:11:50.653 "num_base_bdevs_discovered": 2, 00:11:50.653 "num_base_bdevs_operational": 2, 00:11:50.653 "base_bdevs_list": [ 00:11:50.653 { 00:11:50.653 "name": "BaseBdev1", 00:11:50.653 "uuid": "deab9383-72e2-4347-8855-7d3ae4f29aca", 00:11:50.653 "is_configured": true, 00:11:50.653 "data_offset": 2048, 00:11:50.653 "data_size": 63488 00:11:50.653 }, 00:11:50.653 { 00:11:50.653 "name": "BaseBdev2", 00:11:50.653 "uuid": "07544f7b-a928-4522-ae3a-f47f57fa6bde", 00:11:50.653 "is_configured": true, 00:11:50.653 "data_offset": 2048, 00:11:50.653 "data_size": 63488 00:11:50.653 } 00:11:50.653 ] 00:11:50.653 }' 00:11:50.653 00:06:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:50.653 00:06:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:51.222 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:51.222 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:51.222 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:51.222 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:51.222 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:51.222 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:51.222 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:51.222 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:51.481 [2024-07-16 00:06:38.273251] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:51.481 00:06:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:51.481 "name": "Existed_Raid", 00:11:51.481 "aliases": [ 00:11:51.481 "e5abce99-3671-4f94-9912-18a159c49e58" 00:11:51.481 ], 00:11:51.481 "product_name": "Raid Volume", 00:11:51.481 "block_size": 512, 00:11:51.481 "num_blocks": 126976, 00:11:51.481 "uuid": "e5abce99-3671-4f94-9912-18a159c49e58", 00:11:51.481 "assigned_rate_limits": { 00:11:51.481 "rw_ios_per_sec": 0, 00:11:51.481 "rw_mbytes_per_sec": 0, 00:11:51.481 "r_mbytes_per_sec": 0, 00:11:51.481 "w_mbytes_per_sec": 0 00:11:51.481 }, 00:11:51.481 "claimed": false, 00:11:51.481 "zoned": false, 00:11:51.481 "supported_io_types": { 00:11:51.481 "read": true, 00:11:51.481 "write": true, 00:11:51.481 "unmap": true, 00:11:51.481 "flush": true, 00:11:51.481 "reset": true, 00:11:51.481 "nvme_admin": false, 00:11:51.481 "nvme_io": false, 00:11:51.481 "nvme_io_md": false, 00:11:51.481 "write_zeroes": true, 00:11:51.481 "zcopy": false, 00:11:51.481 "get_zone_info": false, 00:11:51.481 "zone_management": false, 00:11:51.481 "zone_append": false, 00:11:51.481 "compare": false, 00:11:51.481 "compare_and_write": false, 00:11:51.481 "abort": false, 00:11:51.481 "seek_hole": false, 00:11:51.481 "seek_data": false, 00:11:51.481 "copy": false, 00:11:51.481 "nvme_iov_md": false 00:11:51.481 }, 00:11:51.481 "memory_domains": [ 00:11:51.481 { 00:11:51.481 "dma_device_id": "system", 00:11:51.481 "dma_device_type": 1 00:11:51.481 }, 00:11:51.481 { 00:11:51.481 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.481 "dma_device_type": 2 00:11:51.481 }, 00:11:51.481 { 00:11:51.481 "dma_device_id": "system", 00:11:51.481 "dma_device_type": 1 00:11:51.481 }, 00:11:51.481 { 00:11:51.481 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.481 "dma_device_type": 2 00:11:51.481 } 00:11:51.481 ], 00:11:51.481 "driver_specific": { 00:11:51.481 "raid": { 00:11:51.481 "uuid": "e5abce99-3671-4f94-9912-18a159c49e58", 00:11:51.481 "strip_size_kb": 64, 
00:11:51.481 "state": "online", 00:11:51.481 "raid_level": "concat", 00:11:51.481 "superblock": true, 00:11:51.481 "num_base_bdevs": 2, 00:11:51.481 "num_base_bdevs_discovered": 2, 00:11:51.481 "num_base_bdevs_operational": 2, 00:11:51.481 "base_bdevs_list": [ 00:11:51.481 { 00:11:51.481 "name": "BaseBdev1", 00:11:51.481 "uuid": "deab9383-72e2-4347-8855-7d3ae4f29aca", 00:11:51.481 "is_configured": true, 00:11:51.481 "data_offset": 2048, 00:11:51.481 "data_size": 63488 00:11:51.481 }, 00:11:51.481 { 00:11:51.481 "name": "BaseBdev2", 00:11:51.481 "uuid": "07544f7b-a928-4522-ae3a-f47f57fa6bde", 00:11:51.481 "is_configured": true, 00:11:51.481 "data_offset": 2048, 00:11:51.481 "data_size": 63488 00:11:51.481 } 00:11:51.481 ] 00:11:51.481 } 00:11:51.481 } 00:11:51.481 }' 00:11:51.481 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:51.481 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:51.481 BaseBdev2' 00:11:51.481 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:51.481 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:51.482 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:51.741 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:51.741 "name": "BaseBdev1", 00:11:51.741 "aliases": [ 00:11:51.741 "deab9383-72e2-4347-8855-7d3ae4f29aca" 00:11:51.741 ], 00:11:51.741 "product_name": "Malloc disk", 00:11:51.741 "block_size": 512, 00:11:51.741 "num_blocks": 65536, 00:11:51.741 "uuid": "deab9383-72e2-4347-8855-7d3ae4f29aca", 00:11:51.741 "assigned_rate_limits": { 00:11:51.741 "rw_ios_per_sec": 0, 
00:11:51.741 "rw_mbytes_per_sec": 0, 00:11:51.741 "r_mbytes_per_sec": 0, 00:11:51.741 "w_mbytes_per_sec": 0 00:11:51.741 }, 00:11:51.741 "claimed": true, 00:11:51.741 "claim_type": "exclusive_write", 00:11:51.741 "zoned": false, 00:11:51.741 "supported_io_types": { 00:11:51.741 "read": true, 00:11:51.741 "write": true, 00:11:51.741 "unmap": true, 00:11:51.741 "flush": true, 00:11:51.741 "reset": true, 00:11:51.741 "nvme_admin": false, 00:11:51.741 "nvme_io": false, 00:11:51.741 "nvme_io_md": false, 00:11:51.741 "write_zeroes": true, 00:11:51.741 "zcopy": true, 00:11:51.741 "get_zone_info": false, 00:11:51.741 "zone_management": false, 00:11:51.741 "zone_append": false, 00:11:51.741 "compare": false, 00:11:51.741 "compare_and_write": false, 00:11:51.741 "abort": true, 00:11:51.741 "seek_hole": false, 00:11:51.741 "seek_data": false, 00:11:51.741 "copy": true, 00:11:51.741 "nvme_iov_md": false 00:11:51.741 }, 00:11:51.741 "memory_domains": [ 00:11:51.741 { 00:11:51.741 "dma_device_id": "system", 00:11:51.741 "dma_device_type": 1 00:11:51.741 }, 00:11:51.741 { 00:11:51.741 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.741 "dma_device_type": 2 00:11:51.741 } 00:11:51.741 ], 00:11:51.741 "driver_specific": {} 00:11:51.741 }' 00:11:51.741 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:51.741 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:51.741 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:51.741 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:52.001 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:52.001 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:52.001 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:52.001 
00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:52.001 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:52.001 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:52.001 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:52.261 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:52.261 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:52.261 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:52.261 00:06:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:52.261 00:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:52.261 "name": "BaseBdev2", 00:11:52.261 "aliases": [ 00:11:52.261 "07544f7b-a928-4522-ae3a-f47f57fa6bde" 00:11:52.261 ], 00:11:52.261 "product_name": "Malloc disk", 00:11:52.261 "block_size": 512, 00:11:52.261 "num_blocks": 65536, 00:11:52.261 "uuid": "07544f7b-a928-4522-ae3a-f47f57fa6bde", 00:11:52.261 "assigned_rate_limits": { 00:11:52.261 "rw_ios_per_sec": 0, 00:11:52.261 "rw_mbytes_per_sec": 0, 00:11:52.261 "r_mbytes_per_sec": 0, 00:11:52.261 "w_mbytes_per_sec": 0 00:11:52.261 }, 00:11:52.261 "claimed": true, 00:11:52.261 "claim_type": "exclusive_write", 00:11:52.261 "zoned": false, 00:11:52.261 "supported_io_types": { 00:11:52.261 "read": true, 00:11:52.261 "write": true, 00:11:52.261 "unmap": true, 00:11:52.261 "flush": true, 00:11:52.261 "reset": true, 00:11:52.261 "nvme_admin": false, 00:11:52.261 "nvme_io": false, 00:11:52.261 "nvme_io_md": false, 00:11:52.261 "write_zeroes": true, 00:11:52.261 "zcopy": true, 
00:11:52.261 "get_zone_info": false, 00:11:52.261 "zone_management": false, 00:11:52.261 "zone_append": false, 00:11:52.261 "compare": false, 00:11:52.261 "compare_and_write": false, 00:11:52.261 "abort": true, 00:11:52.261 "seek_hole": false, 00:11:52.261 "seek_data": false, 00:11:52.261 "copy": true, 00:11:52.261 "nvme_iov_md": false 00:11:52.261 }, 00:11:52.261 "memory_domains": [ 00:11:52.261 { 00:11:52.261 "dma_device_id": "system", 00:11:52.261 "dma_device_type": 1 00:11:52.261 }, 00:11:52.261 { 00:11:52.261 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:52.261 "dma_device_type": 2 00:11:52.261 } 00:11:52.261 ], 00:11:52.261 "driver_specific": {} 00:11:52.261 }' 00:11:52.520 00:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:52.520 00:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:52.520 00:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:52.520 00:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:52.520 00:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:52.520 00:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:52.520 00:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:52.780 00:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:52.780 00:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:52.780 00:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:52.780 00:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:52.780 00:06:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:52.780 00:06:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:53.348 [2024-07-16 00:06:40.109948] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:53.348 [2024-07-16 00:06:40.109986] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:53.348 [2024-07-16 00:06:40.110037] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:53.348 00:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:53.348 00:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:11:53.348 00:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:53.348 00:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:53.348 00:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:53.348 00:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:53.348 00:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:53.348 00:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:53.348 00:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:53.348 00:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:53.348 00:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:53.348 00:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:53.348 00:06:40 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:53.348 00:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:53.348 00:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:53.348 00:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.348 00:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:53.607 00:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:53.607 "name": "Existed_Raid", 00:11:53.607 "uuid": "e5abce99-3671-4f94-9912-18a159c49e58", 00:11:53.607 "strip_size_kb": 64, 00:11:53.607 "state": "offline", 00:11:53.607 "raid_level": "concat", 00:11:53.607 "superblock": true, 00:11:53.607 "num_base_bdevs": 2, 00:11:53.607 "num_base_bdevs_discovered": 1, 00:11:53.607 "num_base_bdevs_operational": 1, 00:11:53.607 "base_bdevs_list": [ 00:11:53.607 { 00:11:53.607 "name": null, 00:11:53.607 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:53.607 "is_configured": false, 00:11:53.607 "data_offset": 2048, 00:11:53.607 "data_size": 63488 00:11:53.607 }, 00:11:53.607 { 00:11:53.607 "name": "BaseBdev2", 00:11:53.607 "uuid": "07544f7b-a928-4522-ae3a-f47f57fa6bde", 00:11:53.607 "is_configured": true, 00:11:53.607 "data_offset": 2048, 00:11:53.607 "data_size": 63488 00:11:53.607 } 00:11:53.607 ] 00:11:53.607 }' 00:11:53.607 00:06:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:53.607 00:06:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:54.175 00:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:54.175 00:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:11:54.175 00:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:54.175 00:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:54.434 00:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:54.434 00:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:54.434 00:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:54.732 [2024-07-16 00:06:41.534094] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:54.732 [2024-07-16 00:06:41.534156] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18e1000 name Existed_Raid, state offline 00:11:54.732 00:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:54.732 00:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:54.732 00:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:54.732 00:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:54.992 00:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:54.992 00:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:54.992 00:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:54.992 00:06:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 
3496130 00:11:54.992 00:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3496130 ']' 00:11:54.992 00:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 3496130 00:11:54.992 00:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:11:54.992 00:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:54.992 00:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3496130 00:11:54.992 00:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:54.992 00:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:54.992 00:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3496130' 00:11:54.992 killing process with pid 3496130 00:11:54.992 00:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 3496130 00:11:54.992 [2024-07-16 00:06:41.881431] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:54.992 00:06:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 3496130 00:11:54.992 [2024-07-16 00:06:41.883232] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:55.561 00:06:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:55.561 00:11:55.561 real 0m11.645s 00:11:55.561 user 0m20.542s 00:11:55.561 sys 0m2.119s 00:11:55.561 00:06:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:55.561 00:06:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:55.561 ************************************ 00:11:55.561 END TEST raid_state_function_test_sb 00:11:55.561 
************************************ 00:11:55.561 00:06:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:55.561 00:06:42 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:11:55.561 00:06:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:55.561 00:06:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:55.561 00:06:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:55.561 ************************************ 00:11:55.561 START TEST raid_superblock_test 00:11:55.561 ************************************ 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 
-- # local raid_bdev_uuid 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=3497944 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 3497944 /var/tmp/spdk-raid.sock 00:11:55.561 00:06:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:55.562 00:06:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 3497944 ']' 00:11:55.562 00:06:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:55.562 00:06:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:55.562 00:06:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:55.562 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:55.562 00:06:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:55.562 00:06:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:55.562 [2024-07-16 00:06:42.429951] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:11:55.562 [2024-07-16 00:06:42.429998] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3497944 ] 00:11:55.821 [2024-07-16 00:06:42.543123] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:55.821 [2024-07-16 00:06:42.648312] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:55.821 [2024-07-16 00:06:42.707779] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:55.821 [2024-07-16 00:06:42.707816] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:56.757 00:06:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:56.757 00:06:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:11:56.757 00:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:56.757 00:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:56.757 00:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:11:56.757 00:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:56.757 00:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:56.757 00:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:56.757 00:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:56.757 00:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:56.757 00:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:56.757 malloc1 00:11:56.757 00:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:57.016 [2024-07-16 00:06:43.851793] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:57.016 [2024-07-16 00:06:43.851843] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:57.016 [2024-07-16 00:06:43.851862] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b85570 00:11:57.016 [2024-07-16 00:06:43.851875] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:57.016 [2024-07-16 00:06:43.853442] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:57.016 [2024-07-16 00:06:43.853473] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:57.016 pt1 00:11:57.016 00:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:57.016 00:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:57.016 00:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:57.016 00:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:57.016 00:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:57.016 00:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:57.016 00:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:57.016 00:06:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:57.016 00:06:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:57.275 malloc2 00:11:57.275 00:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:57.536 [2024-07-16 00:06:44.353836] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:57.536 [2024-07-16 00:06:44.353879] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:57.536 [2024-07-16 00:06:44.353897] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b86970 00:11:57.536 [2024-07-16 00:06:44.353910] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:57.536 [2024-07-16 00:06:44.355362] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:57.536 [2024-07-16 00:06:44.355392] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:57.536 pt2 00:11:57.536 00:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:57.536 00:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:57.536 00:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:11:57.796 [2024-07-16 00:06:44.538356] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:57.796 [2024-07-16 00:06:44.539548] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:57.796 [2024-07-16 00:06:44.539684] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d29270 
00:11:57.796 [2024-07-16 00:06:44.539697] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:57.796 [2024-07-16 00:06:44.539880] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d1ec10 00:11:57.796 [2024-07-16 00:06:44.540027] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d29270 00:11:57.796 [2024-07-16 00:06:44.540038] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d29270 00:11:57.796 [2024-07-16 00:06:44.540131] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:57.796 00:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:57.796 00:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:57.796 00:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:57.796 00:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:57.796 00:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:57.796 00:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:57.796 00:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:57.796 00:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:57.796 00:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:57.796 00:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:57.796 00:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:57.796 00:06:44 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:58.054 00:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:58.054 "name": "raid_bdev1", 00:11:58.054 "uuid": "d57c2e80-8f14-4cfe-8746-357f2df4bad2", 00:11:58.054 "strip_size_kb": 64, 00:11:58.054 "state": "online", 00:11:58.054 "raid_level": "concat", 00:11:58.054 "superblock": true, 00:11:58.054 "num_base_bdevs": 2, 00:11:58.054 "num_base_bdevs_discovered": 2, 00:11:58.054 "num_base_bdevs_operational": 2, 00:11:58.054 "base_bdevs_list": [ 00:11:58.054 { 00:11:58.054 "name": "pt1", 00:11:58.054 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:58.054 "is_configured": true, 00:11:58.054 "data_offset": 2048, 00:11:58.054 "data_size": 63488 00:11:58.054 }, 00:11:58.054 { 00:11:58.054 "name": "pt2", 00:11:58.054 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:58.054 "is_configured": true, 00:11:58.054 "data_offset": 2048, 00:11:58.054 "data_size": 63488 00:11:58.054 } 00:11:58.054 ] 00:11:58.055 }' 00:11:58.055 00:06:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:58.055 00:06:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:58.621 00:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:58.621 00:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:58.621 00:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:58.621 00:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:58.621 00:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:58.621 00:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:58.621 00:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:58.621 00:06:45 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:58.880 [2024-07-16 00:06:45.609414] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:58.880 00:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:58.880 "name": "raid_bdev1", 00:11:58.880 "aliases": [ 00:11:58.880 "d57c2e80-8f14-4cfe-8746-357f2df4bad2" 00:11:58.880 ], 00:11:58.880 "product_name": "Raid Volume", 00:11:58.880 "block_size": 512, 00:11:58.880 "num_blocks": 126976, 00:11:58.880 "uuid": "d57c2e80-8f14-4cfe-8746-357f2df4bad2", 00:11:58.880 "assigned_rate_limits": { 00:11:58.880 "rw_ios_per_sec": 0, 00:11:58.880 "rw_mbytes_per_sec": 0, 00:11:58.880 "r_mbytes_per_sec": 0, 00:11:58.880 "w_mbytes_per_sec": 0 00:11:58.880 }, 00:11:58.880 "claimed": false, 00:11:58.880 "zoned": false, 00:11:58.880 "supported_io_types": { 00:11:58.880 "read": true, 00:11:58.880 "write": true, 00:11:58.880 "unmap": true, 00:11:58.880 "flush": true, 00:11:58.880 "reset": true, 00:11:58.880 "nvme_admin": false, 00:11:58.880 "nvme_io": false, 00:11:58.880 "nvme_io_md": false, 00:11:58.880 "write_zeroes": true, 00:11:58.880 "zcopy": false, 00:11:58.880 "get_zone_info": false, 00:11:58.880 "zone_management": false, 00:11:58.880 "zone_append": false, 00:11:58.880 "compare": false, 00:11:58.880 "compare_and_write": false, 00:11:58.880 "abort": false, 00:11:58.880 "seek_hole": false, 00:11:58.880 "seek_data": false, 00:11:58.880 "copy": false, 00:11:58.880 "nvme_iov_md": false 00:11:58.880 }, 00:11:58.880 "memory_domains": [ 00:11:58.880 { 00:11:58.880 "dma_device_id": "system", 00:11:58.880 "dma_device_type": 1 00:11:58.880 }, 00:11:58.880 { 00:11:58.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.880 "dma_device_type": 2 00:11:58.880 }, 00:11:58.880 { 00:11:58.880 "dma_device_id": "system", 00:11:58.880 "dma_device_type": 
1 00:11:58.880 }, 00:11:58.880 { 00:11:58.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.880 "dma_device_type": 2 00:11:58.880 } 00:11:58.880 ], 00:11:58.880 "driver_specific": { 00:11:58.880 "raid": { 00:11:58.880 "uuid": "d57c2e80-8f14-4cfe-8746-357f2df4bad2", 00:11:58.880 "strip_size_kb": 64, 00:11:58.880 "state": "online", 00:11:58.880 "raid_level": "concat", 00:11:58.880 "superblock": true, 00:11:58.880 "num_base_bdevs": 2, 00:11:58.880 "num_base_bdevs_discovered": 2, 00:11:58.880 "num_base_bdevs_operational": 2, 00:11:58.880 "base_bdevs_list": [ 00:11:58.880 { 00:11:58.880 "name": "pt1", 00:11:58.880 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:58.880 "is_configured": true, 00:11:58.880 "data_offset": 2048, 00:11:58.880 "data_size": 63488 00:11:58.880 }, 00:11:58.880 { 00:11:58.880 "name": "pt2", 00:11:58.880 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:58.880 "is_configured": true, 00:11:58.880 "data_offset": 2048, 00:11:58.880 "data_size": 63488 00:11:58.880 } 00:11:58.880 ] 00:11:58.880 } 00:11:58.880 } 00:11:58.880 }' 00:11:58.880 00:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:58.880 00:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:58.880 pt2' 00:11:58.880 00:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:58.880 00:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:58.880 00:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:59.139 00:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:59.139 "name": "pt1", 00:11:59.139 "aliases": [ 00:11:59.139 "00000000-0000-0000-0000-000000000001" 00:11:59.139 ], 00:11:59.139 
"product_name": "passthru", 00:11:59.139 "block_size": 512, 00:11:59.139 "num_blocks": 65536, 00:11:59.139 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:59.139 "assigned_rate_limits": { 00:11:59.139 "rw_ios_per_sec": 0, 00:11:59.139 "rw_mbytes_per_sec": 0, 00:11:59.139 "r_mbytes_per_sec": 0, 00:11:59.139 "w_mbytes_per_sec": 0 00:11:59.139 }, 00:11:59.139 "claimed": true, 00:11:59.139 "claim_type": "exclusive_write", 00:11:59.139 "zoned": false, 00:11:59.139 "supported_io_types": { 00:11:59.139 "read": true, 00:11:59.139 "write": true, 00:11:59.139 "unmap": true, 00:11:59.139 "flush": true, 00:11:59.139 "reset": true, 00:11:59.139 "nvme_admin": false, 00:11:59.139 "nvme_io": false, 00:11:59.139 "nvme_io_md": false, 00:11:59.139 "write_zeroes": true, 00:11:59.139 "zcopy": true, 00:11:59.139 "get_zone_info": false, 00:11:59.139 "zone_management": false, 00:11:59.139 "zone_append": false, 00:11:59.139 "compare": false, 00:11:59.139 "compare_and_write": false, 00:11:59.139 "abort": true, 00:11:59.139 "seek_hole": false, 00:11:59.139 "seek_data": false, 00:11:59.139 "copy": true, 00:11:59.139 "nvme_iov_md": false 00:11:59.139 }, 00:11:59.139 "memory_domains": [ 00:11:59.139 { 00:11:59.139 "dma_device_id": "system", 00:11:59.139 "dma_device_type": 1 00:11:59.139 }, 00:11:59.139 { 00:11:59.139 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:59.139 "dma_device_type": 2 00:11:59.139 } 00:11:59.139 ], 00:11:59.139 "driver_specific": { 00:11:59.139 "passthru": { 00:11:59.139 "name": "pt1", 00:11:59.139 "base_bdev_name": "malloc1" 00:11:59.139 } 00:11:59.139 } 00:11:59.139 }' 00:11:59.139 00:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:59.139 00:06:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:59.139 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:59.139 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:11:59.139 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:59.398 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:59.398 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:59.398 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:59.398 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:59.398 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:59.398 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:59.398 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:59.398 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:59.398 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:59.398 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:59.656 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:59.656 "name": "pt2", 00:11:59.656 "aliases": [ 00:11:59.656 "00000000-0000-0000-0000-000000000002" 00:11:59.656 ], 00:11:59.656 "product_name": "passthru", 00:11:59.656 "block_size": 512, 00:11:59.656 "num_blocks": 65536, 00:11:59.656 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:59.656 "assigned_rate_limits": { 00:11:59.656 "rw_ios_per_sec": 0, 00:11:59.656 "rw_mbytes_per_sec": 0, 00:11:59.656 "r_mbytes_per_sec": 0, 00:11:59.656 "w_mbytes_per_sec": 0 00:11:59.656 }, 00:11:59.656 "claimed": true, 00:11:59.656 "claim_type": "exclusive_write", 00:11:59.656 "zoned": false, 00:11:59.656 "supported_io_types": { 00:11:59.656 "read": true, 00:11:59.656 "write": true, 00:11:59.656 
"unmap": true, 00:11:59.656 "flush": true, 00:11:59.656 "reset": true, 00:11:59.656 "nvme_admin": false, 00:11:59.656 "nvme_io": false, 00:11:59.656 "nvme_io_md": false, 00:11:59.656 "write_zeroes": true, 00:11:59.656 "zcopy": true, 00:11:59.656 "get_zone_info": false, 00:11:59.656 "zone_management": false, 00:11:59.656 "zone_append": false, 00:11:59.656 "compare": false, 00:11:59.656 "compare_and_write": false, 00:11:59.656 "abort": true, 00:11:59.656 "seek_hole": false, 00:11:59.656 "seek_data": false, 00:11:59.656 "copy": true, 00:11:59.656 "nvme_iov_md": false 00:11:59.656 }, 00:11:59.656 "memory_domains": [ 00:11:59.656 { 00:11:59.656 "dma_device_id": "system", 00:11:59.656 "dma_device_type": 1 00:11:59.656 }, 00:11:59.656 { 00:11:59.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:59.656 "dma_device_type": 2 00:11:59.656 } 00:11:59.656 ], 00:11:59.656 "driver_specific": { 00:11:59.656 "passthru": { 00:11:59.656 "name": "pt2", 00:11:59.656 "base_bdev_name": "malloc2" 00:11:59.656 } 00:11:59.656 } 00:11:59.656 }' 00:11:59.656 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:59.656 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:59.913 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:59.913 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:59.913 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:59.913 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:59.913 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:59.913 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:59.913 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:59.913 00:06:46 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:59.913 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:00.169 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:00.169 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:00.169 00:06:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:00.428 [2024-07-16 00:06:47.121399] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:00.428 00:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=d57c2e80-8f14-4cfe-8746-357f2df4bad2 00:12:00.428 00:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z d57c2e80-8f14-4cfe-8746-357f2df4bad2 ']' 00:12:00.428 00:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:00.685 [2024-07-16 00:06:47.381850] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:00.685 [2024-07-16 00:06:47.381871] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:00.685 [2024-07-16 00:06:47.381932] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:00.685 [2024-07-16 00:06:47.381975] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:00.685 [2024-07-16 00:06:47.381987] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d29270 name raid_bdev1, state offline 00:12:00.685 00:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:12:00.685 00:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:00.944 00:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:00.944 00:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:00.944 00:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:00.944 00:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:00.944 00:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:00.944 00:06:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:01.565 00:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:01.565 00:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:01.565 00:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:01.565 00:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:01.565 00:06:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:01.565 00:06:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:01.565 00:06:48 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:01.565 00:06:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:01.565 00:06:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:01.565 00:06:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:01.565 00:06:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:01.565 00:06:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:01.565 00:06:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:01.565 00:06:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:01.565 00:06:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:01.823 [2024-07-16 00:06:48.641153] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:01.823 [2024-07-16 00:06:48.642547] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:01.823 [2024-07-16 00:06:48.642607] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:01.823 [2024-07-16 00:06:48.642649] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:01.823 
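After the passthru bdevs are deleted, the test confirms none remain using the filter `jq -r '[.[] | select(.product_name == "passthru")] | any'`, which evaluates to `false` in this run. A sketch of an equivalent check in Python; note the bdev list below is an illustrative stand-in, since the exact `bdev_get_bdevs` output at this point is not shown in the log:

```python
import json

# Hypothetical remaining bdev list after pt1/pt2 were deleted: only the
# backing malloc bdevs are modeled here (assumption, not from the log).
bdevs = json.loads("""
[
  {"name": "malloc1", "product_name": "Malloc disk"},
  {"name": "malloc2", "product_name": "Malloc disk"}
]
""")

# Equivalent of: jq -r '[.[] | select(.product_name == "passthru")] | any'
any_passthru = any(b["product_name"] == "passthru" for b in bdevs)

# The test then runs: '[' false == true ']' -- i.e. it requires no
# passthru bdevs to survive the cleanup.
assert any_passthru is False
```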
[2024-07-16 00:06:48.642668] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:01.823 [2024-07-16 00:06:48.642678] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d28ff0 name raid_bdev1, state configuring 00:12:01.823 request: 00:12:01.823 { 00:12:01.823 "name": "raid_bdev1", 00:12:01.823 "raid_level": "concat", 00:12:01.823 "base_bdevs": [ 00:12:01.823 "malloc1", 00:12:01.823 "malloc2" 00:12:01.823 ], 00:12:01.823 "strip_size_kb": 64, 00:12:01.823 "superblock": false, 00:12:01.823 "method": "bdev_raid_create", 00:12:01.823 "req_id": 1 00:12:01.823 } 00:12:01.823 Got JSON-RPC error response 00:12:01.823 response: 00:12:01.823 { 00:12:01.823 "code": -17, 00:12:01.823 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:01.823 } 00:12:01.823 00:06:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:01.824 00:06:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:01.824 00:06:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:01.824 00:06:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:01.824 00:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.824 00:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:02.081 00:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:02.081 00:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:02.081 00:06:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:02.340 [2024-07-16 00:06:49.070213] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:02.340 [2024-07-16 00:06:49.070251] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:02.340 [2024-07-16 00:06:49.070271] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b857a0 00:12:02.340 [2024-07-16 00:06:49.070284] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:02.340 [2024-07-16 00:06:49.071832] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:02.340 [2024-07-16 00:06:49.071861] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:02.340 [2024-07-16 00:06:49.071924] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:02.340 [2024-07-16 00:06:49.071958] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:02.340 pt1 00:12:02.340 00:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:12:02.340 00:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:02.340 00:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:02.340 00:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:02.340 00:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:02.340 00:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:02.340 00:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:02.340 00:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:02.340 00:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:02.340 00:06:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:02.340 00:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:02.340 00:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:02.598 00:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:02.598 "name": "raid_bdev1", 00:12:02.598 "uuid": "d57c2e80-8f14-4cfe-8746-357f2df4bad2", 00:12:02.598 "strip_size_kb": 64, 00:12:02.598 "state": "configuring", 00:12:02.598 "raid_level": "concat", 00:12:02.598 "superblock": true, 00:12:02.598 "num_base_bdevs": 2, 00:12:02.598 "num_base_bdevs_discovered": 1, 00:12:02.598 "num_base_bdevs_operational": 2, 00:12:02.598 "base_bdevs_list": [ 00:12:02.598 { 00:12:02.598 "name": "pt1", 00:12:02.598 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:02.598 "is_configured": true, 00:12:02.598 "data_offset": 2048, 00:12:02.598 "data_size": 63488 00:12:02.598 }, 00:12:02.598 { 00:12:02.598 "name": null, 00:12:02.598 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:02.598 "is_configured": false, 00:12:02.598 "data_offset": 2048, 00:12:02.598 "data_size": 63488 00:12:02.598 } 00:12:02.598 ] 00:12:02.598 }' 00:12:02.598 00:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:02.598 00:06:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:03.165 00:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:03.165 00:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:03.165 00:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:03.165 00:06:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:03.423 [2024-07-16 00:06:50.217293] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:03.423 [2024-07-16 00:06:50.217353] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:03.423 [2024-07-16 00:06:50.217373] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d1f820 00:12:03.423 [2024-07-16 00:06:50.217386] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:03.423 [2024-07-16 00:06:50.217755] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:03.423 [2024-07-16 00:06:50.217776] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:03.423 [2024-07-16 00:06:50.217847] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:03.423 [2024-07-16 00:06:50.217867] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:03.423 [2024-07-16 00:06:50.217980] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b7bec0 00:12:03.423 [2024-07-16 00:06:50.217991] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:03.423 [2024-07-16 00:06:50.218160] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b7cf00 00:12:03.423 [2024-07-16 00:06:50.218284] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b7bec0 00:12:03.423 [2024-07-16 00:06:50.218294] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b7bec0 00:12:03.423 [2024-07-16 00:06:50.218392] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:03.423 pt2 00:12:03.423 00:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 
00:12:03.423 00:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:03.423 00:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:03.423 00:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:03.423 00:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:03.423 00:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:03.423 00:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:03.423 00:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:03.423 00:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:03.423 00:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:03.423 00:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:03.423 00:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:03.423 00:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:03.423 00:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:03.681 00:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:03.681 "name": "raid_bdev1", 00:12:03.681 "uuid": "d57c2e80-8f14-4cfe-8746-357f2df4bad2", 00:12:03.681 "strip_size_kb": 64, 00:12:03.681 "state": "online", 00:12:03.681 "raid_level": "concat", 00:12:03.681 "superblock": true, 00:12:03.681 "num_base_bdevs": 2, 00:12:03.681 "num_base_bdevs_discovered": 2, 00:12:03.681 "num_base_bdevs_operational": 2, 
00:12:03.681 "base_bdevs_list": [ 00:12:03.681 { 00:12:03.681 "name": "pt1", 00:12:03.681 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:03.681 "is_configured": true, 00:12:03.681 "data_offset": 2048, 00:12:03.681 "data_size": 63488 00:12:03.681 }, 00:12:03.681 { 00:12:03.681 "name": "pt2", 00:12:03.681 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:03.681 "is_configured": true, 00:12:03.681 "data_offset": 2048, 00:12:03.681 "data_size": 63488 00:12:03.681 } 00:12:03.681 ] 00:12:03.681 }' 00:12:03.681 00:06:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:03.681 00:06:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:04.248 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:04.248 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:04.248 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:04.248 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:04.248 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:04.248 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:04.248 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:04.248 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:04.507 [2024-07-16 00:06:51.304409] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:04.507 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:04.507 "name": "raid_bdev1", 00:12:04.507 "aliases": [ 00:12:04.507 "d57c2e80-8f14-4cfe-8746-357f2df4bad2" 00:12:04.507 ], 
00:12:04.507 "product_name": "Raid Volume", 00:12:04.507 "block_size": 512, 00:12:04.507 "num_blocks": 126976, 00:12:04.507 "uuid": "d57c2e80-8f14-4cfe-8746-357f2df4bad2", 00:12:04.507 "assigned_rate_limits": { 00:12:04.507 "rw_ios_per_sec": 0, 00:12:04.507 "rw_mbytes_per_sec": 0, 00:12:04.507 "r_mbytes_per_sec": 0, 00:12:04.507 "w_mbytes_per_sec": 0 00:12:04.507 }, 00:12:04.507 "claimed": false, 00:12:04.507 "zoned": false, 00:12:04.507 "supported_io_types": { 00:12:04.507 "read": true, 00:12:04.507 "write": true, 00:12:04.507 "unmap": true, 00:12:04.507 "flush": true, 00:12:04.507 "reset": true, 00:12:04.507 "nvme_admin": false, 00:12:04.507 "nvme_io": false, 00:12:04.507 "nvme_io_md": false, 00:12:04.507 "write_zeroes": true, 00:12:04.507 "zcopy": false, 00:12:04.507 "get_zone_info": false, 00:12:04.507 "zone_management": false, 00:12:04.507 "zone_append": false, 00:12:04.507 "compare": false, 00:12:04.507 "compare_and_write": false, 00:12:04.507 "abort": false, 00:12:04.507 "seek_hole": false, 00:12:04.507 "seek_data": false, 00:12:04.507 "copy": false, 00:12:04.507 "nvme_iov_md": false 00:12:04.507 }, 00:12:04.507 "memory_domains": [ 00:12:04.507 { 00:12:04.507 "dma_device_id": "system", 00:12:04.507 "dma_device_type": 1 00:12:04.507 }, 00:12:04.507 { 00:12:04.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:04.507 "dma_device_type": 2 00:12:04.507 }, 00:12:04.507 { 00:12:04.507 "dma_device_id": "system", 00:12:04.507 "dma_device_type": 1 00:12:04.507 }, 00:12:04.507 { 00:12:04.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:04.507 "dma_device_type": 2 00:12:04.507 } 00:12:04.507 ], 00:12:04.507 "driver_specific": { 00:12:04.507 "raid": { 00:12:04.507 "uuid": "d57c2e80-8f14-4cfe-8746-357f2df4bad2", 00:12:04.507 "strip_size_kb": 64, 00:12:04.507 "state": "online", 00:12:04.507 "raid_level": "concat", 00:12:04.507 "superblock": true, 00:12:04.507 "num_base_bdevs": 2, 00:12:04.507 "num_base_bdevs_discovered": 2, 00:12:04.507 "num_base_bdevs_operational": 
2, 00:12:04.507 "base_bdevs_list": [ 00:12:04.507 { 00:12:04.507 "name": "pt1", 00:12:04.507 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:04.507 "is_configured": true, 00:12:04.507 "data_offset": 2048, 00:12:04.507 "data_size": 63488 00:12:04.507 }, 00:12:04.507 { 00:12:04.507 "name": "pt2", 00:12:04.507 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:04.507 "is_configured": true, 00:12:04.507 "data_offset": 2048, 00:12:04.507 "data_size": 63488 00:12:04.507 } 00:12:04.507 ] 00:12:04.507 } 00:12:04.507 } 00:12:04.507 }' 00:12:04.507 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:04.507 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:04.507 pt2' 00:12:04.507 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:04.507 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:04.507 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:04.766 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:04.766 "name": "pt1", 00:12:04.766 "aliases": [ 00:12:04.766 "00000000-0000-0000-0000-000000000001" 00:12:04.766 ], 00:12:04.766 "product_name": "passthru", 00:12:04.766 "block_size": 512, 00:12:04.766 "num_blocks": 65536, 00:12:04.766 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:04.766 "assigned_rate_limits": { 00:12:04.766 "rw_ios_per_sec": 0, 00:12:04.766 "rw_mbytes_per_sec": 0, 00:12:04.766 "r_mbytes_per_sec": 0, 00:12:04.766 "w_mbytes_per_sec": 0 00:12:04.766 }, 00:12:04.766 "claimed": true, 00:12:04.766 "claim_type": "exclusive_write", 00:12:04.766 "zoned": false, 00:12:04.766 "supported_io_types": { 00:12:04.766 "read": true, 
00:12:04.766 "write": true, 00:12:04.766 "unmap": true, 00:12:04.766 "flush": true, 00:12:04.766 "reset": true, 00:12:04.766 "nvme_admin": false, 00:12:04.766 "nvme_io": false, 00:12:04.766 "nvme_io_md": false, 00:12:04.766 "write_zeroes": true, 00:12:04.766 "zcopy": true, 00:12:04.766 "get_zone_info": false, 00:12:04.766 "zone_management": false, 00:12:04.766 "zone_append": false, 00:12:04.766 "compare": false, 00:12:04.766 "compare_and_write": false, 00:12:04.766 "abort": true, 00:12:04.766 "seek_hole": false, 00:12:04.766 "seek_data": false, 00:12:04.766 "copy": true, 00:12:04.766 "nvme_iov_md": false 00:12:04.766 }, 00:12:04.766 "memory_domains": [ 00:12:04.766 { 00:12:04.766 "dma_device_id": "system", 00:12:04.766 "dma_device_type": 1 00:12:04.766 }, 00:12:04.766 { 00:12:04.766 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:04.766 "dma_device_type": 2 00:12:04.766 } 00:12:04.766 ], 00:12:04.766 "driver_specific": { 00:12:04.766 "passthru": { 00:12:04.766 "name": "pt1", 00:12:04.766 "base_bdev_name": "malloc1" 00:12:04.766 } 00:12:04.766 } 00:12:04.766 }' 00:12:04.766 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.766 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.766 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:04.766 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:05.024 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:05.024 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:05.024 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:05.024 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:05.024 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:05.024 00:06:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:05.025 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:05.283 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:05.283 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:05.283 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:05.283 00:06:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:05.849 00:06:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:05.849 "name": "pt2", 00:12:05.849 "aliases": [ 00:12:05.849 "00000000-0000-0000-0000-000000000002" 00:12:05.849 ], 00:12:05.849 "product_name": "passthru", 00:12:05.849 "block_size": 512, 00:12:05.849 "num_blocks": 65536, 00:12:05.849 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:05.849 "assigned_rate_limits": { 00:12:05.849 "rw_ios_per_sec": 0, 00:12:05.849 "rw_mbytes_per_sec": 0, 00:12:05.849 "r_mbytes_per_sec": 0, 00:12:05.849 "w_mbytes_per_sec": 0 00:12:05.849 }, 00:12:05.849 "claimed": true, 00:12:05.849 "claim_type": "exclusive_write", 00:12:05.849 "zoned": false, 00:12:05.849 "supported_io_types": { 00:12:05.849 "read": true, 00:12:05.849 "write": true, 00:12:05.849 "unmap": true, 00:12:05.849 "flush": true, 00:12:05.849 "reset": true, 00:12:05.849 "nvme_admin": false, 00:12:05.849 "nvme_io": false, 00:12:05.849 "nvme_io_md": false, 00:12:05.849 "write_zeroes": true, 00:12:05.849 "zcopy": true, 00:12:05.849 "get_zone_info": false, 00:12:05.849 "zone_management": false, 00:12:05.849 "zone_append": false, 00:12:05.849 "compare": false, 00:12:05.849 "compare_and_write": false, 00:12:05.849 "abort": true, 00:12:05.849 "seek_hole": false, 00:12:05.849 "seek_data": false, 00:12:05.849 "copy": 
true, 00:12:05.849 "nvme_iov_md": false 00:12:05.849 }, 00:12:05.849 "memory_domains": [ 00:12:05.849 { 00:12:05.849 "dma_device_id": "system", 00:12:05.849 "dma_device_type": 1 00:12:05.849 }, 00:12:05.849 { 00:12:05.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:05.849 "dma_device_type": 2 00:12:05.849 } 00:12:05.849 ], 00:12:05.849 "driver_specific": { 00:12:05.849 "passthru": { 00:12:05.849 "name": "pt2", 00:12:05.849 "base_bdev_name": "malloc2" 00:12:05.849 } 00:12:05.849 } 00:12:05.849 }' 00:12:05.849 00:06:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:05.849 00:06:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:05.849 00:06:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:05.849 00:06:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:05.849 00:06:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:05.849 00:06:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:05.849 00:06:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:05.849 00:06:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:05.849 00:06:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:05.849 00:06:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:06.108 00:06:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:06.108 00:06:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:06.108 00:06:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:06.108 00:06:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 
00:12:06.367 [2024-07-16 00:06:53.089146] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:06.367 00:06:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' d57c2e80-8f14-4cfe-8746-357f2df4bad2 '!=' d57c2e80-8f14-4cfe-8746-357f2df4bad2 ']' 00:12:06.367 00:06:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:12:06.367 00:06:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:06.367 00:06:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:06.367 00:06:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 3497944 00:12:06.367 00:06:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 3497944 ']' 00:12:06.367 00:06:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 3497944 00:12:06.367 00:06:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:06.367 00:06:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:06.367 00:06:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3497944 00:12:06.367 00:06:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:06.367 00:06:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:06.367 00:06:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3497944' 00:12:06.367 killing process with pid 3497944 00:12:06.367 00:06:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 3497944 00:12:06.367 [2024-07-16 00:06:53.149043] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:06.367 [2024-07-16 00:06:53.149104] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:06.367 [2024-07-16 
00:06:53.149146] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:06.367 [2024-07-16 00:06:53.149158] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b7bec0 name raid_bdev1, state offline 00:12:06.367 00:06:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 3497944 00:12:06.367 [2024-07-16 00:06:53.165623] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:06.625 00:06:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:06.625 00:12:06.625 real 0m10.999s 00:12:06.625 user 0m19.667s 00:12:06.625 sys 0m2.005s 00:12:06.625 00:06:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:06.625 00:06:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:06.625 ************************************ 00:12:06.625 END TEST raid_superblock_test 00:12:06.625 ************************************ 00:12:06.625 00:06:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:06.625 00:06:53 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:12:06.625 00:06:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:06.625 00:06:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:06.625 00:06:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:06.625 ************************************ 00:12:06.625 START TEST raid_read_error_test 00:12:06.625 ************************************ 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:06.625 00:06:53 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.x6BjzKYSDd 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3499575 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3499575 /var/tmp/spdk-raid.sock 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 3499575 ']' 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:06.625 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:06.625 00:06:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:06.625 [2024-07-16 00:06:53.538395] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:12:06.625 [2024-07-16 00:06:53.538467] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3499575 ] 00:12:06.884 [2024-07-16 00:06:53.671198] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:06.884 [2024-07-16 00:06:53.778272] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:07.142 [2024-07-16 00:06:53.853474] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:07.142 [2024-07-16 00:06:53.853514] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:07.710 00:06:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:07.710 00:06:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:07.710 00:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:07.710 00:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:07.969 BaseBdev1_malloc 00:12:07.969 00:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:08.228 true 00:12:08.228 00:06:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:08.228 [2024-07-16 00:06:55.165405] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:08.228 [2024-07-16 00:06:55.165451] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:12:08.228 [2024-07-16 00:06:55.165471] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11410d0 00:12:08.228 [2024-07-16 00:06:55.165484] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:08.228 [2024-07-16 00:06:55.167346] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:08.228 [2024-07-16 00:06:55.167378] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:08.228 BaseBdev1 00:12:08.487 00:06:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:08.487 00:06:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:08.487 BaseBdev2_malloc 00:12:08.487 00:06:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:08.745 true 00:12:08.745 00:06:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:09.036 [2024-07-16 00:06:55.921354] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:09.036 [2024-07-16 00:06:55.921398] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:09.036 [2024-07-16 00:06:55.921418] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1145910 00:12:09.036 [2024-07-16 00:06:55.921430] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:09.036 [2024-07-16 00:06:55.922983] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:09.036 [2024-07-16 00:06:55.923011] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:09.036 BaseBdev2 00:12:09.036 00:06:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:09.295 [2024-07-16 00:06:56.162027] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:09.295 [2024-07-16 00:06:56.163389] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:09.295 [2024-07-16 00:06:56.163578] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1147320 00:12:09.295 [2024-07-16 00:06:56.163591] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:09.295 [2024-07-16 00:06:56.163791] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1148290 00:12:09.295 [2024-07-16 00:06:56.163943] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1147320 00:12:09.295 [2024-07-16 00:06:56.163953] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1147320 00:12:09.295 [2024-07-16 00:06:56.164057] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:09.295 00:06:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:09.295 00:06:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:09.295 00:06:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:09.295 00:06:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:09.295 00:06:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:09.295 00:06:56 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:09.295 00:06:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:09.295 00:06:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:09.295 00:06:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:09.295 00:06:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:09.295 00:06:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:09.295 00:06:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:09.554 00:06:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:09.554 "name": "raid_bdev1", 00:12:09.554 "uuid": "16bf9287-1827-439e-a01e-46f131252d4b", 00:12:09.554 "strip_size_kb": 64, 00:12:09.554 "state": "online", 00:12:09.554 "raid_level": "concat", 00:12:09.554 "superblock": true, 00:12:09.554 "num_base_bdevs": 2, 00:12:09.554 "num_base_bdevs_discovered": 2, 00:12:09.554 "num_base_bdevs_operational": 2, 00:12:09.554 "base_bdevs_list": [ 00:12:09.554 { 00:12:09.554 "name": "BaseBdev1", 00:12:09.554 "uuid": "05349813-1c3e-511e-8d89-18dda2141691", 00:12:09.554 "is_configured": true, 00:12:09.554 "data_offset": 2048, 00:12:09.554 "data_size": 63488 00:12:09.554 }, 00:12:09.554 { 00:12:09.554 "name": "BaseBdev2", 00:12:09.554 "uuid": "914cfb1d-f42b-5905-9a48-faf33d3ec852", 00:12:09.554 "is_configured": true, 00:12:09.554 "data_offset": 2048, 00:12:09.554 "data_size": 63488 00:12:09.554 } 00:12:09.554 ] 00:12:09.554 }' 00:12:09.554 00:06:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:09.554 00:06:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:10.121 00:06:57 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:10.121 00:06:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:10.380 [2024-07-16 00:06:57.148893] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11429b0 00:12:11.317 00:06:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:11.576 00:06:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:11.576 00:06:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:12:11.576 00:06:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:11.576 00:06:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:11.576 00:06:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:11.576 00:06:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:11.576 00:06:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:11.576 00:06:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:11.576 00:06:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:11.576 00:06:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:11.576 00:06:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:11.576 00:06:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:11.576 00:06:58 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:11.576 00:06:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:11.576 00:06:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:11.576 00:06:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:11.576 "name": "raid_bdev1", 00:12:11.576 "uuid": "16bf9287-1827-439e-a01e-46f131252d4b", 00:12:11.576 "strip_size_kb": 64, 00:12:11.576 "state": "online", 00:12:11.576 "raid_level": "concat", 00:12:11.576 "superblock": true, 00:12:11.576 "num_base_bdevs": 2, 00:12:11.576 "num_base_bdevs_discovered": 2, 00:12:11.576 "num_base_bdevs_operational": 2, 00:12:11.576 "base_bdevs_list": [ 00:12:11.576 { 00:12:11.576 "name": "BaseBdev1", 00:12:11.576 "uuid": "05349813-1c3e-511e-8d89-18dda2141691", 00:12:11.576 "is_configured": true, 00:12:11.576 "data_offset": 2048, 00:12:11.576 "data_size": 63488 00:12:11.576 }, 00:12:11.576 { 00:12:11.576 "name": "BaseBdev2", 00:12:11.576 "uuid": "914cfb1d-f42b-5905-9a48-faf33d3ec852", 00:12:11.576 "is_configured": true, 00:12:11.576 "data_offset": 2048, 00:12:11.576 "data_size": 63488 00:12:11.576 } 00:12:11.576 ] 00:12:11.576 }' 00:12:11.576 00:06:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:11.576 00:06:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:12.513 00:06:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:12.513 [2024-07-16 00:06:59.398381] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:12.513 [2024-07-16 00:06:59.398419] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing 
from online to offline 00:12:12.513 [2024-07-16 00:06:59.401570] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:12.513 [2024-07-16 00:06:59.401600] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:12.513 [2024-07-16 00:06:59.401627] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:12.513 [2024-07-16 00:06:59.401638] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1147320 name raid_bdev1, state offline 00:12:12.513 0 00:12:12.513 00:06:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3499575 00:12:12.513 00:06:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 3499575 ']' 00:12:12.513 00:06:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 3499575 00:12:12.513 00:06:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:12:12.513 00:06:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:12.514 00:06:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3499575 00:12:12.772 00:06:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:12.772 00:06:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:12.772 00:06:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3499575' 00:12:12.772 killing process with pid 3499575 00:12:12.772 00:06:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 3499575 00:12:12.772 [2024-07-16 00:06:59.483973] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:12.772 00:06:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 3499575 00:12:12.772 [2024-07-16 00:06:59.494759] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:13.031 00:06:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.x6BjzKYSDd 00:12:13.031 00:06:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:13.031 00:06:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:13.031 00:06:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:12:13.031 00:06:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:12:13.031 00:06:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:13.031 00:06:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:13.031 00:06:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:12:13.031 00:12:13.031 real 0m6.277s 00:12:13.031 user 0m9.807s 00:12:13.031 sys 0m1.104s 00:12:13.031 00:06:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:13.031 00:06:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:13.031 ************************************ 00:12:13.031 END TEST raid_read_error_test 00:12:13.031 ************************************ 00:12:13.031 00:06:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:13.031 00:06:59 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:12:13.031 00:06:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:13.031 00:06:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:13.031 00:06:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:13.031 ************************************ 00:12:13.031 START TEST raid_write_error_test 00:12:13.031 ************************************ 00:12:13.031 00:06:59 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write 00:12:13.031 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:12:13.031 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:13.031 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:13.031 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:13.031 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:13.031 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:13.031 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:13.031 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:13.031 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:13.031 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:13.032 00:06:59 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.yVdEz9MNya 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3500546 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3500546 /var/tmp/spdk-raid.sock 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 3500546 ']' 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:13.032 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:13.032 00:06:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:13.032 [2024-07-16 00:06:59.949126] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:12:13.032 [2024-07-16 00:06:59.949265] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3500546 ] 00:12:13.290 [2024-07-16 00:07:00.147892] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:13.549 [2024-07-16 00:07:00.251542] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:13.549 [2024-07-16 00:07:00.307195] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:13.549 [2024-07-16 00:07:00.307228] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:13.549 00:07:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:13.549 00:07:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:13.549 00:07:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:13.549 00:07:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:13.808 BaseBdev1_malloc 00:12:13.808 00:07:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:14.067 true 00:12:14.067 00:07:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:14.326 [2024-07-16 00:07:01.085064] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:14.326 [2024-07-16 00:07:01.085109] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:12:14.326 [2024-07-16 00:07:01.085130] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x225b0d0 00:12:14.326 [2024-07-16 00:07:01.085148] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:14.326 [2024-07-16 00:07:01.087043] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:14.326 [2024-07-16 00:07:01.087075] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:14.326 BaseBdev1 00:12:14.326 00:07:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:14.326 00:07:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:14.585 BaseBdev2_malloc 00:12:14.586 00:07:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:14.845 true 00:12:14.845 00:07:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:15.105 [2024-07-16 00:07:01.820812] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:15.105 [2024-07-16 00:07:01.820856] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:15.105 [2024-07-16 00:07:01.820876] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x225f910 00:12:15.105 [2024-07-16 00:07:01.820889] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:15.105 [2024-07-16 00:07:01.822455] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:15.105 [2024-07-16 00:07:01.822485] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:15.105 BaseBdev2 00:12:15.105 00:07:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:15.105 [2024-07-16 00:07:02.053461] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:15.105 [2024-07-16 00:07:02.054813] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:15.105 [2024-07-16 00:07:02.055012] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2261320 00:12:15.105 [2024-07-16 00:07:02.055025] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:15.105 [2024-07-16 00:07:02.055226] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2262290 00:12:15.105 [2024-07-16 00:07:02.055376] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2261320 00:12:15.105 [2024-07-16 00:07:02.055387] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2261320 00:12:15.363 [2024-07-16 00:07:02.055496] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:15.363 00:07:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:15.363 00:07:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:15.363 00:07:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:15.363 00:07:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:15.363 00:07:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:15.363 00:07:02 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:15.363 00:07:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:15.363 00:07:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:15.363 00:07:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:15.363 00:07:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:15.363 00:07:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:15.363 00:07:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:15.622 00:07:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:15.622 "name": "raid_bdev1", 00:12:15.622 "uuid": "d24383c1-3af7-4c44-988a-4bb141988567", 00:12:15.622 "strip_size_kb": 64, 00:12:15.622 "state": "online", 00:12:15.622 "raid_level": "concat", 00:12:15.622 "superblock": true, 00:12:15.622 "num_base_bdevs": 2, 00:12:15.622 "num_base_bdevs_discovered": 2, 00:12:15.622 "num_base_bdevs_operational": 2, 00:12:15.622 "base_bdevs_list": [ 00:12:15.622 { 00:12:15.622 "name": "BaseBdev1", 00:12:15.622 "uuid": "27edca5f-82f2-54ae-a614-bc32c7a31a53", 00:12:15.622 "is_configured": true, 00:12:15.622 "data_offset": 2048, 00:12:15.622 "data_size": 63488 00:12:15.622 }, 00:12:15.622 { 00:12:15.622 "name": "BaseBdev2", 00:12:15.622 "uuid": "460cd8a8-df27-5124-8df2-51a20dfd56dc", 00:12:15.622 "is_configured": true, 00:12:15.622 "data_offset": 2048, 00:12:15.622 "data_size": 63488 00:12:15.622 } 00:12:15.622 ] 00:12:15.622 }' 00:12:15.622 00:07:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:15.622 00:07:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:16.190 
00:07:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:16.190 00:07:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:16.190 [2024-07-16 00:07:03.036340] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x225c9b0 00:12:17.126 00:07:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:17.385 00:07:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:17.385 00:07:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:12:17.385 00:07:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:17.385 00:07:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:17.385 00:07:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:17.385 00:07:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:17.385 00:07:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:17.385 00:07:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:17.385 00:07:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:17.385 00:07:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:17.385 00:07:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:17.385 00:07:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:12:17.385 00:07:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:17.385 00:07:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:17.385 00:07:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:17.643 00:07:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:17.643 "name": "raid_bdev1", 00:12:17.643 "uuid": "d24383c1-3af7-4c44-988a-4bb141988567", 00:12:17.643 "strip_size_kb": 64, 00:12:17.643 "state": "online", 00:12:17.643 "raid_level": "concat", 00:12:17.643 "superblock": true, 00:12:17.643 "num_base_bdevs": 2, 00:12:17.643 "num_base_bdevs_discovered": 2, 00:12:17.643 "num_base_bdevs_operational": 2, 00:12:17.643 "base_bdevs_list": [ 00:12:17.643 { 00:12:17.643 "name": "BaseBdev1", 00:12:17.643 "uuid": "27edca5f-82f2-54ae-a614-bc32c7a31a53", 00:12:17.643 "is_configured": true, 00:12:17.643 "data_offset": 2048, 00:12:17.643 "data_size": 63488 00:12:17.643 }, 00:12:17.643 { 00:12:17.643 "name": "BaseBdev2", 00:12:17.643 "uuid": "460cd8a8-df27-5124-8df2-51a20dfd56dc", 00:12:17.643 "is_configured": true, 00:12:17.643 "data_offset": 2048, 00:12:17.643 "data_size": 63488 00:12:17.643 } 00:12:17.643 ] 00:12:17.643 }' 00:12:17.643 00:07:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:17.643 00:07:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:18.581 00:07:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:18.841 [2024-07-16 00:07:05.540643] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:18.841 [2024-07-16 00:07:05.540676] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:12:18.841 [2024-07-16 00:07:05.543833] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:18.841 [2024-07-16 00:07:05.543862] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:18.841 [2024-07-16 00:07:05.543888] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:18.841 [2024-07-16 00:07:05.543899] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2261320 name raid_bdev1, state offline 00:12:18.841 0 00:12:18.841 00:07:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3500546 00:12:18.841 00:07:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 3500546 ']' 00:12:18.841 00:07:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 3500546 00:12:18.841 00:07:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:12:18.841 00:07:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:18.841 00:07:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3500546 00:12:18.841 00:07:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:18.841 00:07:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:18.841 00:07:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3500546' 00:12:18.841 killing process with pid 3500546 00:12:18.841 00:07:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 3500546 00:12:18.841 [2024-07-16 00:07:05.626475] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:18.841 00:07:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 3500546 
00:12:18.841 [2024-07-16 00:07:05.637271] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:19.102 00:07:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.yVdEz9MNya 00:12:19.102 00:07:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:19.102 00:07:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:19.102 00:07:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.40 00:12:19.102 00:07:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:12:19.102 00:07:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:19.102 00:07:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:19.102 00:07:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.40 != \0\.\0\0 ]] 00:12:19.102 00:12:19.102 real 0m6.045s 00:12:19.102 user 0m9.835s 00:12:19.102 sys 0m1.130s 00:12:19.102 00:07:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:19.102 00:07:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:19.102 ************************************ 00:12:19.102 END TEST raid_write_error_test 00:12:19.102 ************************************ 00:12:19.102 00:07:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:19.102 00:07:05 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:19.102 00:07:05 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:12:19.102 00:07:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:19.102 00:07:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:19.102 00:07:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:19.102 ************************************ 00:12:19.102 START TEST 
raid_state_function_test 00:12:19.102 ************************************ 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=3501386 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3501386' 00:12:19.102 Process raid pid: 3501386 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 3501386 /var/tmp/spdk-raid.sock 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 3501386 ']' 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:19.102 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:19.102 00:07:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:19.102 [2024-07-16 00:07:06.031222] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:12:19.102 [2024-07-16 00:07:06.031295] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:19.361 [2024-07-16 00:07:06.164279] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:19.361 [2024-07-16 00:07:06.270900] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:19.619 [2024-07-16 00:07:06.339571] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:19.619 [2024-07-16 00:07:06.339631] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:20.184 00:07:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:20.184 00:07:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:20.184 00:07:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:20.442 [2024-07-16 00:07:07.198541] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:20.442 [2024-07-16 00:07:07.198581] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:20.442 [2024-07-16 00:07:07.198591] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:20.442 [2024-07-16 00:07:07.198603] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev2 doesn't exist now 00:12:20.442 00:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:20.442 00:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:20.442 00:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:20.442 00:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:20.442 00:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:20.442 00:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:20.442 00:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:20.442 00:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:20.442 00:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:20.442 00:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:20.442 00:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:20.442 00:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:20.700 00:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:20.700 "name": "Existed_Raid", 00:12:20.700 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:20.700 "strip_size_kb": 0, 00:12:20.700 "state": "configuring", 00:12:20.700 "raid_level": "raid1", 00:12:20.700 "superblock": false, 00:12:20.700 "num_base_bdevs": 2, 00:12:20.700 "num_base_bdevs_discovered": 0, 00:12:20.700 "num_base_bdevs_operational": 2, 
00:12:20.700 "base_bdevs_list": [ 00:12:20.700 { 00:12:20.700 "name": "BaseBdev1", 00:12:20.700 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:20.700 "is_configured": false, 00:12:20.700 "data_offset": 0, 00:12:20.700 "data_size": 0 00:12:20.700 }, 00:12:20.700 { 00:12:20.700 "name": "BaseBdev2", 00:12:20.700 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:20.700 "is_configured": false, 00:12:20.700 "data_offset": 0, 00:12:20.700 "data_size": 0 00:12:20.700 } 00:12:20.700 ] 00:12:20.700 }' 00:12:20.700 00:07:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:20.700 00:07:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:21.265 00:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:21.523 [2024-07-16 00:07:08.297356] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:21.523 [2024-07-16 00:07:08.297386] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b38a80 name Existed_Raid, state configuring 00:12:21.523 00:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:21.781 [2024-07-16 00:07:08.546016] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:21.781 [2024-07-16 00:07:08.546047] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:21.781 [2024-07-16 00:07:08.546061] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:21.781 [2024-07-16 00:07:08.546073] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:21.781 00:07:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:22.038 [2024-07-16 00:07:08.804510] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:22.039 BaseBdev1 00:12:22.039 00:07:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:22.039 00:07:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:22.039 00:07:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:22.039 00:07:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:22.039 00:07:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:22.039 00:07:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:22.039 00:07:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:22.296 00:07:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:22.296 [ 00:12:22.296 { 00:12:22.296 "name": "BaseBdev1", 00:12:22.296 "aliases": [ 00:12:22.296 "f916d54e-fd9c-4701-86f3-0a9892efbb8b" 00:12:22.296 ], 00:12:22.296 "product_name": "Malloc disk", 00:12:22.296 "block_size": 512, 00:12:22.296 "num_blocks": 65536, 00:12:22.296 "uuid": "f916d54e-fd9c-4701-86f3-0a9892efbb8b", 00:12:22.296 "assigned_rate_limits": { 00:12:22.296 "rw_ios_per_sec": 0, 00:12:22.296 "rw_mbytes_per_sec": 0, 00:12:22.296 "r_mbytes_per_sec": 0, 00:12:22.296 "w_mbytes_per_sec": 0 00:12:22.296 }, 00:12:22.296 "claimed": true, 
00:12:22.296 "claim_type": "exclusive_write", 00:12:22.296 "zoned": false, 00:12:22.296 "supported_io_types": { 00:12:22.296 "read": true, 00:12:22.296 "write": true, 00:12:22.296 "unmap": true, 00:12:22.296 "flush": true, 00:12:22.296 "reset": true, 00:12:22.296 "nvme_admin": false, 00:12:22.296 "nvme_io": false, 00:12:22.296 "nvme_io_md": false, 00:12:22.296 "write_zeroes": true, 00:12:22.296 "zcopy": true, 00:12:22.296 "get_zone_info": false, 00:12:22.296 "zone_management": false, 00:12:22.296 "zone_append": false, 00:12:22.296 "compare": false, 00:12:22.296 "compare_and_write": false, 00:12:22.296 "abort": true, 00:12:22.296 "seek_hole": false, 00:12:22.296 "seek_data": false, 00:12:22.296 "copy": true, 00:12:22.296 "nvme_iov_md": false 00:12:22.296 }, 00:12:22.296 "memory_domains": [ 00:12:22.296 { 00:12:22.296 "dma_device_id": "system", 00:12:22.296 "dma_device_type": 1 00:12:22.296 }, 00:12:22.296 { 00:12:22.296 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:22.296 "dma_device_type": 2 00:12:22.296 } 00:12:22.296 ], 00:12:22.296 "driver_specific": {} 00:12:22.296 } 00:12:22.296 ] 00:12:22.554 00:07:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:22.554 00:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:22.554 00:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:22.554 00:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:22.554 00:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:22.554 00:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:22.554 00:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:22.554 00:07:09 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:22.554 00:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:22.554 00:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:22.554 00:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:22.554 00:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:22.554 00:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:22.811 00:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:22.811 "name": "Existed_Raid", 00:12:22.811 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:22.811 "strip_size_kb": 0, 00:12:22.811 "state": "configuring", 00:12:22.811 "raid_level": "raid1", 00:12:22.811 "superblock": false, 00:12:22.811 "num_base_bdevs": 2, 00:12:22.811 "num_base_bdevs_discovered": 1, 00:12:22.811 "num_base_bdevs_operational": 2, 00:12:22.812 "base_bdevs_list": [ 00:12:22.812 { 00:12:22.812 "name": "BaseBdev1", 00:12:22.812 "uuid": "f916d54e-fd9c-4701-86f3-0a9892efbb8b", 00:12:22.812 "is_configured": true, 00:12:22.812 "data_offset": 0, 00:12:22.812 "data_size": 65536 00:12:22.812 }, 00:12:22.812 { 00:12:22.812 "name": "BaseBdev2", 00:12:22.812 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:22.812 "is_configured": false, 00:12:22.812 "data_offset": 0, 00:12:22.812 "data_size": 0 00:12:22.812 } 00:12:22.812 ] 00:12:22.812 }' 00:12:22.812 00:07:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:22.812 00:07:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:23.509 00:07:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:23.509 [2024-07-16 00:07:10.348598] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:23.509 [2024-07-16 00:07:10.348639] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b38350 name Existed_Raid, state configuring 00:12:23.509 00:07:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:23.783 [2024-07-16 00:07:10.589251] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:23.783 [2024-07-16 00:07:10.590782] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:23.783 [2024-07-16 00:07:10.590815] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:23.783 00:07:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:23.783 00:07:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:23.783 00:07:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:23.783 00:07:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:23.783 00:07:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:23.783 00:07:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:23.783 00:07:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:23.783 00:07:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:23.783 00:07:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:23.783 00:07:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:23.783 00:07:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:23.783 00:07:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:23.783 00:07:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:23.783 00:07:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:24.041 00:07:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:24.041 "name": "Existed_Raid", 00:12:24.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:24.041 "strip_size_kb": 0, 00:12:24.041 "state": "configuring", 00:12:24.041 "raid_level": "raid1", 00:12:24.041 "superblock": false, 00:12:24.041 "num_base_bdevs": 2, 00:12:24.041 "num_base_bdevs_discovered": 1, 00:12:24.041 "num_base_bdevs_operational": 2, 00:12:24.041 "base_bdevs_list": [ 00:12:24.041 { 00:12:24.041 "name": "BaseBdev1", 00:12:24.041 "uuid": "f916d54e-fd9c-4701-86f3-0a9892efbb8b", 00:12:24.041 "is_configured": true, 00:12:24.041 "data_offset": 0, 00:12:24.041 "data_size": 65536 00:12:24.041 }, 00:12:24.041 { 00:12:24.041 "name": "BaseBdev2", 00:12:24.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:24.041 "is_configured": false, 00:12:24.041 "data_offset": 0, 00:12:24.041 "data_size": 0 00:12:24.041 } 00:12:24.041 ] 00:12:24.041 }' 00:12:24.041 00:07:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:24.041 00:07:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:24.977 00:07:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:25.236 [2024-07-16 00:07:11.944270] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:25.236 [2024-07-16 00:07:11.944308] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b39000 00:12:25.236 [2024-07-16 00:07:11.944316] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:12:25.236 [2024-07-16 00:07:11.944504] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a530c0 00:12:25.236 [2024-07-16 00:07:11.944621] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b39000 00:12:25.236 [2024-07-16 00:07:11.944631] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b39000 00:12:25.236 [2024-07-16 00:07:11.944789] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:25.236 BaseBdev2 00:12:25.236 00:07:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:25.236 00:07:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:25.236 00:07:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:25.236 00:07:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:25.236 00:07:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:25.236 00:07:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:25.236 00:07:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:25.495 00:07:12 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:25.754 [ 00:12:25.754 { 00:12:25.754 "name": "BaseBdev2", 00:12:25.754 "aliases": [ 00:12:25.754 "a062eb79-bc74-44c2-ae63-b0f56d0c6df6" 00:12:25.754 ], 00:12:25.754 "product_name": "Malloc disk", 00:12:25.754 "block_size": 512, 00:12:25.754 "num_blocks": 65536, 00:12:25.754 "uuid": "a062eb79-bc74-44c2-ae63-b0f56d0c6df6", 00:12:25.754 "assigned_rate_limits": { 00:12:25.754 "rw_ios_per_sec": 0, 00:12:25.754 "rw_mbytes_per_sec": 0, 00:12:25.754 "r_mbytes_per_sec": 0, 00:12:25.754 "w_mbytes_per_sec": 0 00:12:25.754 }, 00:12:25.754 "claimed": true, 00:12:25.754 "claim_type": "exclusive_write", 00:12:25.754 "zoned": false, 00:12:25.754 "supported_io_types": { 00:12:25.754 "read": true, 00:12:25.754 "write": true, 00:12:25.754 "unmap": true, 00:12:25.754 "flush": true, 00:12:25.754 "reset": true, 00:12:25.754 "nvme_admin": false, 00:12:25.754 "nvme_io": false, 00:12:25.754 "nvme_io_md": false, 00:12:25.754 "write_zeroes": true, 00:12:25.754 "zcopy": true, 00:12:25.754 "get_zone_info": false, 00:12:25.754 "zone_management": false, 00:12:25.754 "zone_append": false, 00:12:25.754 "compare": false, 00:12:25.754 "compare_and_write": false, 00:12:25.754 "abort": true, 00:12:25.754 "seek_hole": false, 00:12:25.754 "seek_data": false, 00:12:25.754 "copy": true, 00:12:25.754 "nvme_iov_md": false 00:12:25.754 }, 00:12:25.754 "memory_domains": [ 00:12:25.754 { 00:12:25.754 "dma_device_id": "system", 00:12:25.754 "dma_device_type": 1 00:12:25.754 }, 00:12:25.754 { 00:12:25.754 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:25.754 "dma_device_type": 2 00:12:25.754 } 00:12:25.754 ], 00:12:25.754 "driver_specific": {} 00:12:25.754 } 00:12:25.754 ] 00:12:25.754 00:07:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:25.754 00:07:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:25.754 00:07:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:25.754 00:07:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:12:25.754 00:07:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:25.754 00:07:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:25.754 00:07:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:25.754 00:07:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:25.754 00:07:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:25.754 00:07:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:25.754 00:07:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:25.754 00:07:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:25.754 00:07:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:25.754 00:07:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:25.754 00:07:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:26.014 00:07:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:26.014 "name": "Existed_Raid", 00:12:26.014 "uuid": "94e20f5a-c3d7-4555-8ac5-2d590d1759ad", 00:12:26.014 "strip_size_kb": 0, 00:12:26.014 "state": "online", 00:12:26.014 "raid_level": "raid1", 00:12:26.014 "superblock": false, 00:12:26.014 "num_base_bdevs": 
2, 00:12:26.014 "num_base_bdevs_discovered": 2, 00:12:26.014 "num_base_bdevs_operational": 2, 00:12:26.014 "base_bdevs_list": [ 00:12:26.014 { 00:12:26.014 "name": "BaseBdev1", 00:12:26.014 "uuid": "f916d54e-fd9c-4701-86f3-0a9892efbb8b", 00:12:26.014 "is_configured": true, 00:12:26.014 "data_offset": 0, 00:12:26.014 "data_size": 65536 00:12:26.014 }, 00:12:26.014 { 00:12:26.014 "name": "BaseBdev2", 00:12:26.014 "uuid": "a062eb79-bc74-44c2-ae63-b0f56d0c6df6", 00:12:26.014 "is_configured": true, 00:12:26.014 "data_offset": 0, 00:12:26.014 "data_size": 65536 00:12:26.014 } 00:12:26.014 ] 00:12:26.014 }' 00:12:26.014 00:07:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:26.014 00:07:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:26.582 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:26.582 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:26.582 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:26.582 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:26.582 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:26.582 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:26.582 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:26.582 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:26.582 [2024-07-16 00:07:13.528743] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:26.840 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # 
raid_bdev_info='{ 00:12:26.840 "name": "Existed_Raid", 00:12:26.840 "aliases": [ 00:12:26.840 "94e20f5a-c3d7-4555-8ac5-2d590d1759ad" 00:12:26.840 ], 00:12:26.840 "product_name": "Raid Volume", 00:12:26.840 "block_size": 512, 00:12:26.840 "num_blocks": 65536, 00:12:26.840 "uuid": "94e20f5a-c3d7-4555-8ac5-2d590d1759ad", 00:12:26.840 "assigned_rate_limits": { 00:12:26.840 "rw_ios_per_sec": 0, 00:12:26.840 "rw_mbytes_per_sec": 0, 00:12:26.840 "r_mbytes_per_sec": 0, 00:12:26.840 "w_mbytes_per_sec": 0 00:12:26.840 }, 00:12:26.840 "claimed": false, 00:12:26.840 "zoned": false, 00:12:26.840 "supported_io_types": { 00:12:26.840 "read": true, 00:12:26.840 "write": true, 00:12:26.840 "unmap": false, 00:12:26.840 "flush": false, 00:12:26.840 "reset": true, 00:12:26.840 "nvme_admin": false, 00:12:26.840 "nvme_io": false, 00:12:26.840 "nvme_io_md": false, 00:12:26.840 "write_zeroes": true, 00:12:26.840 "zcopy": false, 00:12:26.840 "get_zone_info": false, 00:12:26.840 "zone_management": false, 00:12:26.840 "zone_append": false, 00:12:26.840 "compare": false, 00:12:26.840 "compare_and_write": false, 00:12:26.840 "abort": false, 00:12:26.840 "seek_hole": false, 00:12:26.840 "seek_data": false, 00:12:26.840 "copy": false, 00:12:26.840 "nvme_iov_md": false 00:12:26.840 }, 00:12:26.840 "memory_domains": [ 00:12:26.840 { 00:12:26.840 "dma_device_id": "system", 00:12:26.840 "dma_device_type": 1 00:12:26.840 }, 00:12:26.840 { 00:12:26.840 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:26.840 "dma_device_type": 2 00:12:26.840 }, 00:12:26.840 { 00:12:26.840 "dma_device_id": "system", 00:12:26.840 "dma_device_type": 1 00:12:26.840 }, 00:12:26.840 { 00:12:26.840 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:26.840 "dma_device_type": 2 00:12:26.840 } 00:12:26.840 ], 00:12:26.840 "driver_specific": { 00:12:26.840 "raid": { 00:12:26.840 "uuid": "94e20f5a-c3d7-4555-8ac5-2d590d1759ad", 00:12:26.840 "strip_size_kb": 0, 00:12:26.840 "state": "online", 00:12:26.840 "raid_level": "raid1", 
00:12:26.840 "superblock": false, 00:12:26.840 "num_base_bdevs": 2, 00:12:26.840 "num_base_bdevs_discovered": 2, 00:12:26.840 "num_base_bdevs_operational": 2, 00:12:26.840 "base_bdevs_list": [ 00:12:26.840 { 00:12:26.840 "name": "BaseBdev1", 00:12:26.840 "uuid": "f916d54e-fd9c-4701-86f3-0a9892efbb8b", 00:12:26.840 "is_configured": true, 00:12:26.840 "data_offset": 0, 00:12:26.840 "data_size": 65536 00:12:26.840 }, 00:12:26.840 { 00:12:26.840 "name": "BaseBdev2", 00:12:26.840 "uuid": "a062eb79-bc74-44c2-ae63-b0f56d0c6df6", 00:12:26.840 "is_configured": true, 00:12:26.840 "data_offset": 0, 00:12:26.840 "data_size": 65536 00:12:26.840 } 00:12:26.840 ] 00:12:26.840 } 00:12:26.840 } 00:12:26.840 }' 00:12:26.840 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:26.840 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:26.840 BaseBdev2' 00:12:26.840 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:26.840 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:26.840 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:26.840 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:26.840 "name": "BaseBdev1", 00:12:26.840 "aliases": [ 00:12:26.840 "f916d54e-fd9c-4701-86f3-0a9892efbb8b" 00:12:26.840 ], 00:12:26.840 "product_name": "Malloc disk", 00:12:26.840 "block_size": 512, 00:12:26.840 "num_blocks": 65536, 00:12:26.840 "uuid": "f916d54e-fd9c-4701-86f3-0a9892efbb8b", 00:12:26.840 "assigned_rate_limits": { 00:12:26.840 "rw_ios_per_sec": 0, 00:12:26.840 "rw_mbytes_per_sec": 0, 00:12:26.840 "r_mbytes_per_sec": 0, 00:12:26.840 
"w_mbytes_per_sec": 0 00:12:26.840 }, 00:12:26.840 "claimed": true, 00:12:26.840 "claim_type": "exclusive_write", 00:12:26.840 "zoned": false, 00:12:26.840 "supported_io_types": { 00:12:26.840 "read": true, 00:12:26.840 "write": true, 00:12:26.840 "unmap": true, 00:12:26.840 "flush": true, 00:12:26.840 "reset": true, 00:12:26.840 "nvme_admin": false, 00:12:26.840 "nvme_io": false, 00:12:26.840 "nvme_io_md": false, 00:12:26.840 "write_zeroes": true, 00:12:26.840 "zcopy": true, 00:12:26.840 "get_zone_info": false, 00:12:26.840 "zone_management": false, 00:12:26.840 "zone_append": false, 00:12:26.840 "compare": false, 00:12:26.840 "compare_and_write": false, 00:12:26.840 "abort": true, 00:12:26.840 "seek_hole": false, 00:12:26.840 "seek_data": false, 00:12:26.840 "copy": true, 00:12:26.840 "nvme_iov_md": false 00:12:26.840 }, 00:12:26.840 "memory_domains": [ 00:12:26.840 { 00:12:26.840 "dma_device_id": "system", 00:12:26.840 "dma_device_type": 1 00:12:26.840 }, 00:12:26.840 { 00:12:26.840 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:26.840 "dma_device_type": 2 00:12:26.840 } 00:12:26.840 ], 00:12:26.840 "driver_specific": {} 00:12:26.840 }' 00:12:26.840 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:27.098 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:27.098 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:27.098 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:27.098 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:27.098 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:27.098 00:07:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:27.098 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:27.356 
00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:27.356 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:27.356 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:27.356 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:27.356 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:27.356 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:27.356 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:27.614 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:27.614 "name": "BaseBdev2", 00:12:27.614 "aliases": [ 00:12:27.614 "a062eb79-bc74-44c2-ae63-b0f56d0c6df6" 00:12:27.614 ], 00:12:27.614 "product_name": "Malloc disk", 00:12:27.614 "block_size": 512, 00:12:27.614 "num_blocks": 65536, 00:12:27.614 "uuid": "a062eb79-bc74-44c2-ae63-b0f56d0c6df6", 00:12:27.614 "assigned_rate_limits": { 00:12:27.614 "rw_ios_per_sec": 0, 00:12:27.614 "rw_mbytes_per_sec": 0, 00:12:27.614 "r_mbytes_per_sec": 0, 00:12:27.614 "w_mbytes_per_sec": 0 00:12:27.614 }, 00:12:27.614 "claimed": true, 00:12:27.614 "claim_type": "exclusive_write", 00:12:27.614 "zoned": false, 00:12:27.614 "supported_io_types": { 00:12:27.614 "read": true, 00:12:27.614 "write": true, 00:12:27.614 "unmap": true, 00:12:27.614 "flush": true, 00:12:27.614 "reset": true, 00:12:27.614 "nvme_admin": false, 00:12:27.614 "nvme_io": false, 00:12:27.614 "nvme_io_md": false, 00:12:27.614 "write_zeroes": true, 00:12:27.614 "zcopy": true, 00:12:27.614 "get_zone_info": false, 00:12:27.614 "zone_management": false, 00:12:27.614 "zone_append": false, 00:12:27.614 "compare": 
false, 00:12:27.614 "compare_and_write": false, 00:12:27.614 "abort": true, 00:12:27.614 "seek_hole": false, 00:12:27.614 "seek_data": false, 00:12:27.614 "copy": true, 00:12:27.614 "nvme_iov_md": false 00:12:27.614 }, 00:12:27.614 "memory_domains": [ 00:12:27.614 { 00:12:27.614 "dma_device_id": "system", 00:12:27.614 "dma_device_type": 1 00:12:27.614 }, 00:12:27.614 { 00:12:27.614 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:27.614 "dma_device_type": 2 00:12:27.614 } 00:12:27.614 ], 00:12:27.614 "driver_specific": {} 00:12:27.614 }' 00:12:27.614 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:27.614 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:27.614 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:27.614 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:27.614 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:27.614 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:27.614 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:27.614 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:27.873 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:27.873 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:27.873 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:27.873 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:27.873 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:28.132 
[2024-07-16 00:07:14.888133] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:28.132 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:28.132 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:12:28.132 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:28.132 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:28.132 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:12:28.132 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:12:28.132 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:28.132 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:28.132 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:28.132 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:28.132 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:28.132 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:28.132 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:28.132 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:28.132 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:28.132 00:07:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.132 00:07:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:28.391 00:07:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:28.391 "name": "Existed_Raid", 00:12:28.391 "uuid": "94e20f5a-c3d7-4555-8ac5-2d590d1759ad", 00:12:28.391 "strip_size_kb": 0, 00:12:28.391 "state": "online", 00:12:28.391 "raid_level": "raid1", 00:12:28.391 "superblock": false, 00:12:28.391 "num_base_bdevs": 2, 00:12:28.391 "num_base_bdevs_discovered": 1, 00:12:28.391 "num_base_bdevs_operational": 1, 00:12:28.391 "base_bdevs_list": [ 00:12:28.391 { 00:12:28.391 "name": null, 00:12:28.391 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:28.391 "is_configured": false, 00:12:28.391 "data_offset": 0, 00:12:28.391 "data_size": 65536 00:12:28.391 }, 00:12:28.391 { 00:12:28.391 "name": "BaseBdev2", 00:12:28.391 "uuid": "a062eb79-bc74-44c2-ae63-b0f56d0c6df6", 00:12:28.391 "is_configured": true, 00:12:28.391 "data_offset": 0, 00:12:28.391 "data_size": 65536 00:12:28.391 } 00:12:28.391 ] 00:12:28.391 }' 00:12:28.391 00:07:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:28.391 00:07:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:28.958 00:07:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:28.958 00:07:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:28.958 00:07:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.958 00:07:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:29.217 00:07:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:29.217 00:07:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- 
# '[' Existed_Raid '!=' Existed_Raid ']' 00:12:29.217 00:07:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:29.477 [2024-07-16 00:07:16.221506] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:29.477 [2024-07-16 00:07:16.221582] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:29.477 [2024-07-16 00:07:16.234087] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:29.477 [2024-07-16 00:07:16.234126] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:29.477 [2024-07-16 00:07:16.234137] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b39000 name Existed_Raid, state offline 00:12:29.477 00:07:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:29.477 00:07:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:29.477 00:07:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:29.477 00:07:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:29.736 00:07:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:29.736 00:07:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:29.736 00:07:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:29.736 00:07:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 3501386 00:12:29.736 00:07:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 3501386 ']' 00:12:29.736 00:07:16 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 3501386 00:12:29.736 00:07:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:29.736 00:07:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:29.736 00:07:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3501386 00:12:29.736 00:07:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:29.736 00:07:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:29.736 00:07:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3501386' 00:12:29.736 killing process with pid 3501386 00:12:29.736 00:07:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 3501386 00:12:29.736 [2024-07-16 00:07:16.540802] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:29.736 00:07:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 3501386 00:12:29.736 [2024-07-16 00:07:16.541779] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:29.996 00:12:29.996 real 0m10.808s 00:12:29.996 user 0m19.226s 00:12:29.996 sys 0m1.997s 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:29.996 ************************************ 00:12:29.996 END TEST raid_state_function_test 00:12:29.996 ************************************ 00:12:29.996 00:07:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:29.996 00:07:16 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test 
raid_state_function_test_sb raid_state_function_test raid1 2 true 00:12:29.996 00:07:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:29.996 00:07:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:29.996 00:07:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:29.996 ************************************ 00:12:29.996 START TEST raid_state_function_test_sb 00:12:29.996 ************************************ 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=3503068 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3503068' 00:12:29.996 Process raid pid: 3503068 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 3503068 /var/tmp/spdk-raid.sock 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3503068 ']' 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:29.996 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:29.996 00:07:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:29.996 [2024-07-16 00:07:16.925775] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:12:29.996 [2024-07-16 00:07:16.925848] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:30.255 [2024-07-16 00:07:17.058702] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:30.255 [2024-07-16 00:07:17.161669] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:30.513 [2024-07-16 00:07:17.221771] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:30.513 [2024-07-16 00:07:17.221803] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:31.080 00:07:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:31.080 00:07:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:31.080 00:07:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:31.080 [2024-07-16 00:07:18.014268] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev1 00:12:31.080 [2024-07-16 00:07:18.014312] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:31.080 [2024-07-16 00:07:18.014323] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:31.080 [2024-07-16 00:07:18.014335] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:31.339 00:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:31.339 00:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:31.339 00:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:31.339 00:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:31.339 00:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:31.339 00:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:31.339 00:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:31.339 00:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:31.339 00:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:31.339 00:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:31.339 00:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:31.339 00:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:31.598 00:07:18 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:31.598 "name": "Existed_Raid", 00:12:31.598 "uuid": "1f32782a-6938-434c-bd01-f7d1f6cdf001", 00:12:31.598 "strip_size_kb": 0, 00:12:31.598 "state": "configuring", 00:12:31.598 "raid_level": "raid1", 00:12:31.598 "superblock": true, 00:12:31.598 "num_base_bdevs": 2, 00:12:31.598 "num_base_bdevs_discovered": 0, 00:12:31.598 "num_base_bdevs_operational": 2, 00:12:31.598 "base_bdevs_list": [ 00:12:31.598 { 00:12:31.598 "name": "BaseBdev1", 00:12:31.598 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:31.598 "is_configured": false, 00:12:31.598 "data_offset": 0, 00:12:31.598 "data_size": 0 00:12:31.598 }, 00:12:31.598 { 00:12:31.598 "name": "BaseBdev2", 00:12:31.598 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:31.598 "is_configured": false, 00:12:31.598 "data_offset": 0, 00:12:31.598 "data_size": 0 00:12:31.598 } 00:12:31.598 ] 00:12:31.598 }' 00:12:31.598 00:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:31.598 00:07:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:32.164 00:07:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:32.164 [2024-07-16 00:07:19.096991] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:32.164 [2024-07-16 00:07:19.097024] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2674a80 name Existed_Raid, state configuring 00:12:32.423 00:07:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:32.423 [2024-07-16 00:07:19.345670] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:32.423 
[2024-07-16 00:07:19.345701] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:32.423 [2024-07-16 00:07:19.345711] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:32.423 [2024-07-16 00:07:19.345722] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:32.423 00:07:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:32.681 [2024-07-16 00:07:19.604331] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:32.681 BaseBdev1 00:12:32.681 00:07:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:32.681 00:07:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:32.681 00:07:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:32.681 00:07:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:32.681 00:07:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:32.681 00:07:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:32.681 00:07:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:32.939 00:07:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:33.197 [ 00:12:33.197 { 00:12:33.197 "name": "BaseBdev1", 00:12:33.197 "aliases": [ 00:12:33.197 
"369aea59-cf10-4459-8e2b-2a478db0f434" 00:12:33.197 ], 00:12:33.197 "product_name": "Malloc disk", 00:12:33.197 "block_size": 512, 00:12:33.197 "num_blocks": 65536, 00:12:33.197 "uuid": "369aea59-cf10-4459-8e2b-2a478db0f434", 00:12:33.197 "assigned_rate_limits": { 00:12:33.197 "rw_ios_per_sec": 0, 00:12:33.197 "rw_mbytes_per_sec": 0, 00:12:33.197 "r_mbytes_per_sec": 0, 00:12:33.197 "w_mbytes_per_sec": 0 00:12:33.197 }, 00:12:33.197 "claimed": true, 00:12:33.197 "claim_type": "exclusive_write", 00:12:33.197 "zoned": false, 00:12:33.197 "supported_io_types": { 00:12:33.197 "read": true, 00:12:33.197 "write": true, 00:12:33.197 "unmap": true, 00:12:33.197 "flush": true, 00:12:33.197 "reset": true, 00:12:33.197 "nvme_admin": false, 00:12:33.197 "nvme_io": false, 00:12:33.197 "nvme_io_md": false, 00:12:33.197 "write_zeroes": true, 00:12:33.197 "zcopy": true, 00:12:33.197 "get_zone_info": false, 00:12:33.197 "zone_management": false, 00:12:33.197 "zone_append": false, 00:12:33.197 "compare": false, 00:12:33.197 "compare_and_write": false, 00:12:33.197 "abort": true, 00:12:33.197 "seek_hole": false, 00:12:33.197 "seek_data": false, 00:12:33.197 "copy": true, 00:12:33.197 "nvme_iov_md": false 00:12:33.197 }, 00:12:33.197 "memory_domains": [ 00:12:33.197 { 00:12:33.197 "dma_device_id": "system", 00:12:33.197 "dma_device_type": 1 00:12:33.197 }, 00:12:33.197 { 00:12:33.197 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:33.197 "dma_device_type": 2 00:12:33.197 } 00:12:33.197 ], 00:12:33.197 "driver_specific": {} 00:12:33.197 } 00:12:33.198 ] 00:12:33.198 00:07:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:33.198 00:07:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:33.198 00:07:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:33.198 00:07:20 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:33.198 00:07:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:33.198 00:07:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:33.198 00:07:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:33.198 00:07:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:33.198 00:07:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:33.198 00:07:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:33.198 00:07:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:33.198 00:07:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:33.198 00:07:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:33.456 00:07:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:33.456 "name": "Existed_Raid", 00:12:33.456 "uuid": "bbd52b1c-8ce9-4048-a09b-4dacfa0fdd65", 00:12:33.456 "strip_size_kb": 0, 00:12:33.456 "state": "configuring", 00:12:33.456 "raid_level": "raid1", 00:12:33.456 "superblock": true, 00:12:33.456 "num_base_bdevs": 2, 00:12:33.456 "num_base_bdevs_discovered": 1, 00:12:33.456 "num_base_bdevs_operational": 2, 00:12:33.456 "base_bdevs_list": [ 00:12:33.456 { 00:12:33.456 "name": "BaseBdev1", 00:12:33.456 "uuid": "369aea59-cf10-4459-8e2b-2a478db0f434", 00:12:33.456 "is_configured": true, 00:12:33.456 "data_offset": 2048, 00:12:33.456 "data_size": 63488 00:12:33.456 }, 00:12:33.456 { 00:12:33.456 "name": "BaseBdev2", 00:12:33.456 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:12:33.456 "is_configured": false, 00:12:33.456 "data_offset": 0, 00:12:33.456 "data_size": 0 00:12:33.456 } 00:12:33.456 ] 00:12:33.456 }' 00:12:33.456 00:07:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:33.456 00:07:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:34.392 00:07:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:34.392 [2024-07-16 00:07:21.200556] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:34.392 [2024-07-16 00:07:21.200601] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2674350 name Existed_Raid, state configuring 00:12:34.392 00:07:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:34.650 [2024-07-16 00:07:21.445246] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:34.650 [2024-07-16 00:07:21.446789] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:34.650 [2024-07-16 00:07:21.446826] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:34.650 00:07:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:34.650 00:07:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:34.650 00:07:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:34.650 00:07:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:12:34.650 00:07:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:34.650 00:07:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:34.650 00:07:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:34.650 00:07:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:34.650 00:07:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:34.650 00:07:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:34.650 00:07:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:34.650 00:07:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:34.650 00:07:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:34.650 00:07:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:34.909 00:07:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:34.909 "name": "Existed_Raid", 00:12:34.909 "uuid": "d97cd992-360c-4509-b9bb-6c0fe2d5a6a8", 00:12:34.909 "strip_size_kb": 0, 00:12:34.909 "state": "configuring", 00:12:34.909 "raid_level": "raid1", 00:12:34.909 "superblock": true, 00:12:34.909 "num_base_bdevs": 2, 00:12:34.909 "num_base_bdevs_discovered": 1, 00:12:34.909 "num_base_bdevs_operational": 2, 00:12:34.909 "base_bdevs_list": [ 00:12:34.909 { 00:12:34.909 "name": "BaseBdev1", 00:12:34.909 "uuid": "369aea59-cf10-4459-8e2b-2a478db0f434", 00:12:34.909 "is_configured": true, 00:12:34.909 "data_offset": 2048, 00:12:34.909 "data_size": 63488 00:12:34.909 }, 00:12:34.909 
{ 00:12:34.909 "name": "BaseBdev2", 00:12:34.909 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:34.909 "is_configured": false, 00:12:34.909 "data_offset": 0, 00:12:34.909 "data_size": 0 00:12:34.909 } 00:12:34.909 ] 00:12:34.909 }' 00:12:34.909 00:07:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:34.909 00:07:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:35.477 00:07:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:35.477 [2024-07-16 00:07:22.314904] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:35.477 [2024-07-16 00:07:22.315064] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2675000 00:12:35.477 [2024-07-16 00:07:22.315078] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:35.477 [2024-07-16 00:07:22.315257] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x258f0c0 00:12:35.477 [2024-07-16 00:07:22.315383] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2675000 00:12:35.477 [2024-07-16 00:07:22.315394] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2675000 00:12:35.477 [2024-07-16 00:07:22.315487] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:35.477 BaseBdev2 00:12:35.477 00:07:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:35.477 00:07:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:35.477 00:07:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:35.477 00:07:22 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:12:35.477 00:07:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:35.477 00:07:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:35.477 00:07:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:35.736 00:07:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:35.996 [ 00:12:35.996 { 00:12:35.996 "name": "BaseBdev2", 00:12:35.996 "aliases": [ 00:12:35.996 "59f4a8b6-4c79-49c8-a196-e8b668a94ff2" 00:12:35.996 ], 00:12:35.996 "product_name": "Malloc disk", 00:12:35.996 "block_size": 512, 00:12:35.996 "num_blocks": 65536, 00:12:35.996 "uuid": "59f4a8b6-4c79-49c8-a196-e8b668a94ff2", 00:12:35.996 "assigned_rate_limits": { 00:12:35.996 "rw_ios_per_sec": 0, 00:12:35.996 "rw_mbytes_per_sec": 0, 00:12:35.996 "r_mbytes_per_sec": 0, 00:12:35.996 "w_mbytes_per_sec": 0 00:12:35.996 }, 00:12:35.996 "claimed": true, 00:12:35.996 "claim_type": "exclusive_write", 00:12:35.996 "zoned": false, 00:12:35.996 "supported_io_types": { 00:12:35.996 "read": true, 00:12:35.996 "write": true, 00:12:35.996 "unmap": true, 00:12:35.996 "flush": true, 00:12:35.996 "reset": true, 00:12:35.996 "nvme_admin": false, 00:12:35.996 "nvme_io": false, 00:12:35.996 "nvme_io_md": false, 00:12:35.996 "write_zeroes": true, 00:12:35.996 "zcopy": true, 00:12:35.996 "get_zone_info": false, 00:12:35.996 "zone_management": false, 00:12:35.996 "zone_append": false, 00:12:35.996 "compare": false, 00:12:35.996 "compare_and_write": false, 00:12:35.996 "abort": true, 00:12:35.996 "seek_hole": false, 00:12:35.996 "seek_data": false, 00:12:35.996 "copy": true, 00:12:35.996 
"nvme_iov_md": false 00:12:35.996 }, 00:12:35.996 "memory_domains": [ 00:12:35.996 { 00:12:35.996 "dma_device_id": "system", 00:12:35.996 "dma_device_type": 1 00:12:35.996 }, 00:12:35.996 { 00:12:35.996 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:35.996 "dma_device_type": 2 00:12:35.996 } 00:12:35.996 ], 00:12:35.996 "driver_specific": {} 00:12:35.996 } 00:12:35.996 ] 00:12:35.996 00:07:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:35.996 00:07:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:35.996 00:07:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:35.996 00:07:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:12:35.996 00:07:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:35.996 00:07:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:35.996 00:07:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:35.996 00:07:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:35.996 00:07:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:35.996 00:07:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:35.996 00:07:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:35.996 00:07:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:35.996 00:07:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:35.996 00:07:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:35.996 00:07:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:35.996 00:07:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:35.996 "name": "Existed_Raid", 00:12:35.996 "uuid": "d97cd992-360c-4509-b9bb-6c0fe2d5a6a8", 00:12:35.996 "strip_size_kb": 0, 00:12:35.996 "state": "online", 00:12:35.996 "raid_level": "raid1", 00:12:35.996 "superblock": true, 00:12:35.996 "num_base_bdevs": 2, 00:12:35.996 "num_base_bdevs_discovered": 2, 00:12:35.996 "num_base_bdevs_operational": 2, 00:12:35.996 "base_bdevs_list": [ 00:12:35.996 { 00:12:35.996 "name": "BaseBdev1", 00:12:35.996 "uuid": "369aea59-cf10-4459-8e2b-2a478db0f434", 00:12:35.996 "is_configured": true, 00:12:35.996 "data_offset": 2048, 00:12:35.996 "data_size": 63488 00:12:35.996 }, 00:12:35.996 { 00:12:35.996 "name": "BaseBdev2", 00:12:35.996 "uuid": "59f4a8b6-4c79-49c8-a196-e8b668a94ff2", 00:12:35.996 "is_configured": true, 00:12:35.996 "data_offset": 2048, 00:12:35.996 "data_size": 63488 00:12:35.996 } 00:12:35.996 ] 00:12:35.996 }' 00:12:35.996 00:07:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:35.996 00:07:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:36.933 00:07:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:36.933 00:07:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:36.933 00:07:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:36.933 00:07:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:36.933 00:07:23 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:36.933 00:07:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:36.933 00:07:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:36.933 00:07:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:36.933 [2024-07-16 00:07:23.763050] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:36.933 00:07:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:36.933 "name": "Existed_Raid", 00:12:36.933 "aliases": [ 00:12:36.933 "d97cd992-360c-4509-b9bb-6c0fe2d5a6a8" 00:12:36.933 ], 00:12:36.933 "product_name": "Raid Volume", 00:12:36.933 "block_size": 512, 00:12:36.933 "num_blocks": 63488, 00:12:36.933 "uuid": "d97cd992-360c-4509-b9bb-6c0fe2d5a6a8", 00:12:36.933 "assigned_rate_limits": { 00:12:36.933 "rw_ios_per_sec": 0, 00:12:36.933 "rw_mbytes_per_sec": 0, 00:12:36.933 "r_mbytes_per_sec": 0, 00:12:36.933 "w_mbytes_per_sec": 0 00:12:36.933 }, 00:12:36.933 "claimed": false, 00:12:36.933 "zoned": false, 00:12:36.933 "supported_io_types": { 00:12:36.933 "read": true, 00:12:36.933 "write": true, 00:12:36.933 "unmap": false, 00:12:36.933 "flush": false, 00:12:36.933 "reset": true, 00:12:36.933 "nvme_admin": false, 00:12:36.933 "nvme_io": false, 00:12:36.933 "nvme_io_md": false, 00:12:36.933 "write_zeroes": true, 00:12:36.933 "zcopy": false, 00:12:36.933 "get_zone_info": false, 00:12:36.933 "zone_management": false, 00:12:36.933 "zone_append": false, 00:12:36.933 "compare": false, 00:12:36.933 "compare_and_write": false, 00:12:36.933 "abort": false, 00:12:36.933 "seek_hole": false, 00:12:36.933 "seek_data": false, 00:12:36.933 "copy": false, 00:12:36.933 "nvme_iov_md": false 00:12:36.933 }, 00:12:36.933 "memory_domains": [ 00:12:36.933 { 
00:12:36.933 "dma_device_id": "system", 00:12:36.933 "dma_device_type": 1 00:12:36.933 }, 00:12:36.933 { 00:12:36.933 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:36.933 "dma_device_type": 2 00:12:36.933 }, 00:12:36.933 { 00:12:36.933 "dma_device_id": "system", 00:12:36.933 "dma_device_type": 1 00:12:36.933 }, 00:12:36.933 { 00:12:36.933 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:36.933 "dma_device_type": 2 00:12:36.933 } 00:12:36.933 ], 00:12:36.933 "driver_specific": { 00:12:36.933 "raid": { 00:12:36.933 "uuid": "d97cd992-360c-4509-b9bb-6c0fe2d5a6a8", 00:12:36.933 "strip_size_kb": 0, 00:12:36.933 "state": "online", 00:12:36.933 "raid_level": "raid1", 00:12:36.933 "superblock": true, 00:12:36.933 "num_base_bdevs": 2, 00:12:36.933 "num_base_bdevs_discovered": 2, 00:12:36.933 "num_base_bdevs_operational": 2, 00:12:36.933 "base_bdevs_list": [ 00:12:36.933 { 00:12:36.933 "name": "BaseBdev1", 00:12:36.933 "uuid": "369aea59-cf10-4459-8e2b-2a478db0f434", 00:12:36.933 "is_configured": true, 00:12:36.934 "data_offset": 2048, 00:12:36.934 "data_size": 63488 00:12:36.934 }, 00:12:36.934 { 00:12:36.934 "name": "BaseBdev2", 00:12:36.934 "uuid": "59f4a8b6-4c79-49c8-a196-e8b668a94ff2", 00:12:36.934 "is_configured": true, 00:12:36.934 "data_offset": 2048, 00:12:36.934 "data_size": 63488 00:12:36.934 } 00:12:36.934 ] 00:12:36.934 } 00:12:36.934 } 00:12:36.934 }' 00:12:36.934 00:07:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:36.934 00:07:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:36.934 BaseBdev2' 00:12:36.934 00:07:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:36.934 00:07:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev1 00:12:36.934 00:07:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:37.192 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:37.192 "name": "BaseBdev1", 00:12:37.192 "aliases": [ 00:12:37.192 "369aea59-cf10-4459-8e2b-2a478db0f434" 00:12:37.192 ], 00:12:37.192 "product_name": "Malloc disk", 00:12:37.192 "block_size": 512, 00:12:37.192 "num_blocks": 65536, 00:12:37.192 "uuid": "369aea59-cf10-4459-8e2b-2a478db0f434", 00:12:37.192 "assigned_rate_limits": { 00:12:37.192 "rw_ios_per_sec": 0, 00:12:37.192 "rw_mbytes_per_sec": 0, 00:12:37.192 "r_mbytes_per_sec": 0, 00:12:37.192 "w_mbytes_per_sec": 0 00:12:37.192 }, 00:12:37.192 "claimed": true, 00:12:37.192 "claim_type": "exclusive_write", 00:12:37.192 "zoned": false, 00:12:37.192 "supported_io_types": { 00:12:37.192 "read": true, 00:12:37.192 "write": true, 00:12:37.192 "unmap": true, 00:12:37.192 "flush": true, 00:12:37.192 "reset": true, 00:12:37.192 "nvme_admin": false, 00:12:37.192 "nvme_io": false, 00:12:37.192 "nvme_io_md": false, 00:12:37.192 "write_zeroes": true, 00:12:37.192 "zcopy": true, 00:12:37.192 "get_zone_info": false, 00:12:37.192 "zone_management": false, 00:12:37.192 "zone_append": false, 00:12:37.192 "compare": false, 00:12:37.192 "compare_and_write": false, 00:12:37.192 "abort": true, 00:12:37.192 "seek_hole": false, 00:12:37.192 "seek_data": false, 00:12:37.192 "copy": true, 00:12:37.192 "nvme_iov_md": false 00:12:37.192 }, 00:12:37.192 "memory_domains": [ 00:12:37.192 { 00:12:37.192 "dma_device_id": "system", 00:12:37.192 "dma_device_type": 1 00:12:37.192 }, 00:12:37.192 { 00:12:37.192 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.192 "dma_device_type": 2 00:12:37.192 } 00:12:37.192 ], 00:12:37.192 "driver_specific": {} 00:12:37.192 }' 00:12:37.192 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:37.192 00:07:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:37.471 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:37.471 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:37.471 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:37.471 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:37.471 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:37.471 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:37.471 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:37.471 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:37.730 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:37.730 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:37.730 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:37.730 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:37.730 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:37.988 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:37.988 "name": "BaseBdev2", 00:12:37.988 "aliases": [ 00:12:37.988 "59f4a8b6-4c79-49c8-a196-e8b668a94ff2" 00:12:37.988 ], 00:12:37.988 "product_name": "Malloc disk", 00:12:37.988 "block_size": 512, 00:12:37.988 "num_blocks": 65536, 00:12:37.988 "uuid": "59f4a8b6-4c79-49c8-a196-e8b668a94ff2", 00:12:37.988 
"assigned_rate_limits": { 00:12:37.988 "rw_ios_per_sec": 0, 00:12:37.988 "rw_mbytes_per_sec": 0, 00:12:37.988 "r_mbytes_per_sec": 0, 00:12:37.988 "w_mbytes_per_sec": 0 00:12:37.988 }, 00:12:37.988 "claimed": true, 00:12:37.988 "claim_type": "exclusive_write", 00:12:37.988 "zoned": false, 00:12:37.988 "supported_io_types": { 00:12:37.988 "read": true, 00:12:37.988 "write": true, 00:12:37.988 "unmap": true, 00:12:37.988 "flush": true, 00:12:37.988 "reset": true, 00:12:37.988 "nvme_admin": false, 00:12:37.988 "nvme_io": false, 00:12:37.988 "nvme_io_md": false, 00:12:37.988 "write_zeroes": true, 00:12:37.988 "zcopy": true, 00:12:37.988 "get_zone_info": false, 00:12:37.988 "zone_management": false, 00:12:37.988 "zone_append": false, 00:12:37.988 "compare": false, 00:12:37.988 "compare_and_write": false, 00:12:37.988 "abort": true, 00:12:37.988 "seek_hole": false, 00:12:37.988 "seek_data": false, 00:12:37.988 "copy": true, 00:12:37.988 "nvme_iov_md": false 00:12:37.988 }, 00:12:37.988 "memory_domains": [ 00:12:37.988 { 00:12:37.988 "dma_device_id": "system", 00:12:37.988 "dma_device_type": 1 00:12:37.988 }, 00:12:37.988 { 00:12:37.988 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.988 "dma_device_type": 2 00:12:37.988 } 00:12:37.988 ], 00:12:37.988 "driver_specific": {} 00:12:37.988 }' 00:12:37.988 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:37.988 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:37.988 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:37.988 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:37.988 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:37.988 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:37.988 00:07:24 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:37.988 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:38.245 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:38.245 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:38.245 00:07:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:38.245 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:38.245 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:38.502 [2024-07-16 00:07:25.246814] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:38.502 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:38.502 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:12:38.502 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:38.502 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:12:38.502 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:12:38.502 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:12:38.502 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:38.502 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:38.502 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:38.502 00:07:25 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:38.502 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:38.502 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:38.502 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:38.502 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:38.502 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:38.502 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:38.502 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:38.759 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:38.759 "name": "Existed_Raid", 00:12:38.759 "uuid": "d97cd992-360c-4509-b9bb-6c0fe2d5a6a8", 00:12:38.760 "strip_size_kb": 0, 00:12:38.760 "state": "online", 00:12:38.760 "raid_level": "raid1", 00:12:38.760 "superblock": true, 00:12:38.760 "num_base_bdevs": 2, 00:12:38.760 "num_base_bdevs_discovered": 1, 00:12:38.760 "num_base_bdevs_operational": 1, 00:12:38.760 "base_bdevs_list": [ 00:12:38.760 { 00:12:38.760 "name": null, 00:12:38.760 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:38.760 "is_configured": false, 00:12:38.760 "data_offset": 2048, 00:12:38.760 "data_size": 63488 00:12:38.760 }, 00:12:38.760 { 00:12:38.760 "name": "BaseBdev2", 00:12:38.760 "uuid": "59f4a8b6-4c79-49c8-a196-e8b668a94ff2", 00:12:38.760 "is_configured": true, 00:12:38.760 "data_offset": 2048, 00:12:38.760 "data_size": 63488 00:12:38.760 } 00:12:38.760 ] 00:12:38.760 }' 00:12:38.760 00:07:25 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:38.760 00:07:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:39.016 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:39.016 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:39.016 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.016 00:07:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:39.274 00:07:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:39.274 00:07:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:39.274 00:07:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:39.532 [2024-07-16 00:07:26.346819] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:39.532 [2024-07-16 00:07:26.346910] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:39.532 [2024-07-16 00:07:26.359694] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:39.532 [2024-07-16 00:07:26.359733] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:39.532 [2024-07-16 00:07:26.359745] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2675000 name Existed_Raid, state offline 00:12:39.532 00:07:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:39.532 00:07:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:12:39.532 00:07:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.532 00:07:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:39.790 00:07:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:39.790 00:07:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:39.790 00:07:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:39.790 00:07:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 3503068 00:12:39.790 00:07:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3503068 ']' 00:12:39.790 00:07:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 3503068 00:12:39.790 00:07:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:12:39.790 00:07:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:39.790 00:07:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3503068 00:12:39.790 00:07:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:39.790 00:07:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:39.790 00:07:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3503068' 00:12:39.790 killing process with pid 3503068 00:12:39.790 00:07:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 3503068 00:12:39.790 [2024-07-16 00:07:26.674955] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: 
raid_bdev_fini_start 00:12:39.790 00:07:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 3503068 00:12:39.790 [2024-07-16 00:07:26.675852] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:40.048 00:07:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:40.048 00:12:40.048 real 0m10.027s 00:12:40.048 user 0m17.753s 00:12:40.048 sys 0m1.949s 00:12:40.048 00:07:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:40.048 00:07:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:40.048 ************************************ 00:12:40.048 END TEST raid_state_function_test_sb 00:12:40.048 ************************************ 00:12:40.048 00:07:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:40.048 00:07:26 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:12:40.048 00:07:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:12:40.048 00:07:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:40.048 00:07:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:40.048 ************************************ 00:12:40.048 START TEST raid_superblock_test 00:12:40.048 ************************************ 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=3504611 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 3504611 /var/tmp/spdk-raid.sock 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 3504611 ']' 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start 
up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:40.048 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:40.048 00:07:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:40.049 00:07:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:40.307 [2024-07-16 00:07:27.031840] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:12:40.307 [2024-07-16 00:07:27.031907] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3504611 ] 00:12:40.307 [2024-07-16 00:07:27.149655] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:40.307 [2024-07-16 00:07:27.251665] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:40.565 [2024-07-16 00:07:27.319114] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:40.565 [2024-07-16 00:07:27.319151] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:41.131 00:07:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:41.131 00:07:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:12:41.131 00:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:41.131 00:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:41.131 00:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:41.131 00:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:41.131 00:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:41.131 
00:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:41.131 00:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:41.131 00:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:41.131 00:07:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:41.391 malloc1 00:12:41.391 00:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:41.650 [2024-07-16 00:07:28.434102] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:41.650 [2024-07-16 00:07:28.434149] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:41.650 [2024-07-16 00:07:28.434168] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf41570 00:12:41.650 [2024-07-16 00:07:28.434181] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:41.650 [2024-07-16 00:07:28.435847] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:41.650 [2024-07-16 00:07:28.435878] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:41.650 pt1 00:12:41.650 00:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:41.650 00:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:41.650 00:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:41.650 00:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:41.650 00:07:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:41.650 00:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:41.650 00:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:41.650 00:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:41.650 00:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:41.909 malloc2 00:12:41.909 00:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:42.169 [2024-07-16 00:07:28.928438] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:42.169 [2024-07-16 00:07:28.928490] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:42.169 [2024-07-16 00:07:28.928508] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf42970 00:12:42.169 [2024-07-16 00:07:28.928521] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:42.169 [2024-07-16 00:07:28.930195] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:42.169 [2024-07-16 00:07:28.930224] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:42.169 pt2 00:12:42.169 00:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:42.169 00:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:42.169 00:07:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:12:42.428 [2024-07-16 00:07:29.173102] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:42.428 [2024-07-16 00:07:29.174474] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:42.428 [2024-07-16 00:07:29.174629] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10e5270 00:12:42.428 [2024-07-16 00:07:29.174643] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:42.428 [2024-07-16 00:07:29.174852] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf390e0 00:12:42.428 [2024-07-16 00:07:29.175019] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10e5270 00:12:42.428 [2024-07-16 00:07:29.175030] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x10e5270 00:12:42.428 [2024-07-16 00:07:29.175133] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:42.428 00:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:42.428 00:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:42.428 00:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:42.428 00:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:42.428 00:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:42.428 00:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:42.428 00:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:42.428 00:07:29 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:42.428 00:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:42.428 00:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:42.428 00:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.428 00:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:42.701 00:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:42.701 "name": "raid_bdev1", 00:12:42.701 "uuid": "4f2556bf-8c20-46c9-876f-b00e8aec80fb", 00:12:42.701 "strip_size_kb": 0, 00:12:42.701 "state": "online", 00:12:42.701 "raid_level": "raid1", 00:12:42.701 "superblock": true, 00:12:42.701 "num_base_bdevs": 2, 00:12:42.701 "num_base_bdevs_discovered": 2, 00:12:42.701 "num_base_bdevs_operational": 2, 00:12:42.701 "base_bdevs_list": [ 00:12:42.701 { 00:12:42.701 "name": "pt1", 00:12:42.701 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:42.701 "is_configured": true, 00:12:42.701 "data_offset": 2048, 00:12:42.701 "data_size": 63488 00:12:42.701 }, 00:12:42.701 { 00:12:42.701 "name": "pt2", 00:12:42.701 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:42.701 "is_configured": true, 00:12:42.701 "data_offset": 2048, 00:12:42.701 "data_size": 63488 00:12:42.701 } 00:12:42.701 ] 00:12:42.701 }' 00:12:42.701 00:07:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:42.701 00:07:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:43.270 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:43.270 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:43.270 00:07:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:43.270 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:43.270 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:43.270 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:43.270 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:43.270 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:43.528 [2024-07-16 00:07:30.256200] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:43.528 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:43.528 "name": "raid_bdev1", 00:12:43.528 "aliases": [ 00:12:43.528 "4f2556bf-8c20-46c9-876f-b00e8aec80fb" 00:12:43.528 ], 00:12:43.528 "product_name": "Raid Volume", 00:12:43.528 "block_size": 512, 00:12:43.528 "num_blocks": 63488, 00:12:43.528 "uuid": "4f2556bf-8c20-46c9-876f-b00e8aec80fb", 00:12:43.528 "assigned_rate_limits": { 00:12:43.528 "rw_ios_per_sec": 0, 00:12:43.529 "rw_mbytes_per_sec": 0, 00:12:43.529 "r_mbytes_per_sec": 0, 00:12:43.529 "w_mbytes_per_sec": 0 00:12:43.529 }, 00:12:43.529 "claimed": false, 00:12:43.529 "zoned": false, 00:12:43.529 "supported_io_types": { 00:12:43.529 "read": true, 00:12:43.529 "write": true, 00:12:43.529 "unmap": false, 00:12:43.529 "flush": false, 00:12:43.529 "reset": true, 00:12:43.529 "nvme_admin": false, 00:12:43.529 "nvme_io": false, 00:12:43.529 "nvme_io_md": false, 00:12:43.529 "write_zeroes": true, 00:12:43.529 "zcopy": false, 00:12:43.529 "get_zone_info": false, 00:12:43.529 "zone_management": false, 00:12:43.529 "zone_append": false, 00:12:43.529 "compare": false, 00:12:43.529 "compare_and_write": false, 00:12:43.529 
"abort": false, 00:12:43.529 "seek_hole": false, 00:12:43.529 "seek_data": false, 00:12:43.529 "copy": false, 00:12:43.529 "nvme_iov_md": false 00:12:43.529 }, 00:12:43.529 "memory_domains": [ 00:12:43.529 { 00:12:43.529 "dma_device_id": "system", 00:12:43.529 "dma_device_type": 1 00:12:43.529 }, 00:12:43.529 { 00:12:43.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.529 "dma_device_type": 2 00:12:43.529 }, 00:12:43.529 { 00:12:43.529 "dma_device_id": "system", 00:12:43.529 "dma_device_type": 1 00:12:43.529 }, 00:12:43.529 { 00:12:43.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.529 "dma_device_type": 2 00:12:43.529 } 00:12:43.529 ], 00:12:43.529 "driver_specific": { 00:12:43.529 "raid": { 00:12:43.529 "uuid": "4f2556bf-8c20-46c9-876f-b00e8aec80fb", 00:12:43.529 "strip_size_kb": 0, 00:12:43.529 "state": "online", 00:12:43.529 "raid_level": "raid1", 00:12:43.529 "superblock": true, 00:12:43.529 "num_base_bdevs": 2, 00:12:43.529 "num_base_bdevs_discovered": 2, 00:12:43.529 "num_base_bdevs_operational": 2, 00:12:43.529 "base_bdevs_list": [ 00:12:43.529 { 00:12:43.529 "name": "pt1", 00:12:43.529 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:43.529 "is_configured": true, 00:12:43.529 "data_offset": 2048, 00:12:43.529 "data_size": 63488 00:12:43.529 }, 00:12:43.529 { 00:12:43.529 "name": "pt2", 00:12:43.529 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:43.529 "is_configured": true, 00:12:43.529 "data_offset": 2048, 00:12:43.529 "data_size": 63488 00:12:43.529 } 00:12:43.529 ] 00:12:43.529 } 00:12:43.529 } 00:12:43.529 }' 00:12:43.529 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:43.529 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:43.529 pt2' 00:12:43.529 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:43.529 00:07:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:43.529 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:43.788 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:43.788 "name": "pt1", 00:12:43.788 "aliases": [ 00:12:43.788 "00000000-0000-0000-0000-000000000001" 00:12:43.788 ], 00:12:43.788 "product_name": "passthru", 00:12:43.788 "block_size": 512, 00:12:43.788 "num_blocks": 65536, 00:12:43.788 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:43.788 "assigned_rate_limits": { 00:12:43.788 "rw_ios_per_sec": 0, 00:12:43.788 "rw_mbytes_per_sec": 0, 00:12:43.788 "r_mbytes_per_sec": 0, 00:12:43.788 "w_mbytes_per_sec": 0 00:12:43.788 }, 00:12:43.788 "claimed": true, 00:12:43.788 "claim_type": "exclusive_write", 00:12:43.788 "zoned": false, 00:12:43.788 "supported_io_types": { 00:12:43.788 "read": true, 00:12:43.788 "write": true, 00:12:43.788 "unmap": true, 00:12:43.788 "flush": true, 00:12:43.788 "reset": true, 00:12:43.788 "nvme_admin": false, 00:12:43.788 "nvme_io": false, 00:12:43.788 "nvme_io_md": false, 00:12:43.788 "write_zeroes": true, 00:12:43.788 "zcopy": true, 00:12:43.788 "get_zone_info": false, 00:12:43.788 "zone_management": false, 00:12:43.788 "zone_append": false, 00:12:43.788 "compare": false, 00:12:43.788 "compare_and_write": false, 00:12:43.788 "abort": true, 00:12:43.788 "seek_hole": false, 00:12:43.788 "seek_data": false, 00:12:43.788 "copy": true, 00:12:43.788 "nvme_iov_md": false 00:12:43.788 }, 00:12:43.788 "memory_domains": [ 00:12:43.788 { 00:12:43.788 "dma_device_id": "system", 00:12:43.788 "dma_device_type": 1 00:12:43.788 }, 00:12:43.788 { 00:12:43.788 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.788 "dma_device_type": 2 00:12:43.788 } 00:12:43.788 ], 00:12:43.788 "driver_specific": { 00:12:43.788 "passthru": { 00:12:43.788 
"name": "pt1", 00:12:43.788 "base_bdev_name": "malloc1" 00:12:43.788 } 00:12:43.788 } 00:12:43.788 }' 00:12:43.788 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:43.788 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:43.788 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:43.788 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:43.788 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:44.081 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:44.081 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:44.081 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:44.081 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:44.081 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:44.081 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:44.081 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:44.081 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:44.081 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:44.081 00:07:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:44.365 00:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:44.365 "name": "pt2", 00:12:44.365 "aliases": [ 00:12:44.365 "00000000-0000-0000-0000-000000000002" 00:12:44.365 ], 00:12:44.365 "product_name": "passthru", 00:12:44.365 "block_size": 512, 00:12:44.365 
"num_blocks": 65536, 00:12:44.365 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:44.365 "assigned_rate_limits": { 00:12:44.365 "rw_ios_per_sec": 0, 00:12:44.365 "rw_mbytes_per_sec": 0, 00:12:44.365 "r_mbytes_per_sec": 0, 00:12:44.365 "w_mbytes_per_sec": 0 00:12:44.365 }, 00:12:44.365 "claimed": true, 00:12:44.365 "claim_type": "exclusive_write", 00:12:44.365 "zoned": false, 00:12:44.365 "supported_io_types": { 00:12:44.365 "read": true, 00:12:44.365 "write": true, 00:12:44.365 "unmap": true, 00:12:44.365 "flush": true, 00:12:44.365 "reset": true, 00:12:44.365 "nvme_admin": false, 00:12:44.365 "nvme_io": false, 00:12:44.365 "nvme_io_md": false, 00:12:44.365 "write_zeroes": true, 00:12:44.365 "zcopy": true, 00:12:44.365 "get_zone_info": false, 00:12:44.365 "zone_management": false, 00:12:44.365 "zone_append": false, 00:12:44.365 "compare": false, 00:12:44.365 "compare_and_write": false, 00:12:44.365 "abort": true, 00:12:44.365 "seek_hole": false, 00:12:44.365 "seek_data": false, 00:12:44.365 "copy": true, 00:12:44.365 "nvme_iov_md": false 00:12:44.365 }, 00:12:44.365 "memory_domains": [ 00:12:44.365 { 00:12:44.365 "dma_device_id": "system", 00:12:44.365 "dma_device_type": 1 00:12:44.365 }, 00:12:44.365 { 00:12:44.365 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:44.365 "dma_device_type": 2 00:12:44.365 } 00:12:44.365 ], 00:12:44.365 "driver_specific": { 00:12:44.365 "passthru": { 00:12:44.365 "name": "pt2", 00:12:44.365 "base_bdev_name": "malloc2" 00:12:44.365 } 00:12:44.365 } 00:12:44.365 }' 00:12:44.365 00:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:44.365 00:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:44.365 00:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:44.365 00:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:44.623 00:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
jq .md_size 00:12:44.623 00:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:44.623 00:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:44.623 00:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:44.623 00:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:44.623 00:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:44.623 00:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:44.623 00:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:44.623 00:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:44.623 00:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:44.882 [2024-07-16 00:07:31.780207] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:44.882 00:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=4f2556bf-8c20-46c9-876f-b00e8aec80fb 00:12:44.882 00:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 4f2556bf-8c20-46c9-876f-b00e8aec80fb ']' 00:12:44.882 00:07:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:45.450 [2024-07-16 00:07:32.285308] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:45.450 [2024-07-16 00:07:32.285334] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:45.450 [2024-07-16 00:07:32.285387] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:45.450 [2024-07-16 
00:07:32.285442] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:45.450 [2024-07-16 00:07:32.285454] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10e5270 name raid_bdev1, state offline 00:12:45.450 00:07:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:45.450 00:07:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:45.709 00:07:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:45.710 00:07:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:45.710 00:07:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:45.710 00:07:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:45.968 00:07:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:45.968 00:07:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:46.227 00:07:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:46.227 00:07:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:46.486 00:07:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:46.486 00:07:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:46.486 00:07:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:46.486 00:07:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:46.486 00:07:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:46.486 00:07:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:46.486 00:07:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:46.486 00:07:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:46.486 00:07:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:46.486 00:07:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:46.486 00:07:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:46.486 00:07:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:46.486 00:07:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:46.745 [2024-07-16 00:07:33.532545] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:46.745 [2024-07-16 00:07:33.533941] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:46.745 [2024-07-16 00:07:33.533995] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:46.746 [2024-07-16 00:07:33.534033] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:46.746 [2024-07-16 00:07:33.534052] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:46.746 [2024-07-16 00:07:33.534062] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10e4ff0 name raid_bdev1, state configuring 00:12:46.746 request: 00:12:46.746 { 00:12:46.746 "name": "raid_bdev1", 00:12:46.746 "raid_level": "raid1", 00:12:46.746 "base_bdevs": [ 00:12:46.746 "malloc1", 00:12:46.746 "malloc2" 00:12:46.746 ], 00:12:46.746 "superblock": false, 00:12:46.746 "method": "bdev_raid_create", 00:12:46.746 "req_id": 1 00:12:46.746 } 00:12:46.746 Got JSON-RPC error response 00:12:46.746 response: 00:12:46.746 { 00:12:46.746 "code": -17, 00:12:46.746 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:46.746 } 00:12:46.746 00:07:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:46.746 00:07:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:46.746 00:07:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:46.746 00:07:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:46.746 00:07:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:46.746 00:07:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:47.004 00:07:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 
00:12:47.004 00:07:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:47.004 00:07:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:47.263 [2024-07-16 00:07:34.025788] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:47.263 [2024-07-16 00:07:34.025830] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:47.263 [2024-07-16 00:07:34.025850] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf417a0 00:12:47.263 [2024-07-16 00:07:34.025863] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:47.263 [2024-07-16 00:07:34.027701] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:47.263 [2024-07-16 00:07:34.027732] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:47.263 [2024-07-16 00:07:34.027808] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:47.263 [2024-07-16 00:07:34.027832] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:47.263 pt1 00:12:47.263 00:07:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:12:47.263 00:07:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:47.263 00:07:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:47.263 00:07:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:47.263 00:07:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:47.263 00:07:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:12:47.263 00:07:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:47.263 00:07:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:47.263 00:07:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:47.263 00:07:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:47.263 00:07:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:47.263 00:07:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:47.521 00:07:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:47.521 "name": "raid_bdev1", 00:12:47.521 "uuid": "4f2556bf-8c20-46c9-876f-b00e8aec80fb", 00:12:47.521 "strip_size_kb": 0, 00:12:47.521 "state": "configuring", 00:12:47.521 "raid_level": "raid1", 00:12:47.521 "superblock": true, 00:12:47.521 "num_base_bdevs": 2, 00:12:47.521 "num_base_bdevs_discovered": 1, 00:12:47.521 "num_base_bdevs_operational": 2, 00:12:47.521 "base_bdevs_list": [ 00:12:47.521 { 00:12:47.521 "name": "pt1", 00:12:47.521 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:47.521 "is_configured": true, 00:12:47.521 "data_offset": 2048, 00:12:47.521 "data_size": 63488 00:12:47.521 }, 00:12:47.521 { 00:12:47.521 "name": null, 00:12:47.521 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:47.521 "is_configured": false, 00:12:47.521 "data_offset": 2048, 00:12:47.521 "data_size": 63488 00:12:47.521 } 00:12:47.521 ] 00:12:47.521 }' 00:12:47.521 00:07:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:47.521 00:07:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:48.085 00:07:34 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:48.085 00:07:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:48.085 00:07:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:48.085 00:07:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:48.343 [2024-07-16 00:07:35.116694] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:48.343 [2024-07-16 00:07:35.116740] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:48.343 [2024-07-16 00:07:35.116759] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10d96f0 00:12:48.343 [2024-07-16 00:07:35.116772] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:48.343 [2024-07-16 00:07:35.117126] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:48.343 [2024-07-16 00:07:35.117146] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:48.343 [2024-07-16 00:07:35.117206] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:48.343 [2024-07-16 00:07:35.117225] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:48.343 [2024-07-16 00:07:35.117321] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10da590 00:12:48.343 [2024-07-16 00:07:35.117332] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:48.343 [2024-07-16 00:07:35.117498] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf3b540 00:12:48.343 [2024-07-16 00:07:35.117621] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10da590 00:12:48.343 [2024-07-16 00:07:35.117631] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x10da590 00:12:48.343 [2024-07-16 00:07:35.117724] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:48.343 pt2 00:12:48.343 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:48.343 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:48.343 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:48.343 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:48.343 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:48.343 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:48.343 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:48.343 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:48.343 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:48.343 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:48.343 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:48.343 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:48.343 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.343 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:48.602 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:48.602 "name": 
"raid_bdev1", 00:12:48.602 "uuid": "4f2556bf-8c20-46c9-876f-b00e8aec80fb", 00:12:48.602 "strip_size_kb": 0, 00:12:48.602 "state": "online", 00:12:48.602 "raid_level": "raid1", 00:12:48.602 "superblock": true, 00:12:48.602 "num_base_bdevs": 2, 00:12:48.602 "num_base_bdevs_discovered": 2, 00:12:48.602 "num_base_bdevs_operational": 2, 00:12:48.602 "base_bdevs_list": [ 00:12:48.602 { 00:12:48.602 "name": "pt1", 00:12:48.602 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:48.602 "is_configured": true, 00:12:48.602 "data_offset": 2048, 00:12:48.602 "data_size": 63488 00:12:48.602 }, 00:12:48.602 { 00:12:48.602 "name": "pt2", 00:12:48.602 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:48.602 "is_configured": true, 00:12:48.602 "data_offset": 2048, 00:12:48.602 "data_size": 63488 00:12:48.602 } 00:12:48.602 ] 00:12:48.602 }' 00:12:48.602 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:48.602 00:07:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:49.168 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:49.168 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:49.168 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:49.168 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:49.168 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:49.168 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:49.168 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:49.168 00:07:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:49.426 [2024-07-16 
00:07:36.199806] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:49.426 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:49.426 "name": "raid_bdev1", 00:12:49.426 "aliases": [ 00:12:49.426 "4f2556bf-8c20-46c9-876f-b00e8aec80fb" 00:12:49.426 ], 00:12:49.426 "product_name": "Raid Volume", 00:12:49.426 "block_size": 512, 00:12:49.426 "num_blocks": 63488, 00:12:49.426 "uuid": "4f2556bf-8c20-46c9-876f-b00e8aec80fb", 00:12:49.426 "assigned_rate_limits": { 00:12:49.427 "rw_ios_per_sec": 0, 00:12:49.427 "rw_mbytes_per_sec": 0, 00:12:49.427 "r_mbytes_per_sec": 0, 00:12:49.427 "w_mbytes_per_sec": 0 00:12:49.427 }, 00:12:49.427 "claimed": false, 00:12:49.427 "zoned": false, 00:12:49.427 "supported_io_types": { 00:12:49.427 "read": true, 00:12:49.427 "write": true, 00:12:49.427 "unmap": false, 00:12:49.427 "flush": false, 00:12:49.427 "reset": true, 00:12:49.427 "nvme_admin": false, 00:12:49.427 "nvme_io": false, 00:12:49.427 "nvme_io_md": false, 00:12:49.427 "write_zeroes": true, 00:12:49.427 "zcopy": false, 00:12:49.427 "get_zone_info": false, 00:12:49.427 "zone_management": false, 00:12:49.427 "zone_append": false, 00:12:49.427 "compare": false, 00:12:49.427 "compare_and_write": false, 00:12:49.427 "abort": false, 00:12:49.427 "seek_hole": false, 00:12:49.427 "seek_data": false, 00:12:49.427 "copy": false, 00:12:49.427 "nvme_iov_md": false 00:12:49.427 }, 00:12:49.427 "memory_domains": [ 00:12:49.427 { 00:12:49.427 "dma_device_id": "system", 00:12:49.427 "dma_device_type": 1 00:12:49.427 }, 00:12:49.427 { 00:12:49.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:49.427 "dma_device_type": 2 00:12:49.427 }, 00:12:49.427 { 00:12:49.427 "dma_device_id": "system", 00:12:49.427 "dma_device_type": 1 00:12:49.427 }, 00:12:49.427 { 00:12:49.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:49.427 "dma_device_type": 2 00:12:49.427 } 00:12:49.427 ], 00:12:49.427 "driver_specific": { 00:12:49.427 
"raid": { 00:12:49.427 "uuid": "4f2556bf-8c20-46c9-876f-b00e8aec80fb", 00:12:49.427 "strip_size_kb": 0, 00:12:49.427 "state": "online", 00:12:49.427 "raid_level": "raid1", 00:12:49.427 "superblock": true, 00:12:49.427 "num_base_bdevs": 2, 00:12:49.427 "num_base_bdevs_discovered": 2, 00:12:49.427 "num_base_bdevs_operational": 2, 00:12:49.427 "base_bdevs_list": [ 00:12:49.427 { 00:12:49.427 "name": "pt1", 00:12:49.427 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:49.427 "is_configured": true, 00:12:49.427 "data_offset": 2048, 00:12:49.427 "data_size": 63488 00:12:49.427 }, 00:12:49.427 { 00:12:49.427 "name": "pt2", 00:12:49.427 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:49.427 "is_configured": true, 00:12:49.427 "data_offset": 2048, 00:12:49.427 "data_size": 63488 00:12:49.427 } 00:12:49.427 ] 00:12:49.427 } 00:12:49.427 } 00:12:49.427 }' 00:12:49.427 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:49.427 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:49.427 pt2' 00:12:49.427 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:49.427 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:49.427 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:49.685 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:49.685 "name": "pt1", 00:12:49.685 "aliases": [ 00:12:49.685 "00000000-0000-0000-0000-000000000001" 00:12:49.685 ], 00:12:49.685 "product_name": "passthru", 00:12:49.685 "block_size": 512, 00:12:49.685 "num_blocks": 65536, 00:12:49.685 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:49.685 "assigned_rate_limits": { 
00:12:49.685 "rw_ios_per_sec": 0, 00:12:49.685 "rw_mbytes_per_sec": 0, 00:12:49.685 "r_mbytes_per_sec": 0, 00:12:49.685 "w_mbytes_per_sec": 0 00:12:49.685 }, 00:12:49.685 "claimed": true, 00:12:49.685 "claim_type": "exclusive_write", 00:12:49.685 "zoned": false, 00:12:49.685 "supported_io_types": { 00:12:49.685 "read": true, 00:12:49.685 "write": true, 00:12:49.685 "unmap": true, 00:12:49.685 "flush": true, 00:12:49.685 "reset": true, 00:12:49.685 "nvme_admin": false, 00:12:49.685 "nvme_io": false, 00:12:49.685 "nvme_io_md": false, 00:12:49.685 "write_zeroes": true, 00:12:49.685 "zcopy": true, 00:12:49.685 "get_zone_info": false, 00:12:49.685 "zone_management": false, 00:12:49.685 "zone_append": false, 00:12:49.685 "compare": false, 00:12:49.685 "compare_and_write": false, 00:12:49.685 "abort": true, 00:12:49.685 "seek_hole": false, 00:12:49.685 "seek_data": false, 00:12:49.685 "copy": true, 00:12:49.685 "nvme_iov_md": false 00:12:49.685 }, 00:12:49.685 "memory_domains": [ 00:12:49.685 { 00:12:49.685 "dma_device_id": "system", 00:12:49.685 "dma_device_type": 1 00:12:49.685 }, 00:12:49.685 { 00:12:49.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:49.686 "dma_device_type": 2 00:12:49.686 } 00:12:49.686 ], 00:12:49.686 "driver_specific": { 00:12:49.686 "passthru": { 00:12:49.686 "name": "pt1", 00:12:49.686 "base_bdev_name": "malloc1" 00:12:49.686 } 00:12:49.686 } 00:12:49.686 }' 00:12:49.686 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:49.686 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:49.686 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:49.686 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:49.686 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:49.686 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:12:49.686 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:49.944 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:49.944 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:49.944 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:49.944 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:49.944 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:49.944 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:49.944 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:49.944 00:07:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:50.202 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:50.202 "name": "pt2", 00:12:50.202 "aliases": [ 00:12:50.202 "00000000-0000-0000-0000-000000000002" 00:12:50.202 ], 00:12:50.202 "product_name": "passthru", 00:12:50.202 "block_size": 512, 00:12:50.202 "num_blocks": 65536, 00:12:50.202 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:50.202 "assigned_rate_limits": { 00:12:50.202 "rw_ios_per_sec": 0, 00:12:50.202 "rw_mbytes_per_sec": 0, 00:12:50.202 "r_mbytes_per_sec": 0, 00:12:50.202 "w_mbytes_per_sec": 0 00:12:50.202 }, 00:12:50.202 "claimed": true, 00:12:50.202 "claim_type": "exclusive_write", 00:12:50.202 "zoned": false, 00:12:50.202 "supported_io_types": { 00:12:50.202 "read": true, 00:12:50.202 "write": true, 00:12:50.202 "unmap": true, 00:12:50.202 "flush": true, 00:12:50.202 "reset": true, 00:12:50.202 "nvme_admin": false, 00:12:50.202 "nvme_io": false, 00:12:50.202 "nvme_io_md": false, 00:12:50.202 "write_zeroes": true, 
00:12:50.202 "zcopy": true, 00:12:50.202 "get_zone_info": false, 00:12:50.202 "zone_management": false, 00:12:50.202 "zone_append": false, 00:12:50.202 "compare": false, 00:12:50.202 "compare_and_write": false, 00:12:50.202 "abort": true, 00:12:50.202 "seek_hole": false, 00:12:50.202 "seek_data": false, 00:12:50.202 "copy": true, 00:12:50.202 "nvme_iov_md": false 00:12:50.202 }, 00:12:50.202 "memory_domains": [ 00:12:50.202 { 00:12:50.202 "dma_device_id": "system", 00:12:50.202 "dma_device_type": 1 00:12:50.202 }, 00:12:50.202 { 00:12:50.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:50.202 "dma_device_type": 2 00:12:50.202 } 00:12:50.202 ], 00:12:50.202 "driver_specific": { 00:12:50.202 "passthru": { 00:12:50.202 "name": "pt2", 00:12:50.202 "base_bdev_name": "malloc2" 00:12:50.202 } 00:12:50.202 } 00:12:50.202 }' 00:12:50.202 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:50.202 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:50.461 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:50.461 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:50.461 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:50.461 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:50.461 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:50.461 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:50.461 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:50.461 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:50.461 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:50.461 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # 
[[ null == null ]] 00:12:50.461 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:50.461 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:50.719 [2024-07-16 00:07:37.627588] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:50.719 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 4f2556bf-8c20-46c9-876f-b00e8aec80fb '!=' 4f2556bf-8c20-46c9-876f-b00e8aec80fb ']' 00:12:50.719 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:12:50.719 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:50.719 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:50.719 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:50.977 [2024-07-16 00:07:37.872010] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:12:50.977 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:50.977 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:50.977 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:50.977 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:50.977 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:50.977 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:50.977 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:12:50.977 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:50.977 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:50.977 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:50.977 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.977 00:07:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:51.235 00:07:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:51.235 "name": "raid_bdev1", 00:12:51.235 "uuid": "4f2556bf-8c20-46c9-876f-b00e8aec80fb", 00:12:51.235 "strip_size_kb": 0, 00:12:51.235 "state": "online", 00:12:51.235 "raid_level": "raid1", 00:12:51.235 "superblock": true, 00:12:51.235 "num_base_bdevs": 2, 00:12:51.235 "num_base_bdevs_discovered": 1, 00:12:51.235 "num_base_bdevs_operational": 1, 00:12:51.235 "base_bdevs_list": [ 00:12:51.235 { 00:12:51.235 "name": null, 00:12:51.235 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:51.235 "is_configured": false, 00:12:51.235 "data_offset": 2048, 00:12:51.235 "data_size": 63488 00:12:51.235 }, 00:12:51.235 { 00:12:51.235 "name": "pt2", 00:12:51.235 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:51.235 "is_configured": true, 00:12:51.235 "data_offset": 2048, 00:12:51.235 "data_size": 63488 00:12:51.235 } 00:12:51.235 ] 00:12:51.235 }' 00:12:51.235 00:07:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:51.235 00:07:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.169 00:07:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 
00:12:52.169 [2024-07-16 00:07:38.986938] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:52.169 [2024-07-16 00:07:38.986963] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:52.169 [2024-07-16 00:07:38.987009] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:52.169 [2024-07-16 00:07:38.987048] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:52.169 [2024-07-16 00:07:38.987059] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10da590 name raid_bdev1, state offline 00:12:52.169 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.169 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:12:52.426 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:12:52.426 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:12:52.426 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:12:52.426 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:12:52.426 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:52.687 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:12:52.687 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:12:52.687 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:12:52.687 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:12:52.687 00:07:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:12:52.687 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:52.944 [2024-07-16 00:07:39.740893] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:52.944 [2024-07-16 00:07:39.740944] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:52.944 [2024-07-16 00:07:39.740962] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf42160 00:12:52.944 [2024-07-16 00:07:39.740974] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:52.944 [2024-07-16 00:07:39.742558] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:52.944 [2024-07-16 00:07:39.742592] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:52.944 [2024-07-16 00:07:39.742656] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:52.944 [2024-07-16 00:07:39.742680] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:52.944 [2024-07-16 00:07:39.742764] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf38380 00:12:52.944 [2024-07-16 00:07:39.742774] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:52.944 [2024-07-16 00:07:39.742956] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf39a80 00:12:52.944 [2024-07-16 00:07:39.743077] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf38380 00:12:52.944 [2024-07-16 00:07:39.743087] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf38380 00:12:52.944 [2024-07-16 00:07:39.743184] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:52.944 pt2 00:12:52.944 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:52.944 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:52.944 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:52.944 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:52.944 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:52.944 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:52.944 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:52.945 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:52.945 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:52.945 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:52.945 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.945 00:07:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:53.203 00:07:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:53.203 "name": "raid_bdev1", 00:12:53.203 "uuid": "4f2556bf-8c20-46c9-876f-b00e8aec80fb", 00:12:53.203 "strip_size_kb": 0, 00:12:53.203 "state": "online", 00:12:53.203 "raid_level": "raid1", 00:12:53.203 "superblock": true, 00:12:53.203 "num_base_bdevs": 2, 00:12:53.203 "num_base_bdevs_discovered": 1, 00:12:53.203 "num_base_bdevs_operational": 1, 00:12:53.203 "base_bdevs_list": [ 
00:12:53.203 { 00:12:53.203 "name": null, 00:12:53.203 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:53.203 "is_configured": false, 00:12:53.203 "data_offset": 2048, 00:12:53.203 "data_size": 63488 00:12:53.203 }, 00:12:53.203 { 00:12:53.203 "name": "pt2", 00:12:53.203 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:53.203 "is_configured": true, 00:12:53.203 "data_offset": 2048, 00:12:53.203 "data_size": 63488 00:12:53.203 } 00:12:53.203 ] 00:12:53.203 }' 00:12:53.203 00:07:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:53.203 00:07:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:53.769 00:07:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:54.027 [2024-07-16 00:07:40.855839] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:54.027 [2024-07-16 00:07:40.855865] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:54.027 [2024-07-16 00:07:40.855919] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:54.027 [2024-07-16 00:07:40.855965] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:54.028 [2024-07-16 00:07:40.855977] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf38380 name raid_bdev1, state offline 00:12:54.028 00:07:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.028 00:07:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:12:54.285 00:07:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:12:54.285 00:07:41 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:12:54.285 00:07:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:12:54.285 00:07:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:54.544 [2024-07-16 00:07:41.361183] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:54.544 [2024-07-16 00:07:41.361227] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:54.544 [2024-07-16 00:07:41.361245] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10e4520 00:12:54.544 [2024-07-16 00:07:41.361257] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:54.544 [2024-07-16 00:07:41.362831] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:54.544 [2024-07-16 00:07:41.362858] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:54.544 [2024-07-16 00:07:41.362919] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:54.544 [2024-07-16 00:07:41.362952] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:54.544 [2024-07-16 00:07:41.363046] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:12:54.544 [2024-07-16 00:07:41.363059] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:54.544 [2024-07-16 00:07:41.363071] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf393f0 name raid_bdev1, state configuring 00:12:54.544 [2024-07-16 00:07:41.363092] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:54.544 [2024-07-16 00:07:41.363148] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf3b2b0 00:12:54.544 [2024-07-16 00:07:41.363158] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:54.544 [2024-07-16 00:07:41.363313] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf38350 00:12:54.544 [2024-07-16 00:07:41.363431] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf3b2b0 00:12:54.544 [2024-07-16 00:07:41.363440] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf3b2b0 00:12:54.544 [2024-07-16 00:07:41.363535] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:54.544 pt1 00:12:54.544 00:07:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:12:54.544 00:07:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:54.544 00:07:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:54.544 00:07:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:54.544 00:07:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:54.544 00:07:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:54.544 00:07:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:54.544 00:07:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:54.544 00:07:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:54.544 00:07:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:54.544 00:07:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:54.544 00:07:41 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.544 00:07:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:54.802 00:07:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:54.802 "name": "raid_bdev1", 00:12:54.802 "uuid": "4f2556bf-8c20-46c9-876f-b00e8aec80fb", 00:12:54.802 "strip_size_kb": 0, 00:12:54.802 "state": "online", 00:12:54.802 "raid_level": "raid1", 00:12:54.802 "superblock": true, 00:12:54.802 "num_base_bdevs": 2, 00:12:54.802 "num_base_bdevs_discovered": 1, 00:12:54.802 "num_base_bdevs_operational": 1, 00:12:54.802 "base_bdevs_list": [ 00:12:54.802 { 00:12:54.802 "name": null, 00:12:54.802 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:54.802 "is_configured": false, 00:12:54.802 "data_offset": 2048, 00:12:54.802 "data_size": 63488 00:12:54.802 }, 00:12:54.802 { 00:12:54.802 "name": "pt2", 00:12:54.802 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:54.802 "is_configured": true, 00:12:54.802 "data_offset": 2048, 00:12:54.802 "data_size": 63488 00:12:54.802 } 00:12:54.802 ] 00:12:54.802 }' 00:12:54.803 00:07:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:54.803 00:07:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:55.369 00:07:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:12:55.369 00:07:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:12:55.627 00:07:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:12:55.627 00:07:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:55.627 00:07:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:12:55.885 [2024-07-16 00:07:42.721107] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:55.885 00:07:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 4f2556bf-8c20-46c9-876f-b00e8aec80fb '!=' 4f2556bf-8c20-46c9-876f-b00e8aec80fb ']' 00:12:55.885 00:07:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 3504611 00:12:55.885 00:07:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 3504611 ']' 00:12:55.885 00:07:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 3504611 00:12:55.885 00:07:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:55.885 00:07:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:55.885 00:07:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3504611 00:12:55.885 00:07:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:55.885 00:07:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:55.885 00:07:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3504611' 00:12:55.885 killing process with pid 3504611 00:12:55.885 00:07:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 3504611 00:12:55.885 [2024-07-16 00:07:42.787823] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:55.885 [2024-07-16 00:07:42.787872] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:55.885 [2024-07-16 00:07:42.787911] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: 
raid bdev base bdevs is 0, going to free all in destruct 00:12:55.885 [2024-07-16 00:07:42.787923] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf3b2b0 name raid_bdev1, state offline 00:12:55.885 00:07:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 3504611 00:12:55.885 [2024-07-16 00:07:42.806779] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:56.144 00:07:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:56.144 00:12:56.144 real 0m16.058s 00:12:56.144 user 0m29.144s 00:12:56.144 sys 0m2.976s 00:12:56.144 00:07:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:56.144 00:07:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:56.144 ************************************ 00:12:56.144 END TEST raid_superblock_test 00:12:56.144 ************************************ 00:12:56.144 00:07:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:56.144 00:07:43 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:12:56.144 00:07:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:56.144 00:07:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:56.144 00:07:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:56.403 ************************************ 00:12:56.403 START TEST raid_read_error_test 00:12:56.403 ************************************ 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:56.403 
00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.cmL0N3ZhoW 00:12:56.403 00:07:43 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3507040 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3507040 /var/tmp/spdk-raid.sock 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 3507040 ']' 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:56.403 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:56.403 00:07:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:56.403 [2024-07-16 00:07:43.191541] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:12:56.403 [2024-07-16 00:07:43.191611] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3507040 ] 00:12:56.403 [2024-07-16 00:07:43.310171] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:56.661 [2024-07-16 00:07:43.417311] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:56.661 [2024-07-16 00:07:43.484725] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:56.661 [2024-07-16 00:07:43.484763] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:57.228 00:07:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:57.228 00:07:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:57.228 00:07:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:57.228 00:07:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:57.486 BaseBdev1_malloc 00:12:57.486 00:07:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:57.745 true 00:12:57.745 00:07:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:58.003 [2024-07-16 00:07:44.856046] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:58.003 [2024-07-16 00:07:44.856093] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:12:58.003 [2024-07-16 00:07:44.856115] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27d10d0 00:12:58.003 [2024-07-16 00:07:44.856128] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:58.003 [2024-07-16 00:07:44.858029] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:58.003 [2024-07-16 00:07:44.858060] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:58.003 BaseBdev1 00:12:58.003 00:07:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:58.003 00:07:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:58.276 BaseBdev2_malloc 00:12:58.276 00:07:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:58.534 true 00:12:58.534 00:07:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:58.794 [2024-07-16 00:07:45.595862] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:58.794 [2024-07-16 00:07:45.595908] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:58.794 [2024-07-16 00:07:45.595934] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27d5910 00:12:58.794 [2024-07-16 00:07:45.595948] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:58.794 [2024-07-16 00:07:45.597559] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:58.794 [2024-07-16 00:07:45.597591] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:58.794 BaseBdev2 00:12:58.794 00:07:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:59.120 [2024-07-16 00:07:45.844542] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:59.120 [2024-07-16 00:07:45.845812] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:59.120 [2024-07-16 00:07:45.846009] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x27d7320 00:12:59.120 [2024-07-16 00:07:45.846022] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:59.120 [2024-07-16 00:07:45.846209] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x263ed00 00:12:59.120 [2024-07-16 00:07:45.846362] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27d7320 00:12:59.120 [2024-07-16 00:07:45.846373] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x27d7320 00:12:59.120 [2024-07-16 00:07:45.846480] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:59.120 00:07:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:59.120 00:07:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:59.120 00:07:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:59.120 00:07:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:59.120 00:07:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:59.120 00:07:45 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:59.120 00:07:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:59.120 00:07:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:59.120 00:07:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:59.120 00:07:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:59.120 00:07:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:59.120 00:07:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:59.378 00:07:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:59.378 "name": "raid_bdev1", 00:12:59.378 "uuid": "f80817c7-6925-4388-9404-4fb7f19ce0f2", 00:12:59.378 "strip_size_kb": 0, 00:12:59.378 "state": "online", 00:12:59.378 "raid_level": "raid1", 00:12:59.378 "superblock": true, 00:12:59.378 "num_base_bdevs": 2, 00:12:59.378 "num_base_bdevs_discovered": 2, 00:12:59.378 "num_base_bdevs_operational": 2, 00:12:59.378 "base_bdevs_list": [ 00:12:59.378 { 00:12:59.378 "name": "BaseBdev1", 00:12:59.378 "uuid": "e051b324-2091-5ac7-996a-4d346e533f29", 00:12:59.378 "is_configured": true, 00:12:59.378 "data_offset": 2048, 00:12:59.378 "data_size": 63488 00:12:59.378 }, 00:12:59.378 { 00:12:59.378 "name": "BaseBdev2", 00:12:59.378 "uuid": "169fc157-4c1e-5594-b30a-391699da5ee5", 00:12:59.378 "is_configured": true, 00:12:59.378 "data_offset": 2048, 00:12:59.378 "data_size": 63488 00:12:59.378 } 00:12:59.378 ] 00:12:59.378 }' 00:12:59.378 00:07:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:59.378 00:07:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:59.942 00:07:46 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:59.942 00:07:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:59.942 [2024-07-16 00:07:46.811404] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27d2c70 00:13:00.889 00:07:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:01.146 00:07:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:01.146 00:07:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:01.146 00:07:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:13:01.146 00:07:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:13:01.146 00:07:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:01.146 00:07:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:01.146 00:07:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:01.146 00:07:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:01.146 00:07:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:01.146 00:07:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:01.146 00:07:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:01.147 00:07:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:01.147 00:07:47 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:01.147 00:07:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:01.147 00:07:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:01.147 00:07:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:01.404 00:07:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:01.404 "name": "raid_bdev1", 00:13:01.404 "uuid": "f80817c7-6925-4388-9404-4fb7f19ce0f2", 00:13:01.404 "strip_size_kb": 0, 00:13:01.404 "state": "online", 00:13:01.404 "raid_level": "raid1", 00:13:01.404 "superblock": true, 00:13:01.404 "num_base_bdevs": 2, 00:13:01.404 "num_base_bdevs_discovered": 2, 00:13:01.404 "num_base_bdevs_operational": 2, 00:13:01.404 "base_bdevs_list": [ 00:13:01.404 { 00:13:01.404 "name": "BaseBdev1", 00:13:01.404 "uuid": "e051b324-2091-5ac7-996a-4d346e533f29", 00:13:01.404 "is_configured": true, 00:13:01.404 "data_offset": 2048, 00:13:01.404 "data_size": 63488 00:13:01.404 }, 00:13:01.404 { 00:13:01.404 "name": "BaseBdev2", 00:13:01.404 "uuid": "169fc157-4c1e-5594-b30a-391699da5ee5", 00:13:01.404 "is_configured": true, 00:13:01.404 "data_offset": 2048, 00:13:01.404 "data_size": 63488 00:13:01.404 } 00:13:01.404 ] 00:13:01.404 }' 00:13:01.404 00:07:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:01.404 00:07:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:01.968 00:07:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:02.226 [2024-07-16 00:07:49.050642] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 
00:13:02.227 [2024-07-16 00:07:49.050679] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:02.227 [2024-07-16 00:07:49.053844] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:02.227 [2024-07-16 00:07:49.053874] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:02.227 [2024-07-16 00:07:49.053963] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:02.227 [2024-07-16 00:07:49.053975] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27d7320 name raid_bdev1, state offline 00:13:02.227 0 00:13:02.227 00:07:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3507040 00:13:02.227 00:07:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 3507040 ']' 00:13:02.227 00:07:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 3507040 00:13:02.227 00:07:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:13:02.227 00:07:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:02.227 00:07:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3507040 00:13:02.227 00:07:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:02.227 00:07:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:02.227 00:07:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3507040' 00:13:02.227 killing process with pid 3507040 00:13:02.227 00:07:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 3507040 00:13:02.227 [2024-07-16 00:07:49.135398] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:02.227 00:07:49 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 3507040 00:13:02.227 [2024-07-16 00:07:49.146324] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:02.486 00:07:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.cmL0N3ZhoW 00:13:02.486 00:07:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:02.486 00:07:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:02.486 00:07:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:13:02.486 00:07:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:13:02.486 00:07:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:02.486 00:07:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:02.486 00:07:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:13:02.486 00:13:02.486 real 0m6.269s 00:13:02.486 user 0m9.775s 00:13:02.486 sys 0m1.129s 00:13:02.486 00:07:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:02.486 00:07:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:02.486 ************************************ 00:13:02.486 END TEST raid_read_error_test 00:13:02.486 ************************************ 00:13:02.486 00:07:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:02.486 00:07:49 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:13:02.486 00:07:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:02.486 00:07:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:02.486 00:07:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:02.745 ************************************ 00:13:02.745 START TEST raid_write_error_test 00:13:02.745 
************************************ 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 write 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.9s8oC9nPNn 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3507972 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3507972 /var/tmp/spdk-raid.sock 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 3507972 ']' 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:02.745 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:02.745 00:07:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:02.745 [2024-07-16 00:07:49.546688] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:13:02.745 [2024-07-16 00:07:49.546769] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3507972 ] 00:13:02.745 [2024-07-16 00:07:49.680291] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:03.004 [2024-07-16 00:07:49.783194] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:03.004 [2024-07-16 00:07:49.855815] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:03.004 [2024-07-16 00:07:49.855857] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:03.571 00:07:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:03.571 00:07:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:03.571 00:07:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:03.571 00:07:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:03.829 BaseBdev1_malloc 00:13:03.829 00:07:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:04.088 true 00:13:04.088 00:07:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:04.347 [2024-07-16 00:07:51.170841] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:04.347 [2024-07-16 00:07:51.170886] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:13:04.347 [2024-07-16 00:07:51.170907] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18cf0d0 00:13:04.347 [2024-07-16 00:07:51.170920] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:04.347 [2024-07-16 00:07:51.172668] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:04.347 [2024-07-16 00:07:51.172700] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:04.347 BaseBdev1 00:13:04.347 00:07:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:04.347 00:07:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:04.606 BaseBdev2_malloc 00:13:04.606 00:07:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:04.865 true 00:13:04.865 00:07:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:05.123 [2024-07-16 00:07:51.921355] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:05.123 [2024-07-16 00:07:51.921401] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:05.123 [2024-07-16 00:07:51.921421] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18d3910 00:13:05.123 [2024-07-16 00:07:51.921439] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:05.123 [2024-07-16 00:07:51.922833] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:05.123 [2024-07-16 00:07:51.922864] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:05.123 BaseBdev2 00:13:05.123 00:07:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:05.382 [2024-07-16 00:07:52.174053] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:05.382 [2024-07-16 00:07:52.175372] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:05.382 [2024-07-16 00:07:52.175563] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18d5320 00:13:05.382 [2024-07-16 00:07:52.175576] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:05.382 [2024-07-16 00:07:52.175771] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x173cd00 00:13:05.382 [2024-07-16 00:07:52.175922] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18d5320 00:13:05.382 [2024-07-16 00:07:52.175943] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18d5320 00:13:05.382 [2024-07-16 00:07:52.176052] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:05.382 00:07:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:05.382 00:07:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:05.382 00:07:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:05.382 00:07:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:05.382 00:07:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:05.382 00:07:52 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:05.382 00:07:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:05.382 00:07:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:05.382 00:07:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:05.382 00:07:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:05.382 00:07:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.382 00:07:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:05.641 00:07:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:05.641 "name": "raid_bdev1", 00:13:05.641 "uuid": "7c218dd8-9de1-45a9-95f2-8d92959401b4", 00:13:05.641 "strip_size_kb": 0, 00:13:05.641 "state": "online", 00:13:05.641 "raid_level": "raid1", 00:13:05.641 "superblock": true, 00:13:05.641 "num_base_bdevs": 2, 00:13:05.641 "num_base_bdevs_discovered": 2, 00:13:05.641 "num_base_bdevs_operational": 2, 00:13:05.641 "base_bdevs_list": [ 00:13:05.641 { 00:13:05.641 "name": "BaseBdev1", 00:13:05.641 "uuid": "826bc446-2dce-518a-b14a-621c90783441", 00:13:05.641 "is_configured": true, 00:13:05.641 "data_offset": 2048, 00:13:05.641 "data_size": 63488 00:13:05.641 }, 00:13:05.641 { 00:13:05.641 "name": "BaseBdev2", 00:13:05.641 "uuid": "79d8026b-f2e6-5755-8388-229df1d88420", 00:13:05.641 "is_configured": true, 00:13:05.641 "data_offset": 2048, 00:13:05.641 "data_size": 63488 00:13:05.641 } 00:13:05.641 ] 00:13:05.641 }' 00:13:05.641 00:07:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:05.641 00:07:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:06.207 
00:07:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:06.207 00:07:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:06.466 [2024-07-16 00:07:53.168965] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18d0c70 00:13:07.402 00:07:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:07.402 [2024-07-16 00:07:54.292114] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:13:07.402 [2024-07-16 00:07:54.292174] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:07.402 [2024-07-16 00:07:54.292354] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x18d0c70 00:13:07.402 00:07:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:07.402 00:07:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:07.402 00:07:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:13:07.402 00:07:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:13:07.402 00:07:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:07.402 00:07:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:07.402 00:07:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:07.402 00:07:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:07.402 00:07:54 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:07.402 00:07:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:07.402 00:07:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:07.402 00:07:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:07.402 00:07:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:07.402 00:07:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:07.402 00:07:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:07.402 00:07:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:07.659 00:07:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:07.659 "name": "raid_bdev1", 00:13:07.659 "uuid": "7c218dd8-9de1-45a9-95f2-8d92959401b4", 00:13:07.659 "strip_size_kb": 0, 00:13:07.659 "state": "online", 00:13:07.659 "raid_level": "raid1", 00:13:07.659 "superblock": true, 00:13:07.659 "num_base_bdevs": 2, 00:13:07.659 "num_base_bdevs_discovered": 1, 00:13:07.659 "num_base_bdevs_operational": 1, 00:13:07.659 "base_bdevs_list": [ 00:13:07.659 { 00:13:07.659 "name": null, 00:13:07.659 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:07.659 "is_configured": false, 00:13:07.659 "data_offset": 2048, 00:13:07.659 "data_size": 63488 00:13:07.659 }, 00:13:07.659 { 00:13:07.659 "name": "BaseBdev2", 00:13:07.659 "uuid": "79d8026b-f2e6-5755-8388-229df1d88420", 00:13:07.659 "is_configured": true, 00:13:07.659 "data_offset": 2048, 00:13:07.659 "data_size": 63488 00:13:07.659 } 00:13:07.659 ] 00:13:07.659 }' 00:13:07.659 00:07:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- 
# xtrace_disable 00:13:07.659 00:07:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:08.593 00:07:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:08.593 [2024-07-16 00:07:55.408491] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:08.593 [2024-07-16 00:07:55.408529] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:08.593 [2024-07-16 00:07:55.411688] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:08.593 [2024-07-16 00:07:55.411714] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:08.593 [2024-07-16 00:07:55.411765] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:08.593 [2024-07-16 00:07:55.411777] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18d5320 name raid_bdev1, state offline 00:13:08.593 0 00:13:08.593 00:07:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3507972 00:13:08.593 00:07:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 3507972 ']' 00:13:08.593 00:07:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 3507972 00:13:08.593 00:07:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:13:08.593 00:07:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:08.593 00:07:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3507972 00:13:08.593 00:07:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:08.593 00:07:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo 
']' 00:13:08.593 00:07:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3507972' 00:13:08.593 killing process with pid 3507972 00:13:08.593 00:07:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 3507972 00:13:08.593 [2024-07-16 00:07:55.492877] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:08.593 00:07:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 3507972 00:13:08.593 [2024-07-16 00:07:55.503610] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:08.852 00:07:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.9s8oC9nPNn 00:13:08.852 00:07:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:08.852 00:07:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:08.852 00:07:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:13:08.852 00:07:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:13:08.852 00:07:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:08.852 00:07:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:08.852 00:07:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:13:08.852 00:13:08.852 real 0m6.278s 00:13:08.852 user 0m9.792s 00:13:08.852 sys 0m1.132s 00:13:08.852 00:07:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:08.852 00:07:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:08.852 ************************************ 00:13:08.852 END TEST raid_write_error_test 00:13:08.852 ************************************ 00:13:08.852 00:07:55 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:08.852 00:07:55 bdev_raid -- 
bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:13:08.852 00:07:55 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:08.852 00:07:55 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:13:08.852 00:07:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:08.852 00:07:55 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:08.852 00:07:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:09.111 ************************************ 00:13:09.111 START TEST raid_state_function_test 00:13:09.111 ************************************ 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:09.111 00:07:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=3508820 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3508820' 00:13:09.111 Process raid pid: 3508820 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 3508820 /var/tmp/spdk-raid.sock 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 3508820 ']' 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:09.111 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:09.111 00:07:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:09.111 [2024-07-16 00:07:55.903091] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:13:09.111 [2024-07-16 00:07:55.903169] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:09.111 [2024-07-16 00:07:56.036220] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:09.370 [2024-07-16 00:07:56.138878] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:09.370 [2024-07-16 00:07:56.204201] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:09.370 [2024-07-16 00:07:56.204240] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:09.938 00:07:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:09.938 00:07:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:13:09.938 00:07:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:10.197 [2024-07-16 00:07:57.059609] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:10.197 [2024-07-16 00:07:57.059656] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:10.197 [2024-07-16 00:07:57.059668] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:10.197 [2024-07-16 00:07:57.059680] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:10.197 [2024-07-16 00:07:57.059689] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:10.197 [2024-07-16 00:07:57.059700] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:10.197 00:07:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:10.197 00:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:10.197 00:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:10.197 00:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:10.197 00:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:10.197 00:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:10.197 00:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:10.197 00:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:10.197 00:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:10.197 00:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:10.197 00:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.197 00:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:10.456 00:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:10.456 "name": "Existed_Raid", 00:13:10.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.456 "strip_size_kb": 64, 00:13:10.456 "state": "configuring", 00:13:10.456 "raid_level": "raid0", 00:13:10.456 "superblock": false, 00:13:10.456 "num_base_bdevs": 3, 00:13:10.456 "num_base_bdevs_discovered": 0, 00:13:10.456 "num_base_bdevs_operational": 3, 00:13:10.456 "base_bdevs_list": [ 00:13:10.456 { 
00:13:10.456 "name": "BaseBdev1", 00:13:10.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.456 "is_configured": false, 00:13:10.456 "data_offset": 0, 00:13:10.456 "data_size": 0 00:13:10.456 }, 00:13:10.456 { 00:13:10.456 "name": "BaseBdev2", 00:13:10.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.456 "is_configured": false, 00:13:10.456 "data_offset": 0, 00:13:10.456 "data_size": 0 00:13:10.456 }, 00:13:10.456 { 00:13:10.456 "name": "BaseBdev3", 00:13:10.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.456 "is_configured": false, 00:13:10.456 "data_offset": 0, 00:13:10.456 "data_size": 0 00:13:10.456 } 00:13:10.456 ] 00:13:10.456 }' 00:13:10.456 00:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:10.456 00:07:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:11.023 00:07:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:11.282 [2024-07-16 00:07:58.190469] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:11.282 [2024-07-16 00:07:58.190497] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2159a80 name Existed_Raid, state configuring 00:13:11.282 00:07:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:11.540 [2024-07-16 00:07:58.439132] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:11.540 [2024-07-16 00:07:58.439159] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:11.540 [2024-07-16 00:07:58.439168] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:13:11.540 [2024-07-16 00:07:58.439187] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:11.540 [2024-07-16 00:07:58.439196] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:11.540 [2024-07-16 00:07:58.439207] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:11.540 00:07:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:11.798 [2024-07-16 00:07:58.693723] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:11.798 BaseBdev1 00:13:11.798 00:07:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:11.798 00:07:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:11.798 00:07:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:11.798 00:07:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:11.798 00:07:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:11.798 00:07:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:11.798 00:07:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:12.055 00:07:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:12.313 [ 00:13:12.313 { 00:13:12.313 "name": "BaseBdev1", 00:13:12.313 "aliases": [ 00:13:12.313 
"3f20d924-2aff-47ae-a93a-0b399e09e959" 00:13:12.313 ], 00:13:12.313 "product_name": "Malloc disk", 00:13:12.313 "block_size": 512, 00:13:12.313 "num_blocks": 65536, 00:13:12.313 "uuid": "3f20d924-2aff-47ae-a93a-0b399e09e959", 00:13:12.313 "assigned_rate_limits": { 00:13:12.313 "rw_ios_per_sec": 0, 00:13:12.313 "rw_mbytes_per_sec": 0, 00:13:12.313 "r_mbytes_per_sec": 0, 00:13:12.313 "w_mbytes_per_sec": 0 00:13:12.313 }, 00:13:12.313 "claimed": true, 00:13:12.313 "claim_type": "exclusive_write", 00:13:12.313 "zoned": false, 00:13:12.313 "supported_io_types": { 00:13:12.313 "read": true, 00:13:12.313 "write": true, 00:13:12.313 "unmap": true, 00:13:12.313 "flush": true, 00:13:12.313 "reset": true, 00:13:12.313 "nvme_admin": false, 00:13:12.313 "nvme_io": false, 00:13:12.313 "nvme_io_md": false, 00:13:12.313 "write_zeroes": true, 00:13:12.313 "zcopy": true, 00:13:12.313 "get_zone_info": false, 00:13:12.313 "zone_management": false, 00:13:12.313 "zone_append": false, 00:13:12.313 "compare": false, 00:13:12.313 "compare_and_write": false, 00:13:12.313 "abort": true, 00:13:12.313 "seek_hole": false, 00:13:12.313 "seek_data": false, 00:13:12.313 "copy": true, 00:13:12.313 "nvme_iov_md": false 00:13:12.313 }, 00:13:12.313 "memory_domains": [ 00:13:12.313 { 00:13:12.313 "dma_device_id": "system", 00:13:12.313 "dma_device_type": 1 00:13:12.313 }, 00:13:12.313 { 00:13:12.313 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:12.313 "dma_device_type": 2 00:13:12.313 } 00:13:12.313 ], 00:13:12.313 "driver_specific": {} 00:13:12.313 } 00:13:12.313 ] 00:13:12.313 00:07:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:12.313 00:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:12.313 00:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:12.313 00:07:59 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:12.313 00:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:12.313 00:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:12.313 00:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:12.314 00:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:12.314 00:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:12.314 00:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:12.314 00:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:12.314 00:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:12.314 00:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:12.572 00:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:12.572 "name": "Existed_Raid", 00:13:12.572 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:12.572 "strip_size_kb": 64, 00:13:12.572 "state": "configuring", 00:13:12.572 "raid_level": "raid0", 00:13:12.572 "superblock": false, 00:13:12.572 "num_base_bdevs": 3, 00:13:12.572 "num_base_bdevs_discovered": 1, 00:13:12.572 "num_base_bdevs_operational": 3, 00:13:12.572 "base_bdevs_list": [ 00:13:12.572 { 00:13:12.572 "name": "BaseBdev1", 00:13:12.572 "uuid": "3f20d924-2aff-47ae-a93a-0b399e09e959", 00:13:12.572 "is_configured": true, 00:13:12.572 "data_offset": 0, 00:13:12.572 "data_size": 65536 00:13:12.572 }, 00:13:12.572 { 00:13:12.572 "name": "BaseBdev2", 00:13:12.572 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:13:12.572 "is_configured": false, 00:13:12.572 "data_offset": 0, 00:13:12.572 "data_size": 0 00:13:12.572 }, 00:13:12.572 { 00:13:12.572 "name": "BaseBdev3", 00:13:12.572 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:12.572 "is_configured": false, 00:13:12.572 "data_offset": 0, 00:13:12.572 "data_size": 0 00:13:12.572 } 00:13:12.572 ] 00:13:12.572 }' 00:13:12.573 00:07:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:12.573 00:07:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:13.139 00:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:13.398 [2024-07-16 00:08:00.297997] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:13.398 [2024-07-16 00:08:00.298040] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2159310 name Existed_Raid, state configuring 00:13:13.398 00:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:13.657 [2024-07-16 00:08:00.558708] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:13.657 [2024-07-16 00:08:00.560174] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:13.657 [2024-07-16 00:08:00.560209] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:13.657 [2024-07-16 00:08:00.560220] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:13.657 [2024-07-16 00:08:00.560232] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:13:13.657 00:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:13.657 00:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:13.657 00:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:13.657 00:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:13.657 00:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:13.657 00:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:13.657 00:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:13.657 00:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:13.657 00:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:13.657 00:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:13.657 00:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:13.657 00:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:13.657 00:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.657 00:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:13.916 00:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:13.916 "name": "Existed_Raid", 00:13:13.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:13.916 "strip_size_kb": 64, 00:13:13.916 "state": "configuring", 00:13:13.916 
"raid_level": "raid0", 00:13:13.916 "superblock": false, 00:13:13.916 "num_base_bdevs": 3, 00:13:13.916 "num_base_bdevs_discovered": 1, 00:13:13.916 "num_base_bdevs_operational": 3, 00:13:13.916 "base_bdevs_list": [ 00:13:13.916 { 00:13:13.916 "name": "BaseBdev1", 00:13:13.916 "uuid": "3f20d924-2aff-47ae-a93a-0b399e09e959", 00:13:13.916 "is_configured": true, 00:13:13.916 "data_offset": 0, 00:13:13.916 "data_size": 65536 00:13:13.916 }, 00:13:13.916 { 00:13:13.916 "name": "BaseBdev2", 00:13:13.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:13.916 "is_configured": false, 00:13:13.916 "data_offset": 0, 00:13:13.916 "data_size": 0 00:13:13.916 }, 00:13:13.916 { 00:13:13.916 "name": "BaseBdev3", 00:13:13.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:13.916 "is_configured": false, 00:13:13.916 "data_offset": 0, 00:13:13.916 "data_size": 0 00:13:13.916 } 00:13:13.916 ] 00:13:13.916 }' 00:13:13.916 00:08:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:13.916 00:08:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:14.481 00:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:14.738 [2024-07-16 00:08:01.588908] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:14.738 BaseBdev2 00:13:14.738 00:08:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:14.738 00:08:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:14.738 00:08:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:14.738 00:08:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:14.738 00:08:01 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:14.738 00:08:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:14.739 00:08:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:14.996 00:08:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:15.254 [ 00:13:15.254 { 00:13:15.254 "name": "BaseBdev2", 00:13:15.254 "aliases": [ 00:13:15.254 "8c35150c-fd0e-4bbe-b580-af1ef1719bb2" 00:13:15.254 ], 00:13:15.254 "product_name": "Malloc disk", 00:13:15.254 "block_size": 512, 00:13:15.254 "num_blocks": 65536, 00:13:15.254 "uuid": "8c35150c-fd0e-4bbe-b580-af1ef1719bb2", 00:13:15.254 "assigned_rate_limits": { 00:13:15.254 "rw_ios_per_sec": 0, 00:13:15.254 "rw_mbytes_per_sec": 0, 00:13:15.254 "r_mbytes_per_sec": 0, 00:13:15.254 "w_mbytes_per_sec": 0 00:13:15.254 }, 00:13:15.254 "claimed": true, 00:13:15.254 "claim_type": "exclusive_write", 00:13:15.254 "zoned": false, 00:13:15.254 "supported_io_types": { 00:13:15.254 "read": true, 00:13:15.254 "write": true, 00:13:15.254 "unmap": true, 00:13:15.254 "flush": true, 00:13:15.254 "reset": true, 00:13:15.254 "nvme_admin": false, 00:13:15.254 "nvme_io": false, 00:13:15.254 "nvme_io_md": false, 00:13:15.254 "write_zeroes": true, 00:13:15.254 "zcopy": true, 00:13:15.254 "get_zone_info": false, 00:13:15.254 "zone_management": false, 00:13:15.254 "zone_append": false, 00:13:15.254 "compare": false, 00:13:15.254 "compare_and_write": false, 00:13:15.254 "abort": true, 00:13:15.254 "seek_hole": false, 00:13:15.254 "seek_data": false, 00:13:15.254 "copy": true, 00:13:15.254 "nvme_iov_md": false 00:13:15.254 }, 00:13:15.254 "memory_domains": [ 00:13:15.254 { 00:13:15.254 "dma_device_id": "system", 
00:13:15.254 "dma_device_type": 1 00:13:15.254 }, 00:13:15.254 { 00:13:15.254 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:15.254 "dma_device_type": 2 00:13:15.254 } 00:13:15.254 ], 00:13:15.254 "driver_specific": {} 00:13:15.254 } 00:13:15.254 ] 00:13:15.254 00:08:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:15.254 00:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:15.254 00:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:15.254 00:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:15.254 00:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:15.254 00:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:15.254 00:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:15.254 00:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:15.254 00:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:15.254 00:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:15.254 00:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:15.254 00:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:15.254 00:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:15.254 00:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:15.254 00:08:02 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:15.512 00:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:15.512 "name": "Existed_Raid", 00:13:15.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:15.512 "strip_size_kb": 64, 00:13:15.512 "state": "configuring", 00:13:15.512 "raid_level": "raid0", 00:13:15.512 "superblock": false, 00:13:15.512 "num_base_bdevs": 3, 00:13:15.512 "num_base_bdevs_discovered": 2, 00:13:15.512 "num_base_bdevs_operational": 3, 00:13:15.512 "base_bdevs_list": [ 00:13:15.512 { 00:13:15.512 "name": "BaseBdev1", 00:13:15.512 "uuid": "3f20d924-2aff-47ae-a93a-0b399e09e959", 00:13:15.512 "is_configured": true, 00:13:15.512 "data_offset": 0, 00:13:15.512 "data_size": 65536 00:13:15.512 }, 00:13:15.512 { 00:13:15.512 "name": "BaseBdev2", 00:13:15.512 "uuid": "8c35150c-fd0e-4bbe-b580-af1ef1719bb2", 00:13:15.512 "is_configured": true, 00:13:15.512 "data_offset": 0, 00:13:15.512 "data_size": 65536 00:13:15.512 }, 00:13:15.512 { 00:13:15.512 "name": "BaseBdev3", 00:13:15.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:15.512 "is_configured": false, 00:13:15.512 "data_offset": 0, 00:13:15.512 "data_size": 0 00:13:15.512 } 00:13:15.512 ] 00:13:15.512 }' 00:13:15.512 00:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:15.512 00:08:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:16.076 00:08:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:16.333 [2024-07-16 00:08:03.205914] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:16.333 [2024-07-16 00:08:03.205967] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x215a400 00:13:16.333 [2024-07-16 00:08:03.205976] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:16.333 [2024-07-16 00:08:03.206239] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2159ef0 00:13:16.333 [2024-07-16 00:08:03.206359] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x215a400 00:13:16.333 [2024-07-16 00:08:03.206369] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x215a400 00:13:16.333 [2024-07-16 00:08:03.206542] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:16.333 BaseBdev3 00:13:16.333 00:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:16.333 00:08:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:16.333 00:08:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:16.333 00:08:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:16.333 00:08:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:16.333 00:08:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:16.333 00:08:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:16.965 00:08:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:17.224 [ 00:13:17.224 { 00:13:17.224 "name": "BaseBdev3", 00:13:17.224 "aliases": [ 00:13:17.224 "ad89f73d-1267-4f6c-85eb-43e67e4286e6" 00:13:17.224 ], 00:13:17.224 "product_name": "Malloc disk", 00:13:17.224 "block_size": 512, 00:13:17.224 "num_blocks": 65536, 00:13:17.224 
"uuid": "ad89f73d-1267-4f6c-85eb-43e67e4286e6", 00:13:17.224 "assigned_rate_limits": { 00:13:17.224 "rw_ios_per_sec": 0, 00:13:17.224 "rw_mbytes_per_sec": 0, 00:13:17.224 "r_mbytes_per_sec": 0, 00:13:17.224 "w_mbytes_per_sec": 0 00:13:17.224 }, 00:13:17.224 "claimed": true, 00:13:17.224 "claim_type": "exclusive_write", 00:13:17.224 "zoned": false, 00:13:17.224 "supported_io_types": { 00:13:17.224 "read": true, 00:13:17.224 "write": true, 00:13:17.224 "unmap": true, 00:13:17.224 "flush": true, 00:13:17.224 "reset": true, 00:13:17.224 "nvme_admin": false, 00:13:17.224 "nvme_io": false, 00:13:17.224 "nvme_io_md": false, 00:13:17.224 "write_zeroes": true, 00:13:17.224 "zcopy": true, 00:13:17.224 "get_zone_info": false, 00:13:17.224 "zone_management": false, 00:13:17.224 "zone_append": false, 00:13:17.224 "compare": false, 00:13:17.224 "compare_and_write": false, 00:13:17.224 "abort": true, 00:13:17.224 "seek_hole": false, 00:13:17.224 "seek_data": false, 00:13:17.224 "copy": true, 00:13:17.224 "nvme_iov_md": false 00:13:17.224 }, 00:13:17.224 "memory_domains": [ 00:13:17.224 { 00:13:17.224 "dma_device_id": "system", 00:13:17.224 "dma_device_type": 1 00:13:17.224 }, 00:13:17.224 { 00:13:17.224 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.224 "dma_device_type": 2 00:13:17.224 } 00:13:17.224 ], 00:13:17.224 "driver_specific": {} 00:13:17.224 } 00:13:17.224 ] 00:13:17.224 00:08:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:17.224 00:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:17.224 00:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:17.224 00:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:17.224 00:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:17.224 00:08:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:17.224 00:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:17.224 00:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:17.224 00:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:17.224 00:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:17.224 00:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:17.224 00:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:17.224 00:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:17.224 00:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:17.224 00:08:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:17.482 00:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:17.482 "name": "Existed_Raid", 00:13:17.482 "uuid": "3378d08b-15ca-495a-aedd-68ea566ba9db", 00:13:17.482 "strip_size_kb": 64, 00:13:17.482 "state": "online", 00:13:17.482 "raid_level": "raid0", 00:13:17.482 "superblock": false, 00:13:17.482 "num_base_bdevs": 3, 00:13:17.482 "num_base_bdevs_discovered": 3, 00:13:17.482 "num_base_bdevs_operational": 3, 00:13:17.482 "base_bdevs_list": [ 00:13:17.482 { 00:13:17.482 "name": "BaseBdev1", 00:13:17.482 "uuid": "3f20d924-2aff-47ae-a93a-0b399e09e959", 00:13:17.482 "is_configured": true, 00:13:17.482 "data_offset": 0, 00:13:17.482 "data_size": 65536 00:13:17.482 }, 00:13:17.482 { 00:13:17.482 "name": "BaseBdev2", 00:13:17.482 "uuid": 
"8c35150c-fd0e-4bbe-b580-af1ef1719bb2", 00:13:17.482 "is_configured": true, 00:13:17.482 "data_offset": 0, 00:13:17.482 "data_size": 65536 00:13:17.482 }, 00:13:17.482 { 00:13:17.482 "name": "BaseBdev3", 00:13:17.482 "uuid": "ad89f73d-1267-4f6c-85eb-43e67e4286e6", 00:13:17.482 "is_configured": true, 00:13:17.482 "data_offset": 0, 00:13:17.482 "data_size": 65536 00:13:17.482 } 00:13:17.482 ] 00:13:17.482 }' 00:13:17.482 00:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:17.482 00:08:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:18.048 00:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:18.048 00:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:18.048 00:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:18.048 00:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:18.048 00:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:18.048 00:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:18.048 00:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:18.048 00:08:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:18.306 [2024-07-16 00:08:05.047115] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:18.306 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:18.306 "name": "Existed_Raid", 00:13:18.306 "aliases": [ 00:13:18.306 "3378d08b-15ca-495a-aedd-68ea566ba9db" 00:13:18.306 ], 00:13:18.306 "product_name": "Raid Volume", 
00:13:18.306 "block_size": 512, 00:13:18.306 "num_blocks": 196608, 00:13:18.306 "uuid": "3378d08b-15ca-495a-aedd-68ea566ba9db", 00:13:18.306 "assigned_rate_limits": { 00:13:18.306 "rw_ios_per_sec": 0, 00:13:18.306 "rw_mbytes_per_sec": 0, 00:13:18.306 "r_mbytes_per_sec": 0, 00:13:18.307 "w_mbytes_per_sec": 0 00:13:18.307 }, 00:13:18.307 "claimed": false, 00:13:18.307 "zoned": false, 00:13:18.307 "supported_io_types": { 00:13:18.307 "read": true, 00:13:18.307 "write": true, 00:13:18.307 "unmap": true, 00:13:18.307 "flush": true, 00:13:18.307 "reset": true, 00:13:18.307 "nvme_admin": false, 00:13:18.307 "nvme_io": false, 00:13:18.307 "nvme_io_md": false, 00:13:18.307 "write_zeroes": true, 00:13:18.307 "zcopy": false, 00:13:18.307 "get_zone_info": false, 00:13:18.307 "zone_management": false, 00:13:18.307 "zone_append": false, 00:13:18.307 "compare": false, 00:13:18.307 "compare_and_write": false, 00:13:18.307 "abort": false, 00:13:18.307 "seek_hole": false, 00:13:18.307 "seek_data": false, 00:13:18.307 "copy": false, 00:13:18.307 "nvme_iov_md": false 00:13:18.307 }, 00:13:18.307 "memory_domains": [ 00:13:18.307 { 00:13:18.307 "dma_device_id": "system", 00:13:18.307 "dma_device_type": 1 00:13:18.307 }, 00:13:18.307 { 00:13:18.307 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:18.307 "dma_device_type": 2 00:13:18.307 }, 00:13:18.307 { 00:13:18.307 "dma_device_id": "system", 00:13:18.307 "dma_device_type": 1 00:13:18.307 }, 00:13:18.307 { 00:13:18.307 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:18.307 "dma_device_type": 2 00:13:18.307 }, 00:13:18.307 { 00:13:18.307 "dma_device_id": "system", 00:13:18.307 "dma_device_type": 1 00:13:18.307 }, 00:13:18.307 { 00:13:18.307 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:18.307 "dma_device_type": 2 00:13:18.307 } 00:13:18.307 ], 00:13:18.307 "driver_specific": { 00:13:18.307 "raid": { 00:13:18.307 "uuid": "3378d08b-15ca-495a-aedd-68ea566ba9db", 00:13:18.307 "strip_size_kb": 64, 00:13:18.307 "state": "online", 00:13:18.307 
"raid_level": "raid0", 00:13:18.307 "superblock": false, 00:13:18.307 "num_base_bdevs": 3, 00:13:18.307 "num_base_bdevs_discovered": 3, 00:13:18.307 "num_base_bdevs_operational": 3, 00:13:18.307 "base_bdevs_list": [ 00:13:18.307 { 00:13:18.307 "name": "BaseBdev1", 00:13:18.307 "uuid": "3f20d924-2aff-47ae-a93a-0b399e09e959", 00:13:18.307 "is_configured": true, 00:13:18.307 "data_offset": 0, 00:13:18.307 "data_size": 65536 00:13:18.307 }, 00:13:18.307 { 00:13:18.307 "name": "BaseBdev2", 00:13:18.307 "uuid": "8c35150c-fd0e-4bbe-b580-af1ef1719bb2", 00:13:18.307 "is_configured": true, 00:13:18.307 "data_offset": 0, 00:13:18.307 "data_size": 65536 00:13:18.307 }, 00:13:18.307 { 00:13:18.307 "name": "BaseBdev3", 00:13:18.307 "uuid": "ad89f73d-1267-4f6c-85eb-43e67e4286e6", 00:13:18.307 "is_configured": true, 00:13:18.307 "data_offset": 0, 00:13:18.307 "data_size": 65536 00:13:18.307 } 00:13:18.307 ] 00:13:18.307 } 00:13:18.307 } 00:13:18.307 }' 00:13:18.307 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:18.307 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:18.307 BaseBdev2 00:13:18.307 BaseBdev3' 00:13:18.307 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:18.307 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:18.307 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:18.566 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:18.566 "name": "BaseBdev1", 00:13:18.566 "aliases": [ 00:13:18.566 "3f20d924-2aff-47ae-a93a-0b399e09e959" 00:13:18.566 ], 00:13:18.566 "product_name": "Malloc disk", 00:13:18.566 
"block_size": 512, 00:13:18.566 "num_blocks": 65536, 00:13:18.566 "uuid": "3f20d924-2aff-47ae-a93a-0b399e09e959", 00:13:18.566 "assigned_rate_limits": { 00:13:18.566 "rw_ios_per_sec": 0, 00:13:18.566 "rw_mbytes_per_sec": 0, 00:13:18.566 "r_mbytes_per_sec": 0, 00:13:18.566 "w_mbytes_per_sec": 0 00:13:18.566 }, 00:13:18.566 "claimed": true, 00:13:18.566 "claim_type": "exclusive_write", 00:13:18.566 "zoned": false, 00:13:18.566 "supported_io_types": { 00:13:18.566 "read": true, 00:13:18.566 "write": true, 00:13:18.566 "unmap": true, 00:13:18.566 "flush": true, 00:13:18.566 "reset": true, 00:13:18.566 "nvme_admin": false, 00:13:18.566 "nvme_io": false, 00:13:18.566 "nvme_io_md": false, 00:13:18.566 "write_zeroes": true, 00:13:18.566 "zcopy": true, 00:13:18.566 "get_zone_info": false, 00:13:18.566 "zone_management": false, 00:13:18.566 "zone_append": false, 00:13:18.566 "compare": false, 00:13:18.566 "compare_and_write": false, 00:13:18.566 "abort": true, 00:13:18.566 "seek_hole": false, 00:13:18.566 "seek_data": false, 00:13:18.566 "copy": true, 00:13:18.566 "nvme_iov_md": false 00:13:18.566 }, 00:13:18.566 "memory_domains": [ 00:13:18.566 { 00:13:18.566 "dma_device_id": "system", 00:13:18.566 "dma_device_type": 1 00:13:18.566 }, 00:13:18.566 { 00:13:18.566 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:18.566 "dma_device_type": 2 00:13:18.566 } 00:13:18.566 ], 00:13:18.566 "driver_specific": {} 00:13:18.566 }' 00:13:18.566 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:18.566 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:18.566 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:18.566 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:18.825 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:18.825 00:08:05 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:18.825 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.825 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.825 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:18.825 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:18.825 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:18.825 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:18.825 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:18.825 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:18.825 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:19.083 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:19.083 "name": "BaseBdev2", 00:13:19.083 "aliases": [ 00:13:19.083 "8c35150c-fd0e-4bbe-b580-af1ef1719bb2" 00:13:19.083 ], 00:13:19.083 "product_name": "Malloc disk", 00:13:19.083 "block_size": 512, 00:13:19.083 "num_blocks": 65536, 00:13:19.083 "uuid": "8c35150c-fd0e-4bbe-b580-af1ef1719bb2", 00:13:19.083 "assigned_rate_limits": { 00:13:19.083 "rw_ios_per_sec": 0, 00:13:19.083 "rw_mbytes_per_sec": 0, 00:13:19.083 "r_mbytes_per_sec": 0, 00:13:19.083 "w_mbytes_per_sec": 0 00:13:19.083 }, 00:13:19.083 "claimed": true, 00:13:19.083 "claim_type": "exclusive_write", 00:13:19.083 "zoned": false, 00:13:19.083 "supported_io_types": { 00:13:19.083 "read": true, 00:13:19.083 "write": true, 00:13:19.083 "unmap": true, 00:13:19.083 "flush": true, 00:13:19.083 "reset": true, 00:13:19.083 "nvme_admin": 
false, 00:13:19.083 "nvme_io": false, 00:13:19.083 "nvme_io_md": false, 00:13:19.083 "write_zeroes": true, 00:13:19.083 "zcopy": true, 00:13:19.083 "get_zone_info": false, 00:13:19.083 "zone_management": false, 00:13:19.083 "zone_append": false, 00:13:19.083 "compare": false, 00:13:19.083 "compare_and_write": false, 00:13:19.083 "abort": true, 00:13:19.083 "seek_hole": false, 00:13:19.083 "seek_data": false, 00:13:19.083 "copy": true, 00:13:19.083 "nvme_iov_md": false 00:13:19.083 }, 00:13:19.083 "memory_domains": [ 00:13:19.083 { 00:13:19.083 "dma_device_id": "system", 00:13:19.083 "dma_device_type": 1 00:13:19.083 }, 00:13:19.083 { 00:13:19.083 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:19.084 "dma_device_type": 2 00:13:19.084 } 00:13:19.084 ], 00:13:19.084 "driver_specific": {} 00:13:19.084 }' 00:13:19.084 00:08:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:19.084 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:19.342 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:19.342 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:19.342 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:19.342 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:19.342 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:19.342 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:19.342 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:19.342 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:19.342 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:19.600 00:08:06 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:19.600 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:19.600 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:19.600 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:19.859 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:19.859 "name": "BaseBdev3", 00:13:19.859 "aliases": [ 00:13:19.859 "ad89f73d-1267-4f6c-85eb-43e67e4286e6" 00:13:19.859 ], 00:13:19.859 "product_name": "Malloc disk", 00:13:19.859 "block_size": 512, 00:13:19.859 "num_blocks": 65536, 00:13:19.859 "uuid": "ad89f73d-1267-4f6c-85eb-43e67e4286e6", 00:13:19.859 "assigned_rate_limits": { 00:13:19.859 "rw_ios_per_sec": 0, 00:13:19.859 "rw_mbytes_per_sec": 0, 00:13:19.859 "r_mbytes_per_sec": 0, 00:13:19.859 "w_mbytes_per_sec": 0 00:13:19.859 }, 00:13:19.859 "claimed": true, 00:13:19.859 "claim_type": "exclusive_write", 00:13:19.859 "zoned": false, 00:13:19.859 "supported_io_types": { 00:13:19.859 "read": true, 00:13:19.859 "write": true, 00:13:19.859 "unmap": true, 00:13:19.859 "flush": true, 00:13:19.859 "reset": true, 00:13:19.859 "nvme_admin": false, 00:13:19.859 "nvme_io": false, 00:13:19.859 "nvme_io_md": false, 00:13:19.859 "write_zeroes": true, 00:13:19.859 "zcopy": true, 00:13:19.859 "get_zone_info": false, 00:13:19.859 "zone_management": false, 00:13:19.859 "zone_append": false, 00:13:19.859 "compare": false, 00:13:19.859 "compare_and_write": false, 00:13:19.859 "abort": true, 00:13:19.859 "seek_hole": false, 00:13:19.859 "seek_data": false, 00:13:19.859 "copy": true, 00:13:19.859 "nvme_iov_md": false 00:13:19.859 }, 00:13:19.859 "memory_domains": [ 00:13:19.859 { 00:13:19.859 "dma_device_id": "system", 00:13:19.859 "dma_device_type": 1 00:13:19.859 
}, 00:13:19.859 { 00:13:19.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:19.859 "dma_device_type": 2 00:13:19.859 } 00:13:19.859 ], 00:13:19.859 "driver_specific": {} 00:13:19.859 }' 00:13:19.859 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:19.859 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:19.859 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:19.859 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:19.859 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:19.859 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:19.859 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:20.118 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:20.118 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:20.118 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:20.118 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:20.118 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:20.118 00:08:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:20.685 [2024-07-16 00:08:07.421357] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:20.685 [2024-07-16 00:08:07.421389] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:20.685 [2024-07-16 00:08:07.421438] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:20.685 
00:08:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:20.685 00:08:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:20.685 00:08:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:20.685 00:08:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:20.685 00:08:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:20.685 00:08:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:13:20.685 00:08:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:20.685 00:08:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:20.685 00:08:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:20.685 00:08:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:20.685 00:08:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:20.685 00:08:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:20.685 00:08:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:20.685 00:08:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:20.685 00:08:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:20.685 00:08:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.685 00:08:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:13:20.944 00:08:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:20.944 "name": "Existed_Raid", 00:13:20.944 "uuid": "3378d08b-15ca-495a-aedd-68ea566ba9db", 00:13:20.944 "strip_size_kb": 64, 00:13:20.944 "state": "offline", 00:13:20.944 "raid_level": "raid0", 00:13:20.944 "superblock": false, 00:13:20.944 "num_base_bdevs": 3, 00:13:20.944 "num_base_bdevs_discovered": 2, 00:13:20.944 "num_base_bdevs_operational": 2, 00:13:20.944 "base_bdevs_list": [ 00:13:20.944 { 00:13:20.944 "name": null, 00:13:20.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:20.944 "is_configured": false, 00:13:20.944 "data_offset": 0, 00:13:20.944 "data_size": 65536 00:13:20.944 }, 00:13:20.944 { 00:13:20.944 "name": "BaseBdev2", 00:13:20.944 "uuid": "8c35150c-fd0e-4bbe-b580-af1ef1719bb2", 00:13:20.944 "is_configured": true, 00:13:20.944 "data_offset": 0, 00:13:20.944 "data_size": 65536 00:13:20.944 }, 00:13:20.944 { 00:13:20.944 "name": "BaseBdev3", 00:13:20.944 "uuid": "ad89f73d-1267-4f6c-85eb-43e67e4286e6", 00:13:20.944 "is_configured": true, 00:13:20.944 "data_offset": 0, 00:13:20.944 "data_size": 65536 00:13:20.944 } 00:13:20.944 ] 00:13:20.944 }' 00:13:20.944 00:08:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:20.944 00:08:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:21.512 00:08:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:21.512 00:08:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:21.512 00:08:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:21.512 00:08:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:21.771 00:08:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:21.771 00:08:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:21.771 00:08:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:22.029 [2024-07-16 00:08:08.810968] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:22.029 00:08:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:22.029 00:08:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:22.030 00:08:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.030 00:08:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:22.288 00:08:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:22.288 00:08:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:22.288 00:08:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:22.547 [2024-07-16 00:08:09.326699] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:22.547 [2024-07-16 00:08:09.326744] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x215a400 name Existed_Raid, state offline 00:13:22.547 00:08:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:22.547 00:08:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:22.547 00:08:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.547 00:08:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:22.806 00:08:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:22.806 00:08:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:22.806 00:08:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:22.806 00:08:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:22.806 00:08:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:22.806 00:08:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:23.064 BaseBdev2 00:13:23.064 00:08:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:23.064 00:08:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:23.064 00:08:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:23.064 00:08:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:23.064 00:08:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:23.064 00:08:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:23.064 00:08:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:23.323 00:08:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:23.582 [ 00:13:23.582 { 00:13:23.582 "name": "BaseBdev2", 00:13:23.582 "aliases": [ 00:13:23.582 "eaaede6b-e793-4b6a-9d8e-692516c73db0" 00:13:23.582 ], 00:13:23.582 "product_name": "Malloc disk", 00:13:23.582 "block_size": 512, 00:13:23.582 "num_blocks": 65536, 00:13:23.582 "uuid": "eaaede6b-e793-4b6a-9d8e-692516c73db0", 00:13:23.582 "assigned_rate_limits": { 00:13:23.582 "rw_ios_per_sec": 0, 00:13:23.582 "rw_mbytes_per_sec": 0, 00:13:23.582 "r_mbytes_per_sec": 0, 00:13:23.582 "w_mbytes_per_sec": 0 00:13:23.582 }, 00:13:23.582 "claimed": false, 00:13:23.582 "zoned": false, 00:13:23.582 "supported_io_types": { 00:13:23.582 "read": true, 00:13:23.582 "write": true, 00:13:23.582 "unmap": true, 00:13:23.582 "flush": true, 00:13:23.582 "reset": true, 00:13:23.582 "nvme_admin": false, 00:13:23.582 "nvme_io": false, 00:13:23.582 "nvme_io_md": false, 00:13:23.582 "write_zeroes": true, 00:13:23.582 "zcopy": true, 00:13:23.582 "get_zone_info": false, 00:13:23.582 "zone_management": false, 00:13:23.582 "zone_append": false, 00:13:23.582 "compare": false, 00:13:23.582 "compare_and_write": false, 00:13:23.582 "abort": true, 00:13:23.582 "seek_hole": false, 00:13:23.582 "seek_data": false, 00:13:23.582 "copy": true, 00:13:23.582 "nvme_iov_md": false 00:13:23.582 }, 00:13:23.582 "memory_domains": [ 00:13:23.582 { 00:13:23.582 "dma_device_id": "system", 00:13:23.582 "dma_device_type": 1 00:13:23.582 }, 00:13:23.582 { 00:13:23.582 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:23.582 "dma_device_type": 2 00:13:23.582 } 00:13:23.582 ], 00:13:23.582 "driver_specific": {} 00:13:23.582 } 00:13:23.582 ] 00:13:23.582 00:08:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:23.582 00:08:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:23.582 00:08:10 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:23.582 00:08:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:24.149 BaseBdev3 00:13:24.149 00:08:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:24.149 00:08:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:24.149 00:08:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:24.149 00:08:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:24.149 00:08:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:24.149 00:08:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:24.149 00:08:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:24.407 00:08:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:24.407 [ 00:13:24.407 { 00:13:24.407 "name": "BaseBdev3", 00:13:24.407 "aliases": [ 00:13:24.407 "85a604c9-da87-4dd7-8b05-4cf8f276044f" 00:13:24.407 ], 00:13:24.407 "product_name": "Malloc disk", 00:13:24.407 "block_size": 512, 00:13:24.407 "num_blocks": 65536, 00:13:24.407 "uuid": "85a604c9-da87-4dd7-8b05-4cf8f276044f", 00:13:24.407 "assigned_rate_limits": { 00:13:24.407 "rw_ios_per_sec": 0, 00:13:24.407 "rw_mbytes_per_sec": 0, 00:13:24.407 "r_mbytes_per_sec": 0, 00:13:24.407 "w_mbytes_per_sec": 0 00:13:24.407 }, 00:13:24.407 "claimed": false, 00:13:24.407 "zoned": false, 00:13:24.407 
"supported_io_types": { 00:13:24.407 "read": true, 00:13:24.407 "write": true, 00:13:24.407 "unmap": true, 00:13:24.407 "flush": true, 00:13:24.407 "reset": true, 00:13:24.407 "nvme_admin": false, 00:13:24.407 "nvme_io": false, 00:13:24.407 "nvme_io_md": false, 00:13:24.407 "write_zeroes": true, 00:13:24.407 "zcopy": true, 00:13:24.407 "get_zone_info": false, 00:13:24.407 "zone_management": false, 00:13:24.407 "zone_append": false, 00:13:24.407 "compare": false, 00:13:24.407 "compare_and_write": false, 00:13:24.407 "abort": true, 00:13:24.407 "seek_hole": false, 00:13:24.407 "seek_data": false, 00:13:24.407 "copy": true, 00:13:24.407 "nvme_iov_md": false 00:13:24.407 }, 00:13:24.407 "memory_domains": [ 00:13:24.407 { 00:13:24.407 "dma_device_id": "system", 00:13:24.407 "dma_device_type": 1 00:13:24.407 }, 00:13:24.407 { 00:13:24.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:24.407 "dma_device_type": 2 00:13:24.407 } 00:13:24.407 ], 00:13:24.407 "driver_specific": {} 00:13:24.407 } 00:13:24.407 ] 00:13:24.407 00:08:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:24.407 00:08:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:24.407 00:08:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:24.407 00:08:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:24.667 [2024-07-16 00:08:11.566993] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:24.667 [2024-07-16 00:08:11.567038] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:24.667 [2024-07-16 00:08:11.567056] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:24.667 
[2024-07-16 00:08:11.568379] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:24.667 00:08:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:24.667 00:08:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:24.667 00:08:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:24.667 00:08:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:24.667 00:08:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:24.667 00:08:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:24.667 00:08:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:24.667 00:08:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:24.667 00:08:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:24.667 00:08:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:24.667 00:08:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.667 00:08:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:24.926 00:08:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:24.926 "name": "Existed_Raid", 00:13:24.927 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:24.927 "strip_size_kb": 64, 00:13:24.927 "state": "configuring", 00:13:24.927 "raid_level": "raid0", 00:13:24.927 "superblock": false, 00:13:24.927 "num_base_bdevs": 3, 00:13:24.927 
"num_base_bdevs_discovered": 2, 00:13:24.927 "num_base_bdevs_operational": 3, 00:13:24.927 "base_bdevs_list": [ 00:13:24.927 { 00:13:24.927 "name": "BaseBdev1", 00:13:24.927 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:24.927 "is_configured": false, 00:13:24.927 "data_offset": 0, 00:13:24.927 "data_size": 0 00:13:24.927 }, 00:13:24.927 { 00:13:24.927 "name": "BaseBdev2", 00:13:24.927 "uuid": "eaaede6b-e793-4b6a-9d8e-692516c73db0", 00:13:24.927 "is_configured": true, 00:13:24.927 "data_offset": 0, 00:13:24.927 "data_size": 65536 00:13:24.927 }, 00:13:24.927 { 00:13:24.927 "name": "BaseBdev3", 00:13:24.927 "uuid": "85a604c9-da87-4dd7-8b05-4cf8f276044f", 00:13:24.927 "is_configured": true, 00:13:24.927 "data_offset": 0, 00:13:24.927 "data_size": 65536 00:13:24.927 } 00:13:24.927 ] 00:13:24.927 }' 00:13:24.927 00:08:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:24.927 00:08:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:25.493 00:08:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:25.751 [2024-07-16 00:08:12.626019] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:25.751 00:08:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:25.751 00:08:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:25.751 00:08:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:25.751 00:08:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:25.751 00:08:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:25.751 00:08:12 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:25.751 00:08:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:25.751 00:08:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:25.751 00:08:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:25.751 00:08:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:25.751 00:08:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.751 00:08:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:26.009 00:08:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:26.009 "name": "Existed_Raid", 00:13:26.009 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:26.009 "strip_size_kb": 64, 00:13:26.009 "state": "configuring", 00:13:26.009 "raid_level": "raid0", 00:13:26.009 "superblock": false, 00:13:26.009 "num_base_bdevs": 3, 00:13:26.009 "num_base_bdevs_discovered": 1, 00:13:26.009 "num_base_bdevs_operational": 3, 00:13:26.009 "base_bdevs_list": [ 00:13:26.009 { 00:13:26.009 "name": "BaseBdev1", 00:13:26.009 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:26.009 "is_configured": false, 00:13:26.009 "data_offset": 0, 00:13:26.009 "data_size": 0 00:13:26.009 }, 00:13:26.009 { 00:13:26.009 "name": null, 00:13:26.009 "uuid": "eaaede6b-e793-4b6a-9d8e-692516c73db0", 00:13:26.009 "is_configured": false, 00:13:26.009 "data_offset": 0, 00:13:26.009 "data_size": 65536 00:13:26.009 }, 00:13:26.009 { 00:13:26.009 "name": "BaseBdev3", 00:13:26.009 "uuid": "85a604c9-da87-4dd7-8b05-4cf8f276044f", 00:13:26.009 "is_configured": true, 00:13:26.009 "data_offset": 0, 00:13:26.009 "data_size": 65536 00:13:26.009 } 
00:13:26.009 ] 00:13:26.009 }' 00:13:26.009 00:08:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:26.009 00:08:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:26.574 00:08:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.574 00:08:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:26.832 00:08:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:26.832 00:08:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:27.090 [2024-07-16 00:08:13.974214] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:27.091 BaseBdev1 00:13:27.091 00:08:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:27.091 00:08:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:27.091 00:08:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:27.091 00:08:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:27.091 00:08:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:27.091 00:08:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:27.091 00:08:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:27.349 00:08:14 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:27.607 [ 00:13:27.607 { 00:13:27.607 "name": "BaseBdev1", 00:13:27.607 "aliases": [ 00:13:27.607 "6a0af206-176a-45f5-89bc-d3b2e9a53223" 00:13:27.607 ], 00:13:27.607 "product_name": "Malloc disk", 00:13:27.607 "block_size": 512, 00:13:27.607 "num_blocks": 65536, 00:13:27.607 "uuid": "6a0af206-176a-45f5-89bc-d3b2e9a53223", 00:13:27.607 "assigned_rate_limits": { 00:13:27.607 "rw_ios_per_sec": 0, 00:13:27.607 "rw_mbytes_per_sec": 0, 00:13:27.607 "r_mbytes_per_sec": 0, 00:13:27.607 "w_mbytes_per_sec": 0 00:13:27.607 }, 00:13:27.607 "claimed": true, 00:13:27.607 "claim_type": "exclusive_write", 00:13:27.607 "zoned": false, 00:13:27.607 "supported_io_types": { 00:13:27.607 "read": true, 00:13:27.607 "write": true, 00:13:27.607 "unmap": true, 00:13:27.607 "flush": true, 00:13:27.607 "reset": true, 00:13:27.607 "nvme_admin": false, 00:13:27.607 "nvme_io": false, 00:13:27.607 "nvme_io_md": false, 00:13:27.607 "write_zeroes": true, 00:13:27.607 "zcopy": true, 00:13:27.607 "get_zone_info": false, 00:13:27.607 "zone_management": false, 00:13:27.607 "zone_append": false, 00:13:27.607 "compare": false, 00:13:27.607 "compare_and_write": false, 00:13:27.607 "abort": true, 00:13:27.607 "seek_hole": false, 00:13:27.607 "seek_data": false, 00:13:27.608 "copy": true, 00:13:27.608 "nvme_iov_md": false 00:13:27.608 }, 00:13:27.608 "memory_domains": [ 00:13:27.608 { 00:13:27.608 "dma_device_id": "system", 00:13:27.608 "dma_device_type": 1 00:13:27.608 }, 00:13:27.608 { 00:13:27.608 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:27.608 "dma_device_type": 2 00:13:27.608 } 00:13:27.608 ], 00:13:27.608 "driver_specific": {} 00:13:27.608 } 00:13:27.608 ] 00:13:27.608 00:08:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:27.608 00:08:14 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:27.608 00:08:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:27.608 00:08:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:27.608 00:08:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:27.608 00:08:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:27.608 00:08:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:27.608 00:08:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:27.608 00:08:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:27.608 00:08:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:27.608 00:08:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:27.608 00:08:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:27.608 00:08:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:27.865 00:08:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:27.865 "name": "Existed_Raid", 00:13:27.865 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:27.865 "strip_size_kb": 64, 00:13:27.865 "state": "configuring", 00:13:27.865 "raid_level": "raid0", 00:13:27.865 "superblock": false, 00:13:27.865 "num_base_bdevs": 3, 00:13:27.865 "num_base_bdevs_discovered": 2, 00:13:27.865 "num_base_bdevs_operational": 3, 00:13:27.865 "base_bdevs_list": [ 00:13:27.865 { 00:13:27.865 "name": "BaseBdev1", 00:13:27.865 
"uuid": "6a0af206-176a-45f5-89bc-d3b2e9a53223", 00:13:27.865 "is_configured": true, 00:13:27.865 "data_offset": 0, 00:13:27.865 "data_size": 65536 00:13:27.865 }, 00:13:27.865 { 00:13:27.865 "name": null, 00:13:27.865 "uuid": "eaaede6b-e793-4b6a-9d8e-692516c73db0", 00:13:27.866 "is_configured": false, 00:13:27.866 "data_offset": 0, 00:13:27.866 "data_size": 65536 00:13:27.866 }, 00:13:27.866 { 00:13:27.866 "name": "BaseBdev3", 00:13:27.866 "uuid": "85a604c9-da87-4dd7-8b05-4cf8f276044f", 00:13:27.866 "is_configured": true, 00:13:27.866 "data_offset": 0, 00:13:27.866 "data_size": 65536 00:13:27.866 } 00:13:27.866 ] 00:13:27.866 }' 00:13:27.866 00:08:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:27.866 00:08:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:28.431 00:08:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.431 00:08:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:28.689 00:08:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:28.689 00:08:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:29.256 [2024-07-16 00:08:16.027702] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:29.256 00:08:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:29.256 00:08:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:29.256 00:08:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:13:29.256 00:08:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:29.256 00:08:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:29.256 00:08:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:29.256 00:08:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:29.256 00:08:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:29.256 00:08:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:29.256 00:08:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:29.256 00:08:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.256 00:08:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:29.514 00:08:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:29.514 "name": "Existed_Raid", 00:13:29.514 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:29.514 "strip_size_kb": 64, 00:13:29.514 "state": "configuring", 00:13:29.514 "raid_level": "raid0", 00:13:29.514 "superblock": false, 00:13:29.514 "num_base_bdevs": 3, 00:13:29.514 "num_base_bdevs_discovered": 1, 00:13:29.514 "num_base_bdevs_operational": 3, 00:13:29.514 "base_bdevs_list": [ 00:13:29.514 { 00:13:29.514 "name": "BaseBdev1", 00:13:29.514 "uuid": "6a0af206-176a-45f5-89bc-d3b2e9a53223", 00:13:29.514 "is_configured": true, 00:13:29.514 "data_offset": 0, 00:13:29.514 "data_size": 65536 00:13:29.514 }, 00:13:29.514 { 00:13:29.514 "name": null, 00:13:29.514 "uuid": "eaaede6b-e793-4b6a-9d8e-692516c73db0", 00:13:29.514 "is_configured": false, 00:13:29.514 
"data_offset": 0, 00:13:29.514 "data_size": 65536 00:13:29.514 }, 00:13:29.514 { 00:13:29.514 "name": null, 00:13:29.514 "uuid": "85a604c9-da87-4dd7-8b05-4cf8f276044f", 00:13:29.514 "is_configured": false, 00:13:29.514 "data_offset": 0, 00:13:29.514 "data_size": 65536 00:13:29.514 } 00:13:29.514 ] 00:13:29.514 }' 00:13:29.514 00:08:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:29.514 00:08:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:30.082 00:08:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:30.082 00:08:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:30.340 00:08:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:30.340 00:08:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:30.907 [2024-07-16 00:08:17.648015] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:30.907 00:08:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:30.907 00:08:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:30.907 00:08:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:30.907 00:08:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:30.907 00:08:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:30.907 00:08:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=3 00:13:30.907 00:08:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:30.907 00:08:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:30.907 00:08:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:30.907 00:08:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:30.907 00:08:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:30.907 00:08:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:31.164 00:08:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:31.164 "name": "Existed_Raid", 00:13:31.164 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:31.164 "strip_size_kb": 64, 00:13:31.164 "state": "configuring", 00:13:31.164 "raid_level": "raid0", 00:13:31.164 "superblock": false, 00:13:31.164 "num_base_bdevs": 3, 00:13:31.164 "num_base_bdevs_discovered": 2, 00:13:31.164 "num_base_bdevs_operational": 3, 00:13:31.164 "base_bdevs_list": [ 00:13:31.164 { 00:13:31.164 "name": "BaseBdev1", 00:13:31.164 "uuid": "6a0af206-176a-45f5-89bc-d3b2e9a53223", 00:13:31.165 "is_configured": true, 00:13:31.165 "data_offset": 0, 00:13:31.165 "data_size": 65536 00:13:31.165 }, 00:13:31.165 { 00:13:31.165 "name": null, 00:13:31.165 "uuid": "eaaede6b-e793-4b6a-9d8e-692516c73db0", 00:13:31.165 "is_configured": false, 00:13:31.165 "data_offset": 0, 00:13:31.165 "data_size": 65536 00:13:31.165 }, 00:13:31.165 { 00:13:31.165 "name": "BaseBdev3", 00:13:31.165 "uuid": "85a604c9-da87-4dd7-8b05-4cf8f276044f", 00:13:31.165 "is_configured": true, 00:13:31.165 "data_offset": 0, 00:13:31.165 "data_size": 65536 00:13:31.165 } 00:13:31.165 ] 
00:13:31.165 }' 00:13:31.165 00:08:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:31.165 00:08:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:31.731 00:08:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.731 00:08:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:31.990 00:08:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:31.990 00:08:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:32.249 [2024-07-16 00:08:19.011640] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:32.249 00:08:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:32.249 00:08:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:32.249 00:08:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:32.249 00:08:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:32.249 00:08:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:32.249 00:08:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:32.249 00:08:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:32.249 00:08:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:32.249 00:08:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- 
# local num_base_bdevs_discovered 00:13:32.249 00:08:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:32.249 00:08:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.249 00:08:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:32.508 00:08:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:32.508 "name": "Existed_Raid", 00:13:32.508 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:32.508 "strip_size_kb": 64, 00:13:32.508 "state": "configuring", 00:13:32.508 "raid_level": "raid0", 00:13:32.508 "superblock": false, 00:13:32.508 "num_base_bdevs": 3, 00:13:32.508 "num_base_bdevs_discovered": 1, 00:13:32.508 "num_base_bdevs_operational": 3, 00:13:32.508 "base_bdevs_list": [ 00:13:32.508 { 00:13:32.508 "name": null, 00:13:32.508 "uuid": "6a0af206-176a-45f5-89bc-d3b2e9a53223", 00:13:32.508 "is_configured": false, 00:13:32.508 "data_offset": 0, 00:13:32.508 "data_size": 65536 00:13:32.508 }, 00:13:32.508 { 00:13:32.508 "name": null, 00:13:32.508 "uuid": "eaaede6b-e793-4b6a-9d8e-692516c73db0", 00:13:32.508 "is_configured": false, 00:13:32.508 "data_offset": 0, 00:13:32.508 "data_size": 65536 00:13:32.508 }, 00:13:32.508 { 00:13:32.508 "name": "BaseBdev3", 00:13:32.508 "uuid": "85a604c9-da87-4dd7-8b05-4cf8f276044f", 00:13:32.508 "is_configured": true, 00:13:32.508 "data_offset": 0, 00:13:32.508 "data_size": 65536 00:13:32.508 } 00:13:32.508 ] 00:13:32.508 }' 00:13:32.508 00:08:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:32.508 00:08:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:33.074 00:08:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:33.074 00:08:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:33.390 00:08:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:33.390 00:08:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:33.649 [2024-07-16 00:08:20.377911] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:33.649 00:08:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:33.649 00:08:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:33.649 00:08:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:33.649 00:08:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:33.649 00:08:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:33.649 00:08:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:33.649 00:08:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:33.649 00:08:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:33.649 00:08:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:33.649 00:08:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:33.649 00:08:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:33.649 00:08:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:33.907 00:08:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:33.907 "name": "Existed_Raid", 00:13:33.907 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:33.907 "strip_size_kb": 64, 00:13:33.907 "state": "configuring", 00:13:33.907 "raid_level": "raid0", 00:13:33.907 "superblock": false, 00:13:33.907 "num_base_bdevs": 3, 00:13:33.907 "num_base_bdevs_discovered": 2, 00:13:33.907 "num_base_bdevs_operational": 3, 00:13:33.907 "base_bdevs_list": [ 00:13:33.907 { 00:13:33.907 "name": null, 00:13:33.907 "uuid": "6a0af206-176a-45f5-89bc-d3b2e9a53223", 00:13:33.907 "is_configured": false, 00:13:33.907 "data_offset": 0, 00:13:33.907 "data_size": 65536 00:13:33.907 }, 00:13:33.907 { 00:13:33.907 "name": "BaseBdev2", 00:13:33.907 "uuid": "eaaede6b-e793-4b6a-9d8e-692516c73db0", 00:13:33.907 "is_configured": true, 00:13:33.907 "data_offset": 0, 00:13:33.907 "data_size": 65536 00:13:33.907 }, 00:13:33.907 { 00:13:33.907 "name": "BaseBdev3", 00:13:33.907 "uuid": "85a604c9-da87-4dd7-8b05-4cf8f276044f", 00:13:33.907 "is_configured": true, 00:13:33.907 "data_offset": 0, 00:13:33.907 "data_size": 65536 00:13:33.907 } 00:13:33.907 ] 00:13:33.907 }' 00:13:33.907 00:08:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:33.907 00:08:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:34.473 00:08:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.473 00:08:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:34.473 
00:08:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:34.473 00:08:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.473 00:08:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:34.730 00:08:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 6a0af206-176a-45f5-89bc-d3b2e9a53223 00:13:34.987 [2024-07-16 00:08:21.842448] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:34.987 [2024-07-16 00:08:21.842488] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2158450 00:13:34.987 [2024-07-16 00:08:21.842497] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:34.987 [2024-07-16 00:08:21.842695] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2159a50 00:13:34.988 [2024-07-16 00:08:21.842810] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2158450 00:13:34.988 [2024-07-16 00:08:21.842820] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2158450 00:13:34.988 [2024-07-16 00:08:21.843013] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:34.988 NewBaseBdev 00:13:34.988 00:08:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:34.988 00:08:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:13:34.988 00:08:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:34.988 00:08:21 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:13:34.988 00:08:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:34.988 00:08:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:34.988 00:08:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:35.245 00:08:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:35.503 [ 00:13:35.503 { 00:13:35.503 "name": "NewBaseBdev", 00:13:35.503 "aliases": [ 00:13:35.503 "6a0af206-176a-45f5-89bc-d3b2e9a53223" 00:13:35.503 ], 00:13:35.503 "product_name": "Malloc disk", 00:13:35.503 "block_size": 512, 00:13:35.503 "num_blocks": 65536, 00:13:35.503 "uuid": "6a0af206-176a-45f5-89bc-d3b2e9a53223", 00:13:35.503 "assigned_rate_limits": { 00:13:35.503 "rw_ios_per_sec": 0, 00:13:35.503 "rw_mbytes_per_sec": 0, 00:13:35.503 "r_mbytes_per_sec": 0, 00:13:35.503 "w_mbytes_per_sec": 0 00:13:35.503 }, 00:13:35.503 "claimed": true, 00:13:35.503 "claim_type": "exclusive_write", 00:13:35.503 "zoned": false, 00:13:35.503 "supported_io_types": { 00:13:35.503 "read": true, 00:13:35.503 "write": true, 00:13:35.503 "unmap": true, 00:13:35.503 "flush": true, 00:13:35.503 "reset": true, 00:13:35.503 "nvme_admin": false, 00:13:35.503 "nvme_io": false, 00:13:35.503 "nvme_io_md": false, 00:13:35.503 "write_zeroes": true, 00:13:35.503 "zcopy": true, 00:13:35.503 "get_zone_info": false, 00:13:35.503 "zone_management": false, 00:13:35.503 "zone_append": false, 00:13:35.503 "compare": false, 00:13:35.503 "compare_and_write": false, 00:13:35.503 "abort": true, 00:13:35.503 "seek_hole": false, 00:13:35.503 "seek_data": false, 00:13:35.503 "copy": true, 00:13:35.503 "nvme_iov_md": 
false 00:13:35.503 }, 00:13:35.503 "memory_domains": [ 00:13:35.503 { 00:13:35.503 "dma_device_id": "system", 00:13:35.503 "dma_device_type": 1 00:13:35.503 }, 00:13:35.503 { 00:13:35.503 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.503 "dma_device_type": 2 00:13:35.503 } 00:13:35.503 ], 00:13:35.503 "driver_specific": {} 00:13:35.503 } 00:13:35.503 ] 00:13:35.503 00:08:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:35.503 00:08:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:35.503 00:08:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:35.503 00:08:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:35.503 00:08:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:35.503 00:08:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:35.503 00:08:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:35.503 00:08:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:35.503 00:08:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:35.503 00:08:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:35.503 00:08:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:35.503 00:08:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:35.503 00:08:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:35.761 00:08:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:35.761 "name": "Existed_Raid", 00:13:35.761 "uuid": "c5eae68b-094d-4f07-9afe-7a5688fd4243", 00:13:35.761 "strip_size_kb": 64, 00:13:35.761 "state": "online", 00:13:35.761 "raid_level": "raid0", 00:13:35.761 "superblock": false, 00:13:35.761 "num_base_bdevs": 3, 00:13:35.761 "num_base_bdevs_discovered": 3, 00:13:35.761 "num_base_bdevs_operational": 3, 00:13:35.761 "base_bdevs_list": [ 00:13:35.761 { 00:13:35.761 "name": "NewBaseBdev", 00:13:35.761 "uuid": "6a0af206-176a-45f5-89bc-d3b2e9a53223", 00:13:35.761 "is_configured": true, 00:13:35.761 "data_offset": 0, 00:13:35.761 "data_size": 65536 00:13:35.761 }, 00:13:35.761 { 00:13:35.761 "name": "BaseBdev2", 00:13:35.761 "uuid": "eaaede6b-e793-4b6a-9d8e-692516c73db0", 00:13:35.761 "is_configured": true, 00:13:35.761 "data_offset": 0, 00:13:35.761 "data_size": 65536 00:13:35.761 }, 00:13:35.761 { 00:13:35.761 "name": "BaseBdev3", 00:13:35.761 "uuid": "85a604c9-da87-4dd7-8b05-4cf8f276044f", 00:13:35.761 "is_configured": true, 00:13:35.761 "data_offset": 0, 00:13:35.761 "data_size": 65536 00:13:35.761 } 00:13:35.761 ] 00:13:35.761 }' 00:13:35.761 00:08:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:35.761 00:08:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:36.326 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:36.326 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:36.326 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:36.326 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:36.326 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:36.326 00:08:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:36.326 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:36.326 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:36.582 [2024-07-16 00:08:23.455015] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:36.582 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:36.582 "name": "Existed_Raid", 00:13:36.582 "aliases": [ 00:13:36.582 "c5eae68b-094d-4f07-9afe-7a5688fd4243" 00:13:36.582 ], 00:13:36.582 "product_name": "Raid Volume", 00:13:36.582 "block_size": 512, 00:13:36.582 "num_blocks": 196608, 00:13:36.582 "uuid": "c5eae68b-094d-4f07-9afe-7a5688fd4243", 00:13:36.582 "assigned_rate_limits": { 00:13:36.582 "rw_ios_per_sec": 0, 00:13:36.582 "rw_mbytes_per_sec": 0, 00:13:36.582 "r_mbytes_per_sec": 0, 00:13:36.582 "w_mbytes_per_sec": 0 00:13:36.582 }, 00:13:36.582 "claimed": false, 00:13:36.582 "zoned": false, 00:13:36.582 "supported_io_types": { 00:13:36.582 "read": true, 00:13:36.582 "write": true, 00:13:36.582 "unmap": true, 00:13:36.582 "flush": true, 00:13:36.582 "reset": true, 00:13:36.582 "nvme_admin": false, 00:13:36.582 "nvme_io": false, 00:13:36.582 "nvme_io_md": false, 00:13:36.582 "write_zeroes": true, 00:13:36.582 "zcopy": false, 00:13:36.582 "get_zone_info": false, 00:13:36.582 "zone_management": false, 00:13:36.582 "zone_append": false, 00:13:36.582 "compare": false, 00:13:36.582 "compare_and_write": false, 00:13:36.582 "abort": false, 00:13:36.582 "seek_hole": false, 00:13:36.582 "seek_data": false, 00:13:36.582 "copy": false, 00:13:36.582 "nvme_iov_md": false 00:13:36.582 }, 00:13:36.582 "memory_domains": [ 00:13:36.582 { 00:13:36.582 "dma_device_id": "system", 00:13:36.582 "dma_device_type": 1 00:13:36.582 }, 
00:13:36.582 { 00:13:36.582 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.582 "dma_device_type": 2 00:13:36.582 }, 00:13:36.582 { 00:13:36.582 "dma_device_id": "system", 00:13:36.582 "dma_device_type": 1 00:13:36.582 }, 00:13:36.582 { 00:13:36.582 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.582 "dma_device_type": 2 00:13:36.582 }, 00:13:36.582 { 00:13:36.582 "dma_device_id": "system", 00:13:36.582 "dma_device_type": 1 00:13:36.582 }, 00:13:36.582 { 00:13:36.582 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.582 "dma_device_type": 2 00:13:36.582 } 00:13:36.582 ], 00:13:36.582 "driver_specific": { 00:13:36.582 "raid": { 00:13:36.582 "uuid": "c5eae68b-094d-4f07-9afe-7a5688fd4243", 00:13:36.582 "strip_size_kb": 64, 00:13:36.582 "state": "online", 00:13:36.582 "raid_level": "raid0", 00:13:36.582 "superblock": false, 00:13:36.582 "num_base_bdevs": 3, 00:13:36.582 "num_base_bdevs_discovered": 3, 00:13:36.582 "num_base_bdevs_operational": 3, 00:13:36.582 "base_bdevs_list": [ 00:13:36.582 { 00:13:36.582 "name": "NewBaseBdev", 00:13:36.582 "uuid": "6a0af206-176a-45f5-89bc-d3b2e9a53223", 00:13:36.582 "is_configured": true, 00:13:36.582 "data_offset": 0, 00:13:36.582 "data_size": 65536 00:13:36.582 }, 00:13:36.582 { 00:13:36.582 "name": "BaseBdev2", 00:13:36.582 "uuid": "eaaede6b-e793-4b6a-9d8e-692516c73db0", 00:13:36.582 "is_configured": true, 00:13:36.582 "data_offset": 0, 00:13:36.582 "data_size": 65536 00:13:36.582 }, 00:13:36.582 { 00:13:36.582 "name": "BaseBdev3", 00:13:36.582 "uuid": "85a604c9-da87-4dd7-8b05-4cf8f276044f", 00:13:36.582 "is_configured": true, 00:13:36.582 "data_offset": 0, 00:13:36.582 "data_size": 65536 00:13:36.582 } 00:13:36.582 ] 00:13:36.582 } 00:13:36.582 } 00:13:36.582 }' 00:13:36.582 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:36.582 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:13:36.582 BaseBdev2 00:13:36.582 BaseBdev3' 00:13:36.582 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:36.582 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:36.582 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:36.838 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:36.838 "name": "NewBaseBdev", 00:13:36.838 "aliases": [ 00:13:36.838 "6a0af206-176a-45f5-89bc-d3b2e9a53223" 00:13:36.838 ], 00:13:36.838 "product_name": "Malloc disk", 00:13:36.838 "block_size": 512, 00:13:36.838 "num_blocks": 65536, 00:13:36.838 "uuid": "6a0af206-176a-45f5-89bc-d3b2e9a53223", 00:13:36.838 "assigned_rate_limits": { 00:13:36.838 "rw_ios_per_sec": 0, 00:13:36.838 "rw_mbytes_per_sec": 0, 00:13:36.838 "r_mbytes_per_sec": 0, 00:13:36.838 "w_mbytes_per_sec": 0 00:13:36.838 }, 00:13:36.838 "claimed": true, 00:13:36.838 "claim_type": "exclusive_write", 00:13:36.838 "zoned": false, 00:13:36.838 "supported_io_types": { 00:13:36.838 "read": true, 00:13:36.838 "write": true, 00:13:36.838 "unmap": true, 00:13:36.838 "flush": true, 00:13:36.839 "reset": true, 00:13:36.839 "nvme_admin": false, 00:13:36.839 "nvme_io": false, 00:13:36.839 "nvme_io_md": false, 00:13:36.839 "write_zeroes": true, 00:13:36.839 "zcopy": true, 00:13:36.839 "get_zone_info": false, 00:13:36.839 "zone_management": false, 00:13:36.839 "zone_append": false, 00:13:36.839 "compare": false, 00:13:36.839 "compare_and_write": false, 00:13:36.839 "abort": true, 00:13:36.839 "seek_hole": false, 00:13:36.839 "seek_data": false, 00:13:36.839 "copy": true, 00:13:36.839 "nvme_iov_md": false 00:13:36.839 }, 00:13:36.839 "memory_domains": [ 00:13:36.839 { 00:13:36.839 "dma_device_id": "system", 00:13:36.839 
"dma_device_type": 1 00:13:36.839 }, 00:13:36.839 { 00:13:36.839 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.839 "dma_device_type": 2 00:13:36.839 } 00:13:36.839 ], 00:13:36.839 "driver_specific": {} 00:13:36.839 }' 00:13:36.839 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:37.096 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:37.096 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:37.096 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:37.096 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:37.096 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:37.096 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:37.096 00:08:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:37.096 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:37.096 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:37.354 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:37.354 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:37.354 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:37.354 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:37.354 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:37.611 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:37.611 "name": 
"BaseBdev2", 00:13:37.611 "aliases": [ 00:13:37.611 "eaaede6b-e793-4b6a-9d8e-692516c73db0" 00:13:37.611 ], 00:13:37.612 "product_name": "Malloc disk", 00:13:37.612 "block_size": 512, 00:13:37.612 "num_blocks": 65536, 00:13:37.612 "uuid": "eaaede6b-e793-4b6a-9d8e-692516c73db0", 00:13:37.612 "assigned_rate_limits": { 00:13:37.612 "rw_ios_per_sec": 0, 00:13:37.612 "rw_mbytes_per_sec": 0, 00:13:37.612 "r_mbytes_per_sec": 0, 00:13:37.612 "w_mbytes_per_sec": 0 00:13:37.612 }, 00:13:37.612 "claimed": true, 00:13:37.612 "claim_type": "exclusive_write", 00:13:37.612 "zoned": false, 00:13:37.612 "supported_io_types": { 00:13:37.612 "read": true, 00:13:37.612 "write": true, 00:13:37.612 "unmap": true, 00:13:37.612 "flush": true, 00:13:37.612 "reset": true, 00:13:37.612 "nvme_admin": false, 00:13:37.612 "nvme_io": false, 00:13:37.612 "nvme_io_md": false, 00:13:37.612 "write_zeroes": true, 00:13:37.612 "zcopy": true, 00:13:37.612 "get_zone_info": false, 00:13:37.612 "zone_management": false, 00:13:37.612 "zone_append": false, 00:13:37.612 "compare": false, 00:13:37.612 "compare_and_write": false, 00:13:37.612 "abort": true, 00:13:37.612 "seek_hole": false, 00:13:37.612 "seek_data": false, 00:13:37.612 "copy": true, 00:13:37.612 "nvme_iov_md": false 00:13:37.612 }, 00:13:37.612 "memory_domains": [ 00:13:37.612 { 00:13:37.612 "dma_device_id": "system", 00:13:37.612 "dma_device_type": 1 00:13:37.612 }, 00:13:37.612 { 00:13:37.612 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:37.612 "dma_device_type": 2 00:13:37.612 } 00:13:37.612 ], 00:13:37.612 "driver_specific": {} 00:13:37.612 }' 00:13:37.612 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:37.612 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:37.612 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:37.612 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:13:37.612 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:37.612 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:37.612 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:37.870 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:37.870 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:37.870 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:37.870 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:37.870 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:37.870 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:37.870 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:37.870 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:38.127 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:38.127 "name": "BaseBdev3", 00:13:38.127 "aliases": [ 00:13:38.127 "85a604c9-da87-4dd7-8b05-4cf8f276044f" 00:13:38.127 ], 00:13:38.127 "product_name": "Malloc disk", 00:13:38.127 "block_size": 512, 00:13:38.127 "num_blocks": 65536, 00:13:38.127 "uuid": "85a604c9-da87-4dd7-8b05-4cf8f276044f", 00:13:38.127 "assigned_rate_limits": { 00:13:38.127 "rw_ios_per_sec": 0, 00:13:38.127 "rw_mbytes_per_sec": 0, 00:13:38.127 "r_mbytes_per_sec": 0, 00:13:38.127 "w_mbytes_per_sec": 0 00:13:38.127 }, 00:13:38.127 "claimed": true, 00:13:38.127 "claim_type": "exclusive_write", 00:13:38.127 "zoned": false, 00:13:38.127 "supported_io_types": { 
00:13:38.127 "read": true, 00:13:38.127 "write": true, 00:13:38.127 "unmap": true, 00:13:38.127 "flush": true, 00:13:38.127 "reset": true, 00:13:38.127 "nvme_admin": false, 00:13:38.127 "nvme_io": false, 00:13:38.127 "nvme_io_md": false, 00:13:38.127 "write_zeroes": true, 00:13:38.127 "zcopy": true, 00:13:38.127 "get_zone_info": false, 00:13:38.127 "zone_management": false, 00:13:38.127 "zone_append": false, 00:13:38.127 "compare": false, 00:13:38.127 "compare_and_write": false, 00:13:38.127 "abort": true, 00:13:38.127 "seek_hole": false, 00:13:38.127 "seek_data": false, 00:13:38.127 "copy": true, 00:13:38.127 "nvme_iov_md": false 00:13:38.127 }, 00:13:38.127 "memory_domains": [ 00:13:38.127 { 00:13:38.127 "dma_device_id": "system", 00:13:38.127 "dma_device_type": 1 00:13:38.127 }, 00:13:38.127 { 00:13:38.127 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:38.127 "dma_device_type": 2 00:13:38.127 } 00:13:38.127 ], 00:13:38.127 "driver_specific": {} 00:13:38.127 }' 00:13:38.127 00:08:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:38.127 00:08:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:38.127 00:08:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:38.127 00:08:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:38.384 00:08:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:38.384 00:08:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:38.384 00:08:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:38.384 00:08:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:38.384 00:08:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:38.384 00:08:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:13:38.384 00:08:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:38.384 00:08:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:38.384 00:08:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:38.641 [2024-07-16 00:08:25.524235] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:38.641 [2024-07-16 00:08:25.524263] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:38.641 [2024-07-16 00:08:25.524316] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:38.641 [2024-07-16 00:08:25.524367] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:38.642 [2024-07-16 00:08:25.524378] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2158450 name Existed_Raid, state offline 00:13:38.642 00:08:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 3508820 00:13:38.642 00:08:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 3508820 ']' 00:13:38.642 00:08:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 3508820 00:13:38.642 00:08:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:13:38.642 00:08:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:38.642 00:08:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3508820 00:13:38.899 00:08:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:38.899 00:08:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:13:38.899 00:08:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3508820' 00:13:38.899 killing process with pid 3508820 00:13:38.899 00:08:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 3508820 00:13:38.899 [2024-07-16 00:08:25.594216] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:38.899 00:08:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 3508820 00:13:38.899 [2024-07-16 00:08:25.620115] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:38.899 00:08:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:38.899 00:13:38.899 real 0m29.996s 00:13:38.899 user 0m55.028s 00:13:38.899 sys 0m5.367s 00:13:38.899 00:08:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:38.899 00:08:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:38.899 ************************************ 00:13:38.899 END TEST raid_state_function_test 00:13:38.899 ************************************ 00:13:39.158 00:08:25 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:39.158 00:08:25 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:13:39.158 00:08:25 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:39.158 00:08:25 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:39.158 00:08:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:39.158 ************************************ 00:13:39.158 START TEST raid_state_function_test_sb 00:13:39.158 ************************************ 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=3513341 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3513341' 00:13:39.158 Process raid pid: 3513341 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 3513341 /var/tmp/spdk-raid.sock 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3513341 ']' 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:13:39.158 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:39.158 00:08:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:39.158 [2024-07-16 00:08:25.980420] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:13:39.158 [2024-07-16 00:08:25.980475] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:39.158 [2024-07-16 00:08:26.095533] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:39.416 [2024-07-16 00:08:26.197387] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:39.416 [2024-07-16 00:08:26.257268] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:39.416 [2024-07-16 00:08:26.257305] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:40.350 00:08:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:40.350 00:08:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:13:40.350 00:08:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:40.350 [2024-07-16 00:08:27.163725] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:40.350 [2024-07-16 00:08:27.163772] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:40.350 [2024-07-16 00:08:27.163783] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:40.350 [2024-07-16 00:08:27.163796] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:40.350 [2024-07-16 00:08:27.163804] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:40.350 [2024-07-16 00:08:27.163816] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:40.350 00:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:40.350 00:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:40.350 00:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:40.350 00:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:40.350 00:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:40.350 00:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:40.350 00:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:40.350 00:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:40.350 00:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:40.350 00:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:40.350 00:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.350 00:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:13:40.609 00:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:40.609 "name": "Existed_Raid", 00:13:40.609 "uuid": "637415de-2693-4352-a666-7678456fdf7b", 00:13:40.609 "strip_size_kb": 64, 00:13:40.609 "state": "configuring", 00:13:40.609 "raid_level": "raid0", 00:13:40.609 "superblock": true, 00:13:40.609 "num_base_bdevs": 3, 00:13:40.609 "num_base_bdevs_discovered": 0, 00:13:40.609 "num_base_bdevs_operational": 3, 00:13:40.609 "base_bdevs_list": [ 00:13:40.609 { 00:13:40.609 "name": "BaseBdev1", 00:13:40.609 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:40.609 "is_configured": false, 00:13:40.609 "data_offset": 0, 00:13:40.609 "data_size": 0 00:13:40.609 }, 00:13:40.609 { 00:13:40.609 "name": "BaseBdev2", 00:13:40.609 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:40.609 "is_configured": false, 00:13:40.609 "data_offset": 0, 00:13:40.609 "data_size": 0 00:13:40.609 }, 00:13:40.609 { 00:13:40.609 "name": "BaseBdev3", 00:13:40.609 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:40.609 "is_configured": false, 00:13:40.609 "data_offset": 0, 00:13:40.609 "data_size": 0 00:13:40.609 } 00:13:40.609 ] 00:13:40.609 }' 00:13:40.609 00:08:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:40.609 00:08:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:41.176 00:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:41.435 [2024-07-16 00:08:28.302582] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:41.436 [2024-07-16 00:08:28.302613] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x889a80 name Existed_Raid, state configuring 00:13:41.436 00:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:41.695 [2024-07-16 00:08:28.551278] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:41.695 [2024-07-16 00:08:28.551309] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:41.695 [2024-07-16 00:08:28.551318] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:41.695 [2024-07-16 00:08:28.551330] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:41.695 [2024-07-16 00:08:28.551339] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:41.695 [2024-07-16 00:08:28.551350] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:41.695 00:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:41.954 [2024-07-16 00:08:28.825912] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:41.954 BaseBdev1 00:13:41.954 00:08:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:41.954 00:08:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:41.954 00:08:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:41.954 00:08:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:41.954 00:08:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:41.954 00:08:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:13:41.954 00:08:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:42.212 00:08:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:42.470 [ 00:13:42.470 { 00:13:42.470 "name": "BaseBdev1", 00:13:42.470 "aliases": [ 00:13:42.470 "904a1b04-8242-4434-8568-81355f45a001" 00:13:42.470 ], 00:13:42.470 "product_name": "Malloc disk", 00:13:42.470 "block_size": 512, 00:13:42.470 "num_blocks": 65536, 00:13:42.470 "uuid": "904a1b04-8242-4434-8568-81355f45a001", 00:13:42.470 "assigned_rate_limits": { 00:13:42.470 "rw_ios_per_sec": 0, 00:13:42.470 "rw_mbytes_per_sec": 0, 00:13:42.470 "r_mbytes_per_sec": 0, 00:13:42.470 "w_mbytes_per_sec": 0 00:13:42.470 }, 00:13:42.470 "claimed": true, 00:13:42.470 "claim_type": "exclusive_write", 00:13:42.470 "zoned": false, 00:13:42.470 "supported_io_types": { 00:13:42.470 "read": true, 00:13:42.470 "write": true, 00:13:42.470 "unmap": true, 00:13:42.470 "flush": true, 00:13:42.470 "reset": true, 00:13:42.470 "nvme_admin": false, 00:13:42.470 "nvme_io": false, 00:13:42.470 "nvme_io_md": false, 00:13:42.470 "write_zeroes": true, 00:13:42.470 "zcopy": true, 00:13:42.470 "get_zone_info": false, 00:13:42.470 "zone_management": false, 00:13:42.470 "zone_append": false, 00:13:42.470 "compare": false, 00:13:42.470 "compare_and_write": false, 00:13:42.470 "abort": true, 00:13:42.470 "seek_hole": false, 00:13:42.470 "seek_data": false, 00:13:42.470 "copy": true, 00:13:42.470 "nvme_iov_md": false 00:13:42.470 }, 00:13:42.470 "memory_domains": [ 00:13:42.470 { 00:13:42.470 "dma_device_id": "system", 00:13:42.470 "dma_device_type": 1 00:13:42.470 }, 00:13:42.470 { 00:13:42.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:42.470 
"dma_device_type": 2 00:13:42.470 } 00:13:42.470 ], 00:13:42.470 "driver_specific": {} 00:13:42.470 } 00:13:42.470 ] 00:13:42.470 00:08:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:42.470 00:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:42.470 00:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:42.470 00:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:42.470 00:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:42.470 00:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:42.470 00:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:42.470 00:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:42.470 00:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:42.470 00:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:42.470 00:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:42.470 00:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.470 00:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:42.729 00:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:42.729 "name": "Existed_Raid", 00:13:42.729 "uuid": "5f4abcb1-92ed-481e-b856-e53dee4fffe3", 00:13:42.729 "strip_size_kb": 64, 
00:13:42.729 "state": "configuring", 00:13:42.729 "raid_level": "raid0", 00:13:42.729 "superblock": true, 00:13:42.729 "num_base_bdevs": 3, 00:13:42.729 "num_base_bdevs_discovered": 1, 00:13:42.729 "num_base_bdevs_operational": 3, 00:13:42.729 "base_bdevs_list": [ 00:13:42.729 { 00:13:42.729 "name": "BaseBdev1", 00:13:42.729 "uuid": "904a1b04-8242-4434-8568-81355f45a001", 00:13:42.729 "is_configured": true, 00:13:42.729 "data_offset": 2048, 00:13:42.729 "data_size": 63488 00:13:42.729 }, 00:13:42.729 { 00:13:42.729 "name": "BaseBdev2", 00:13:42.729 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:42.729 "is_configured": false, 00:13:42.729 "data_offset": 0, 00:13:42.729 "data_size": 0 00:13:42.729 }, 00:13:42.729 { 00:13:42.729 "name": "BaseBdev3", 00:13:42.729 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:42.729 "is_configured": false, 00:13:42.729 "data_offset": 0, 00:13:42.729 "data_size": 0 00:13:42.729 } 00:13:42.729 ] 00:13:42.729 }' 00:13:42.729 00:08:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:42.729 00:08:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:43.319 00:08:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:43.578 [2024-07-16 00:08:30.378050] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:43.578 [2024-07-16 00:08:30.378096] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x889310 name Existed_Raid, state configuring 00:13:43.578 00:08:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:43.836 [2024-07-16 00:08:30.618722] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:43.837 [2024-07-16 00:08:30.620166] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:43.837 [2024-07-16 00:08:30.620202] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:43.837 [2024-07-16 00:08:30.620212] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:43.837 [2024-07-16 00:08:30.620224] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:43.837 00:08:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:43.837 00:08:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:43.837 00:08:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:43.837 00:08:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:43.837 00:08:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:43.837 00:08:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:43.837 00:08:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:43.837 00:08:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:43.837 00:08:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:43.837 00:08:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:43.837 00:08:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:43.837 00:08:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- 
# local tmp 00:13:43.837 00:08:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.837 00:08:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:44.095 00:08:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:44.095 "name": "Existed_Raid", 00:13:44.095 "uuid": "d559fd14-31ea-4f50-90ea-5cb494bce374", 00:13:44.095 "strip_size_kb": 64, 00:13:44.095 "state": "configuring", 00:13:44.095 "raid_level": "raid0", 00:13:44.095 "superblock": true, 00:13:44.095 "num_base_bdevs": 3, 00:13:44.095 "num_base_bdevs_discovered": 1, 00:13:44.095 "num_base_bdevs_operational": 3, 00:13:44.095 "base_bdevs_list": [ 00:13:44.095 { 00:13:44.095 "name": "BaseBdev1", 00:13:44.095 "uuid": "904a1b04-8242-4434-8568-81355f45a001", 00:13:44.095 "is_configured": true, 00:13:44.095 "data_offset": 2048, 00:13:44.095 "data_size": 63488 00:13:44.095 }, 00:13:44.095 { 00:13:44.095 "name": "BaseBdev2", 00:13:44.095 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:44.095 "is_configured": false, 00:13:44.095 "data_offset": 0, 00:13:44.095 "data_size": 0 00:13:44.095 }, 00:13:44.095 { 00:13:44.095 "name": "BaseBdev3", 00:13:44.095 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:44.095 "is_configured": false, 00:13:44.095 "data_offset": 0, 00:13:44.095 "data_size": 0 00:13:44.095 } 00:13:44.095 ] 00:13:44.095 }' 00:13:44.095 00:08:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:44.095 00:08:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:44.659 00:08:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:44.918 
[2024-07-16 00:08:31.717073] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:44.918 BaseBdev2 00:13:44.918 00:08:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:44.918 00:08:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:44.918 00:08:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:44.918 00:08:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:44.918 00:08:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:44.918 00:08:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:44.918 00:08:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:45.177 00:08:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:45.436 [ 00:13:45.436 { 00:13:45.436 "name": "BaseBdev2", 00:13:45.436 "aliases": [ 00:13:45.436 "d1beb451-2b43-42a0-b70b-ea18de3a85fb" 00:13:45.436 ], 00:13:45.436 "product_name": "Malloc disk", 00:13:45.436 "block_size": 512, 00:13:45.436 "num_blocks": 65536, 00:13:45.436 "uuid": "d1beb451-2b43-42a0-b70b-ea18de3a85fb", 00:13:45.436 "assigned_rate_limits": { 00:13:45.436 "rw_ios_per_sec": 0, 00:13:45.436 "rw_mbytes_per_sec": 0, 00:13:45.436 "r_mbytes_per_sec": 0, 00:13:45.437 "w_mbytes_per_sec": 0 00:13:45.437 }, 00:13:45.437 "claimed": true, 00:13:45.437 "claim_type": "exclusive_write", 00:13:45.437 "zoned": false, 00:13:45.437 "supported_io_types": { 00:13:45.437 "read": true, 00:13:45.437 "write": true, 00:13:45.437 "unmap": 
true, 00:13:45.437 "flush": true, 00:13:45.437 "reset": true, 00:13:45.437 "nvme_admin": false, 00:13:45.437 "nvme_io": false, 00:13:45.437 "nvme_io_md": false, 00:13:45.437 "write_zeroes": true, 00:13:45.437 "zcopy": true, 00:13:45.437 "get_zone_info": false, 00:13:45.437 "zone_management": false, 00:13:45.437 "zone_append": false, 00:13:45.437 "compare": false, 00:13:45.437 "compare_and_write": false, 00:13:45.437 "abort": true, 00:13:45.437 "seek_hole": false, 00:13:45.437 "seek_data": false, 00:13:45.437 "copy": true, 00:13:45.437 "nvme_iov_md": false 00:13:45.437 }, 00:13:45.437 "memory_domains": [ 00:13:45.437 { 00:13:45.437 "dma_device_id": "system", 00:13:45.437 "dma_device_type": 1 00:13:45.437 }, 00:13:45.437 { 00:13:45.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:45.437 "dma_device_type": 2 00:13:45.437 } 00:13:45.437 ], 00:13:45.437 "driver_specific": {} 00:13:45.437 } 00:13:45.437 ] 00:13:45.437 00:08:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:45.437 00:08:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:45.437 00:08:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:45.437 00:08:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:45.437 00:08:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:45.437 00:08:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:45.437 00:08:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:45.437 00:08:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:45.437 00:08:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:45.437 
00:08:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:45.437 00:08:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:45.437 00:08:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:45.437 00:08:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:45.437 00:08:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.437 00:08:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:45.696 00:08:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:45.696 "name": "Existed_Raid", 00:13:45.696 "uuid": "d559fd14-31ea-4f50-90ea-5cb494bce374", 00:13:45.696 "strip_size_kb": 64, 00:13:45.696 "state": "configuring", 00:13:45.696 "raid_level": "raid0", 00:13:45.696 "superblock": true, 00:13:45.696 "num_base_bdevs": 3, 00:13:45.696 "num_base_bdevs_discovered": 2, 00:13:45.696 "num_base_bdevs_operational": 3, 00:13:45.696 "base_bdevs_list": [ 00:13:45.696 { 00:13:45.696 "name": "BaseBdev1", 00:13:45.696 "uuid": "904a1b04-8242-4434-8568-81355f45a001", 00:13:45.696 "is_configured": true, 00:13:45.696 "data_offset": 2048, 00:13:45.696 "data_size": 63488 00:13:45.696 }, 00:13:45.696 { 00:13:45.696 "name": "BaseBdev2", 00:13:45.696 "uuid": "d1beb451-2b43-42a0-b70b-ea18de3a85fb", 00:13:45.696 "is_configured": true, 00:13:45.696 "data_offset": 2048, 00:13:45.696 "data_size": 63488 00:13:45.696 }, 00:13:45.696 { 00:13:45.696 "name": "BaseBdev3", 00:13:45.696 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.696 "is_configured": false, 00:13:45.696 "data_offset": 0, 00:13:45.696 "data_size": 0 00:13:45.696 } 00:13:45.696 ] 00:13:45.696 }' 00:13:45.696 
00:08:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:45.696 00:08:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:46.269 00:08:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:46.526 [2024-07-16 00:08:33.344718] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:46.526 [2024-07-16 00:08:33.344890] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x88a400 00:13:46.526 [2024-07-16 00:08:33.344904] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:46.526 [2024-07-16 00:08:33.345089] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x889ef0 00:13:46.526 [2024-07-16 00:08:33.345204] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x88a400 00:13:46.526 [2024-07-16 00:08:33.345214] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x88a400 00:13:46.526 [2024-07-16 00:08:33.345311] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:46.526 BaseBdev3 00:13:46.526 00:08:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:46.526 00:08:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:46.526 00:08:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:46.526 00:08:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:46.526 00:08:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:46.526 00:08:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:13:46.526 00:08:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:46.783 00:08:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:47.042 [ 00:13:47.042 { 00:13:47.042 "name": "BaseBdev3", 00:13:47.042 "aliases": [ 00:13:47.042 "9095e39b-8e7e-47a4-a6d8-28aa3971888d" 00:13:47.042 ], 00:13:47.042 "product_name": "Malloc disk", 00:13:47.042 "block_size": 512, 00:13:47.042 "num_blocks": 65536, 00:13:47.042 "uuid": "9095e39b-8e7e-47a4-a6d8-28aa3971888d", 00:13:47.042 "assigned_rate_limits": { 00:13:47.042 "rw_ios_per_sec": 0, 00:13:47.042 "rw_mbytes_per_sec": 0, 00:13:47.042 "r_mbytes_per_sec": 0, 00:13:47.042 "w_mbytes_per_sec": 0 00:13:47.042 }, 00:13:47.042 "claimed": true, 00:13:47.042 "claim_type": "exclusive_write", 00:13:47.042 "zoned": false, 00:13:47.042 "supported_io_types": { 00:13:47.042 "read": true, 00:13:47.042 "write": true, 00:13:47.042 "unmap": true, 00:13:47.042 "flush": true, 00:13:47.042 "reset": true, 00:13:47.042 "nvme_admin": false, 00:13:47.042 "nvme_io": false, 00:13:47.042 "nvme_io_md": false, 00:13:47.042 "write_zeroes": true, 00:13:47.042 "zcopy": true, 00:13:47.042 "get_zone_info": false, 00:13:47.042 "zone_management": false, 00:13:47.042 "zone_append": false, 00:13:47.042 "compare": false, 00:13:47.042 "compare_and_write": false, 00:13:47.042 "abort": true, 00:13:47.042 "seek_hole": false, 00:13:47.042 "seek_data": false, 00:13:47.042 "copy": true, 00:13:47.042 "nvme_iov_md": false 00:13:47.042 }, 00:13:47.042 "memory_domains": [ 00:13:47.042 { 00:13:47.042 "dma_device_id": "system", 00:13:47.042 "dma_device_type": 1 00:13:47.042 }, 00:13:47.042 { 00:13:47.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:47.042 
"dma_device_type": 2 00:13:47.042 } 00:13:47.042 ], 00:13:47.042 "driver_specific": {} 00:13:47.042 } 00:13:47.042 ] 00:13:47.042 00:08:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:47.042 00:08:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:47.042 00:08:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:47.042 00:08:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:47.042 00:08:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:47.042 00:08:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:47.042 00:08:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:47.042 00:08:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:47.042 00:08:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:47.042 00:08:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:47.042 00:08:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:47.042 00:08:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:47.042 00:08:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:47.042 00:08:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.042 00:08:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:47.300 00:08:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:47.301 "name": "Existed_Raid", 00:13:47.301 "uuid": "d559fd14-31ea-4f50-90ea-5cb494bce374", 00:13:47.301 "strip_size_kb": 64, 00:13:47.301 "state": "online", 00:13:47.301 "raid_level": "raid0", 00:13:47.301 "superblock": true, 00:13:47.301 "num_base_bdevs": 3, 00:13:47.301 "num_base_bdevs_discovered": 3, 00:13:47.301 "num_base_bdevs_operational": 3, 00:13:47.301 "base_bdevs_list": [ 00:13:47.301 { 00:13:47.301 "name": "BaseBdev1", 00:13:47.301 "uuid": "904a1b04-8242-4434-8568-81355f45a001", 00:13:47.301 "is_configured": true, 00:13:47.301 "data_offset": 2048, 00:13:47.301 "data_size": 63488 00:13:47.301 }, 00:13:47.301 { 00:13:47.301 "name": "BaseBdev2", 00:13:47.301 "uuid": "d1beb451-2b43-42a0-b70b-ea18de3a85fb", 00:13:47.301 "is_configured": true, 00:13:47.301 "data_offset": 2048, 00:13:47.301 "data_size": 63488 00:13:47.301 }, 00:13:47.301 { 00:13:47.301 "name": "BaseBdev3", 00:13:47.301 "uuid": "9095e39b-8e7e-47a4-a6d8-28aa3971888d", 00:13:47.301 "is_configured": true, 00:13:47.301 "data_offset": 2048, 00:13:47.301 "data_size": 63488 00:13:47.301 } 00:13:47.301 ] 00:13:47.301 }' 00:13:47.301 00:08:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:47.301 00:08:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:47.868 00:08:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:47.868 00:08:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:47.868 00:08:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:47.868 00:08:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:47.868 00:08:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:13:47.868 00:08:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:47.868 00:08:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:47.868 00:08:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:48.125 [2024-07-16 00:08:34.969325] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:48.125 00:08:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:48.125 "name": "Existed_Raid", 00:13:48.125 "aliases": [ 00:13:48.125 "d559fd14-31ea-4f50-90ea-5cb494bce374" 00:13:48.125 ], 00:13:48.125 "product_name": "Raid Volume", 00:13:48.125 "block_size": 512, 00:13:48.125 "num_blocks": 190464, 00:13:48.125 "uuid": "d559fd14-31ea-4f50-90ea-5cb494bce374", 00:13:48.125 "assigned_rate_limits": { 00:13:48.125 "rw_ios_per_sec": 0, 00:13:48.125 "rw_mbytes_per_sec": 0, 00:13:48.125 "r_mbytes_per_sec": 0, 00:13:48.125 "w_mbytes_per_sec": 0 00:13:48.125 }, 00:13:48.125 "claimed": false, 00:13:48.125 "zoned": false, 00:13:48.125 "supported_io_types": { 00:13:48.125 "read": true, 00:13:48.125 "write": true, 00:13:48.125 "unmap": true, 00:13:48.125 "flush": true, 00:13:48.125 "reset": true, 00:13:48.125 "nvme_admin": false, 00:13:48.125 "nvme_io": false, 00:13:48.125 "nvme_io_md": false, 00:13:48.125 "write_zeroes": true, 00:13:48.125 "zcopy": false, 00:13:48.125 "get_zone_info": false, 00:13:48.125 "zone_management": false, 00:13:48.125 "zone_append": false, 00:13:48.125 "compare": false, 00:13:48.125 "compare_and_write": false, 00:13:48.125 "abort": false, 00:13:48.125 "seek_hole": false, 00:13:48.125 "seek_data": false, 00:13:48.125 "copy": false, 00:13:48.125 "nvme_iov_md": false 00:13:48.125 }, 00:13:48.125 "memory_domains": [ 00:13:48.125 { 00:13:48.125 "dma_device_id": "system", 00:13:48.125 
"dma_device_type": 1 00:13:48.125 }, 00:13:48.125 { 00:13:48.125 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.125 "dma_device_type": 2 00:13:48.125 }, 00:13:48.125 { 00:13:48.125 "dma_device_id": "system", 00:13:48.125 "dma_device_type": 1 00:13:48.125 }, 00:13:48.125 { 00:13:48.125 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.125 "dma_device_type": 2 00:13:48.125 }, 00:13:48.125 { 00:13:48.125 "dma_device_id": "system", 00:13:48.125 "dma_device_type": 1 00:13:48.125 }, 00:13:48.125 { 00:13:48.125 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.125 "dma_device_type": 2 00:13:48.125 } 00:13:48.125 ], 00:13:48.125 "driver_specific": { 00:13:48.125 "raid": { 00:13:48.125 "uuid": "d559fd14-31ea-4f50-90ea-5cb494bce374", 00:13:48.125 "strip_size_kb": 64, 00:13:48.125 "state": "online", 00:13:48.125 "raid_level": "raid0", 00:13:48.125 "superblock": true, 00:13:48.125 "num_base_bdevs": 3, 00:13:48.125 "num_base_bdevs_discovered": 3, 00:13:48.125 "num_base_bdevs_operational": 3, 00:13:48.125 "base_bdevs_list": [ 00:13:48.125 { 00:13:48.125 "name": "BaseBdev1", 00:13:48.125 "uuid": "904a1b04-8242-4434-8568-81355f45a001", 00:13:48.125 "is_configured": true, 00:13:48.125 "data_offset": 2048, 00:13:48.125 "data_size": 63488 00:13:48.125 }, 00:13:48.125 { 00:13:48.125 "name": "BaseBdev2", 00:13:48.125 "uuid": "d1beb451-2b43-42a0-b70b-ea18de3a85fb", 00:13:48.125 "is_configured": true, 00:13:48.125 "data_offset": 2048, 00:13:48.125 "data_size": 63488 00:13:48.125 }, 00:13:48.125 { 00:13:48.125 "name": "BaseBdev3", 00:13:48.125 "uuid": "9095e39b-8e7e-47a4-a6d8-28aa3971888d", 00:13:48.125 "is_configured": true, 00:13:48.125 "data_offset": 2048, 00:13:48.125 "data_size": 63488 00:13:48.125 } 00:13:48.125 ] 00:13:48.125 } 00:13:48.125 } 00:13:48.125 }' 00:13:48.125 00:08:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:48.125 00:08:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:48.125 BaseBdev2 00:13:48.125 BaseBdev3' 00:13:48.125 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:48.125 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:48.125 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:48.383 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:48.383 "name": "BaseBdev1", 00:13:48.383 "aliases": [ 00:13:48.383 "904a1b04-8242-4434-8568-81355f45a001" 00:13:48.383 ], 00:13:48.383 "product_name": "Malloc disk", 00:13:48.383 "block_size": 512, 00:13:48.383 "num_blocks": 65536, 00:13:48.383 "uuid": "904a1b04-8242-4434-8568-81355f45a001", 00:13:48.383 "assigned_rate_limits": { 00:13:48.383 "rw_ios_per_sec": 0, 00:13:48.383 "rw_mbytes_per_sec": 0, 00:13:48.383 "r_mbytes_per_sec": 0, 00:13:48.383 "w_mbytes_per_sec": 0 00:13:48.383 }, 00:13:48.383 "claimed": true, 00:13:48.383 "claim_type": "exclusive_write", 00:13:48.383 "zoned": false, 00:13:48.383 "supported_io_types": { 00:13:48.383 "read": true, 00:13:48.383 "write": true, 00:13:48.383 "unmap": true, 00:13:48.383 "flush": true, 00:13:48.383 "reset": true, 00:13:48.383 "nvme_admin": false, 00:13:48.383 "nvme_io": false, 00:13:48.383 "nvme_io_md": false, 00:13:48.383 "write_zeroes": true, 00:13:48.383 "zcopy": true, 00:13:48.383 "get_zone_info": false, 00:13:48.383 "zone_management": false, 00:13:48.383 "zone_append": false, 00:13:48.383 "compare": false, 00:13:48.383 "compare_and_write": false, 00:13:48.383 "abort": true, 00:13:48.383 "seek_hole": false, 00:13:48.383 "seek_data": false, 00:13:48.383 "copy": true, 00:13:48.383 "nvme_iov_md": false 00:13:48.383 }, 00:13:48.383 "memory_domains": 
[ 00:13:48.383 { 00:13:48.383 "dma_device_id": "system", 00:13:48.383 "dma_device_type": 1 00:13:48.383 }, 00:13:48.383 { 00:13:48.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.383 "dma_device_type": 2 00:13:48.383 } 00:13:48.383 ], 00:13:48.383 "driver_specific": {} 00:13:48.383 }' 00:13:48.383 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:48.383 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:48.641 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:48.641 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:48.641 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:48.641 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:48.641 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:48.641 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:48.641 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:48.641 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:48.641 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:48.899 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:48.899 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:48.899 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:48.899 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:13:49.158 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:49.158 "name": "BaseBdev2", 00:13:49.158 "aliases": [ 00:13:49.158 "d1beb451-2b43-42a0-b70b-ea18de3a85fb" 00:13:49.158 ], 00:13:49.158 "product_name": "Malloc disk", 00:13:49.158 "block_size": 512, 00:13:49.158 "num_blocks": 65536, 00:13:49.158 "uuid": "d1beb451-2b43-42a0-b70b-ea18de3a85fb", 00:13:49.158 "assigned_rate_limits": { 00:13:49.158 "rw_ios_per_sec": 0, 00:13:49.158 "rw_mbytes_per_sec": 0, 00:13:49.158 "r_mbytes_per_sec": 0, 00:13:49.158 "w_mbytes_per_sec": 0 00:13:49.158 }, 00:13:49.158 "claimed": true, 00:13:49.158 "claim_type": "exclusive_write", 00:13:49.158 "zoned": false, 00:13:49.158 "supported_io_types": { 00:13:49.158 "read": true, 00:13:49.158 "write": true, 00:13:49.158 "unmap": true, 00:13:49.158 "flush": true, 00:13:49.158 "reset": true, 00:13:49.158 "nvme_admin": false, 00:13:49.158 "nvme_io": false, 00:13:49.158 "nvme_io_md": false, 00:13:49.158 "write_zeroes": true, 00:13:49.158 "zcopy": true, 00:13:49.158 "get_zone_info": false, 00:13:49.158 "zone_management": false, 00:13:49.158 "zone_append": false, 00:13:49.158 "compare": false, 00:13:49.158 "compare_and_write": false, 00:13:49.158 "abort": true, 00:13:49.158 "seek_hole": false, 00:13:49.158 "seek_data": false, 00:13:49.158 "copy": true, 00:13:49.158 "nvme_iov_md": false 00:13:49.158 }, 00:13:49.158 "memory_domains": [ 00:13:49.158 { 00:13:49.158 "dma_device_id": "system", 00:13:49.158 "dma_device_type": 1 00:13:49.158 }, 00:13:49.158 { 00:13:49.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.158 "dma_device_type": 2 00:13:49.158 } 00:13:49.158 ], 00:13:49.158 "driver_specific": {} 00:13:49.158 }' 00:13:49.158 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:49.158 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:49.158 00:08:35 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:49.158 00:08:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.158 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.158 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:49.158 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.158 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.440 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:49.440 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:49.440 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:49.441 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:49.441 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:49.441 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:49.441 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:49.723 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:49.723 "name": "BaseBdev3", 00:13:49.723 "aliases": [ 00:13:49.723 "9095e39b-8e7e-47a4-a6d8-28aa3971888d" 00:13:49.723 ], 00:13:49.723 "product_name": "Malloc disk", 00:13:49.723 "block_size": 512, 00:13:49.723 "num_blocks": 65536, 00:13:49.723 "uuid": "9095e39b-8e7e-47a4-a6d8-28aa3971888d", 00:13:49.723 "assigned_rate_limits": { 00:13:49.723 "rw_ios_per_sec": 0, 00:13:49.723 "rw_mbytes_per_sec": 0, 00:13:49.723 "r_mbytes_per_sec": 0, 00:13:49.723 
"w_mbytes_per_sec": 0 00:13:49.723 }, 00:13:49.723 "claimed": true, 00:13:49.723 "claim_type": "exclusive_write", 00:13:49.723 "zoned": false, 00:13:49.723 "supported_io_types": { 00:13:49.723 "read": true, 00:13:49.723 "write": true, 00:13:49.723 "unmap": true, 00:13:49.723 "flush": true, 00:13:49.723 "reset": true, 00:13:49.723 "nvme_admin": false, 00:13:49.723 "nvme_io": false, 00:13:49.723 "nvme_io_md": false, 00:13:49.723 "write_zeroes": true, 00:13:49.723 "zcopy": true, 00:13:49.723 "get_zone_info": false, 00:13:49.723 "zone_management": false, 00:13:49.723 "zone_append": false, 00:13:49.723 "compare": false, 00:13:49.723 "compare_and_write": false, 00:13:49.723 "abort": true, 00:13:49.723 "seek_hole": false, 00:13:49.723 "seek_data": false, 00:13:49.723 "copy": true, 00:13:49.723 "nvme_iov_md": false 00:13:49.723 }, 00:13:49.723 "memory_domains": [ 00:13:49.723 { 00:13:49.723 "dma_device_id": "system", 00:13:49.723 "dma_device_type": 1 00:13:49.723 }, 00:13:49.723 { 00:13:49.723 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.723 "dma_device_type": 2 00:13:49.723 } 00:13:49.723 ], 00:13:49.723 "driver_specific": {} 00:13:49.723 }' 00:13:49.723 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:49.723 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:49.723 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:49.723 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.723 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.723 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:49.723 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.982 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:13:49.982 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:49.982 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:49.982 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:49.982 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:49.982 00:08:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:50.242 [2024-07-16 00:08:37.030556] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:50.242 [2024-07-16 00:08:37.030587] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:50.242 [2024-07-16 00:08:37.030629] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:50.242 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:50.242 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:50.242 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:50.242 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:13:50.242 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:50.242 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:13:50.242 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:50.242 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:50.242 00:08:37 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:50.242 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:50.242 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:50.242 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:50.242 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:50.242 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:50.242 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:50.242 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.242 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:50.501 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:50.501 "name": "Existed_Raid", 00:13:50.501 "uuid": "d559fd14-31ea-4f50-90ea-5cb494bce374", 00:13:50.501 "strip_size_kb": 64, 00:13:50.501 "state": "offline", 00:13:50.501 "raid_level": "raid0", 00:13:50.501 "superblock": true, 00:13:50.501 "num_base_bdevs": 3, 00:13:50.501 "num_base_bdevs_discovered": 2, 00:13:50.501 "num_base_bdevs_operational": 2, 00:13:50.501 "base_bdevs_list": [ 00:13:50.501 { 00:13:50.501 "name": null, 00:13:50.501 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:50.501 "is_configured": false, 00:13:50.501 "data_offset": 2048, 00:13:50.501 "data_size": 63488 00:13:50.501 }, 00:13:50.501 { 00:13:50.501 "name": "BaseBdev2", 00:13:50.501 "uuid": "d1beb451-2b43-42a0-b70b-ea18de3a85fb", 00:13:50.501 "is_configured": true, 00:13:50.501 "data_offset": 2048, 00:13:50.501 "data_size": 
63488 00:13:50.501 }, 00:13:50.501 { 00:13:50.501 "name": "BaseBdev3", 00:13:50.501 "uuid": "9095e39b-8e7e-47a4-a6d8-28aa3971888d", 00:13:50.501 "is_configured": true, 00:13:50.501 "data_offset": 2048, 00:13:50.501 "data_size": 63488 00:13:50.501 } 00:13:50.501 ] 00:13:50.501 }' 00:13:50.501 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:50.501 00:08:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:51.071 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:51.071 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:51.071 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:51.071 00:08:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:51.071 00:08:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:51.071 00:08:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:51.071 00:08:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:51.330 [2024-07-16 00:08:38.243690] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:51.330 00:08:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:51.330 00:08:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:51.589 00:08:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:13:51.589 00:08:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:13:51.589 00:08:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:13:51.589 00:08:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:13:51.589 00:08:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3
00:13:51.849 [2024-07-16 00:08:38.753518] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:13:51.849 [2024-07-16 00:08:38.753560] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x88a400 name Existed_Raid, state offline
00:13:51.849 00:08:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:13:51.849 00:08:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:13:51.849 00:08:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:51.849 00:08:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)'
00:13:52.108 00:08:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev=
00:13:52.108 00:08:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']'
00:13:52.108 00:08:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']'
00:13:52.108 00:08:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 ))
00:13:52.108 00:08:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:13:52.108 00:08:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:13:52.367 BaseBdev2
00:13:52.367 00:08:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2
00:13:52.367 00:08:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2
00:13:52.367 00:08:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:13:52.367 00:08:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:13:52.367 00:08:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:13:52.367 00:08:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:13:52.367 00:08:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:13:52.627 00:08:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:13:52.886 [
00:13:52.886 {
00:13:52.886 "name": "BaseBdev2",
00:13:52.886 "aliases": [
00:13:52.886 "adeb5c92-aefb-4d4f-9c8c-7d0344037e36"
00:13:52.886 ],
00:13:52.886 "product_name": "Malloc disk",
00:13:52.886 "block_size": 512,
00:13:52.886 "num_blocks": 65536,
00:13:52.886 "uuid": "adeb5c92-aefb-4d4f-9c8c-7d0344037e36",
00:13:52.886 "assigned_rate_limits": {
00:13:52.886 "rw_ios_per_sec": 0,
00:13:52.886 "rw_mbytes_per_sec": 0,
00:13:52.886 "r_mbytes_per_sec": 0,
00:13:52.886 "w_mbytes_per_sec": 0
00:13:52.886 },
00:13:52.886 "claimed": false,
00:13:52.886 "zoned": false,
00:13:52.886 "supported_io_types": {
00:13:52.886 "read": true,
00:13:52.886 "write": true,
00:13:52.886 "unmap": true,
00:13:52.886 "flush": true,
00:13:52.886 "reset": true,
00:13:52.886 "nvme_admin": false,
00:13:52.886 "nvme_io": false,
00:13:52.886 "nvme_io_md": false,
00:13:52.886 "write_zeroes": true,
00:13:52.886 "zcopy": true,
00:13:52.886 "get_zone_info": false,
00:13:52.886 "zone_management": false,
00:13:52.886 "zone_append": false,
00:13:52.886 "compare": false,
00:13:52.886 "compare_and_write": false,
00:13:52.886 "abort": true,
00:13:52.886 "seek_hole": false,
00:13:52.886 "seek_data": false,
00:13:52.886 "copy": true,
00:13:52.886 "nvme_iov_md": false
00:13:52.886 },
00:13:52.886 "memory_domains": [
00:13:52.886 {
00:13:52.886 "dma_device_id": "system",
00:13:52.886 "dma_device_type": 1
00:13:52.886 },
00:13:52.886 {
00:13:52.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:13:52.886 "dma_device_type": 2
00:13:52.886 }
00:13:52.886 ],
00:13:52.886 "driver_specific": {}
00:13:52.886 }
00:13:52.886 ]
00:13:52.886 00:08:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:13:52.886 00:08:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:13:52.886 00:08:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:13:52.886 00:08:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
00:13:53.144 BaseBdev3
00:13:53.145 00:08:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3
00:13:53.145 00:08:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3
00:13:53.145 00:08:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:13:53.145 00:08:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:13:53.145 00:08:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:13:53.145 00:08:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:13:53.145 00:08:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:13:53.404 00:08:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000
00:13:53.663 [
00:13:53.663 {
00:13:53.663 "name": "BaseBdev3",
00:13:53.663 "aliases": [
00:13:53.663 "eb9655b3-e252-4c78-8531-2b8f55d491fa"
00:13:53.663 ],
00:13:53.663 "product_name": "Malloc disk",
00:13:53.663 "block_size": 512,
00:13:53.663 "num_blocks": 65536,
00:13:53.663 "uuid": "eb9655b3-e252-4c78-8531-2b8f55d491fa",
00:13:53.663 "assigned_rate_limits": {
00:13:53.663 "rw_ios_per_sec": 0,
00:13:53.663 "rw_mbytes_per_sec": 0,
00:13:53.663 "r_mbytes_per_sec": 0,
00:13:53.663 "w_mbytes_per_sec": 0
00:13:53.663 },
00:13:53.663 "claimed": false,
00:13:53.663 "zoned": false,
00:13:53.663 "supported_io_types": {
00:13:53.663 "read": true,
00:13:53.663 "write": true,
00:13:53.663 "unmap": true,
00:13:53.663 "flush": true,
00:13:53.663 "reset": true,
00:13:53.663 "nvme_admin": false,
00:13:53.663 "nvme_io": false,
00:13:53.663 "nvme_io_md": false,
00:13:53.663 "write_zeroes": true,
00:13:53.663 "zcopy": true,
00:13:53.663 "get_zone_info": false,
00:13:53.663 "zone_management": false,
00:13:53.663 "zone_append": false,
00:13:53.663 "compare": false,
00:13:53.663 "compare_and_write": false,
00:13:53.663 "abort": true,
00:13:53.663 "seek_hole": false,
00:13:53.663 "seek_data": false,
00:13:53.663 "copy": true,
00:13:53.663 "nvme_iov_md": false
00:13:53.663 },
00:13:53.663 "memory_domains": [
00:13:53.663 {
00:13:53.663 "dma_device_id": "system",
00:13:53.663 "dma_device_type": 1
00:13:53.663 },
00:13:53.663 {
00:13:53.663 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:13:53.663 "dma_device_type": 2
00:13:53.663 }
00:13:53.663 ],
00:13:53.663 "driver_specific": {}
00:13:53.663 }
00:13:53.663 ]
00:13:53.663 00:08:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:13:53.663 00:08:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:13:53.663 00:08:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:13:53.663 00:08:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
00:13:53.923 [2024-07-16 00:08:40.730352] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:13:53.923 [2024-07-16 00:08:40.730400] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:13:53.923 [2024-07-16 00:08:40.730420] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:13:53.923 [2024-07-16 00:08:40.731798] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:13:53.923 00:08:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:13:53.923 00:08:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:13:53.923 00:08:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:13:53.923 00:08:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:13:53.923 00:08:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:13:53.923 00:08:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:13:53.923 00:08:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:13:53.923 00:08:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:13:53.923 00:08:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:13:53.923 00:08:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:13:53.923 00:08:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:53.923 00:08:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:13:54.182 00:08:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:13:54.182 "name": "Existed_Raid",
00:13:54.182 "uuid": "06875a66-4989-4272-9928-b7ee10d5f053",
00:13:54.182 "strip_size_kb": 64,
00:13:54.182 "state": "configuring",
00:13:54.182 "raid_level": "raid0",
00:13:54.182 "superblock": true,
00:13:54.182 "num_base_bdevs": 3,
00:13:54.182 "num_base_bdevs_discovered": 2,
00:13:54.182 "num_base_bdevs_operational": 3,
00:13:54.182 "base_bdevs_list": [
00:13:54.182 {
00:13:54.182 "name": "BaseBdev1",
00:13:54.182 "uuid": "00000000-0000-0000-0000-000000000000",
00:13:54.182 "is_configured": false,
00:13:54.182 "data_offset": 0,
00:13:54.182 "data_size": 0
00:13:54.182 },
00:13:54.182 {
00:13:54.182 "name": "BaseBdev2",
00:13:54.182 "uuid": "adeb5c92-aefb-4d4f-9c8c-7d0344037e36",
00:13:54.182 "is_configured": true,
00:13:54.182 "data_offset": 2048,
00:13:54.182 "data_size": 63488
00:13:54.182 },
00:13:54.182 {
00:13:54.182 "name": "BaseBdev3",
00:13:54.182 "uuid": "eb9655b3-e252-4c78-8531-2b8f55d491fa",
00:13:54.182 "is_configured": true,
00:13:54.182 "data_offset": 2048,
00:13:54.182 "data_size": 63488
00:13:54.182 }
00:13:54.182 ]
00:13:54.182 }'
00:13:54.182 00:08:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:13:54.182 00:08:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:13:54.750 00:08:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2
00:13:55.017 [2024-07-16 00:08:41.813205] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:13:55.017 00:08:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:13:55.017 00:08:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:13:55.017 00:08:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:13:55.017 00:08:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:13:55.017 00:08:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:13:55.017 00:08:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:13:55.017 00:08:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:13:55.017 00:08:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:13:55.017 00:08:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:13:55.017 00:08:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:13:55.017 00:08:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:13:55.017 00:08:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:55.277 00:08:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:13:55.277 "name": "Existed_Raid",
00:13:55.277 "uuid": "06875a66-4989-4272-9928-b7ee10d5f053",
00:13:55.277 "strip_size_kb": 64,
00:13:55.277 "state": "configuring",
00:13:55.277 "raid_level": "raid0",
00:13:55.277 "superblock": true,
00:13:55.277 "num_base_bdevs": 3,
00:13:55.277 "num_base_bdevs_discovered": 1,
00:13:55.277 "num_base_bdevs_operational": 3,
00:13:55.277 "base_bdevs_list": [
00:13:55.277 {
00:13:55.277 "name": "BaseBdev1",
00:13:55.277 "uuid": "00000000-0000-0000-0000-000000000000",
00:13:55.277 "is_configured": false,
00:13:55.277 "data_offset": 0,
00:13:55.277 "data_size": 0
00:13:55.277 },
00:13:55.277 {
00:13:55.277 "name": null,
00:13:55.277 "uuid": "adeb5c92-aefb-4d4f-9c8c-7d0344037e36",
00:13:55.277 "is_configured": false,
00:13:55.277 "data_offset": 2048,
00:13:55.277 "data_size": 63488
00:13:55.277 },
00:13:55.277 {
00:13:55.277 "name": "BaseBdev3",
00:13:55.277 "uuid": "eb9655b3-e252-4c78-8531-2b8f55d491fa",
00:13:55.277 "is_configured": true,
00:13:55.277 "data_offset": 2048,
00:13:55.277 "data_size": 63488
00:13:55.277 }
00:13:55.277 ]
00:13:55.277 }'
00:13:55.277 00:08:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:13:55.277 00:08:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:13:55.842 00:08:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:55.842 00:08:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured'
00:13:56.100 00:08:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]]
00:13:56.100 00:08:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:13:56.359 [2024-07-16 00:08:43.137372] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:13:56.359 BaseBdev1
00:13:56.359 00:08:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1
00:13:56.359 00:08:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1
00:13:56.359 00:08:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:13:56.359 00:08:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:13:56.359 00:08:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:13:56.359 00:08:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:13:56.359 00:08:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:13:56.617 00:08:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:13:56.875 [
00:13:56.875 {
00:13:56.875 "name": "BaseBdev1",
00:13:56.875 "aliases": [
00:13:56.875 "05578707-8593-4f3f-84ec-4e04f00186a8"
00:13:56.875 ],
00:13:56.875 "product_name": "Malloc disk",
00:13:56.875 "block_size": 512,
00:13:56.875 "num_blocks": 65536,
00:13:56.875 "uuid": "05578707-8593-4f3f-84ec-4e04f00186a8",
00:13:56.875 "assigned_rate_limits": {
00:13:56.875 "rw_ios_per_sec": 0,
00:13:56.875 "rw_mbytes_per_sec": 0,
00:13:56.875 "r_mbytes_per_sec": 0,
00:13:56.875 "w_mbytes_per_sec": 0
00:13:56.875 },
00:13:56.875 "claimed": true,
00:13:56.875 "claim_type": "exclusive_write",
00:13:56.875 "zoned": false,
00:13:56.875 "supported_io_types": {
00:13:56.875 "read": true,
00:13:56.875 "write": true,
00:13:56.875 "unmap": true,
00:13:56.875 "flush": true,
00:13:56.875 "reset": true,
00:13:56.875 "nvme_admin": false,
00:13:56.875 "nvme_io": false,
00:13:56.875 "nvme_io_md": false,
00:13:56.875 "write_zeroes": true,
00:13:56.875 "zcopy": true,
00:13:56.875 "get_zone_info": false,
00:13:56.875 "zone_management": false,
00:13:56.875 "zone_append": false,
00:13:56.875 "compare": false,
00:13:56.875 "compare_and_write": false,
00:13:56.875 "abort": true,
00:13:56.876 "seek_hole": false,
00:13:56.876 "seek_data": false,
00:13:56.876 "copy": true,
00:13:56.876 "nvme_iov_md": false
00:13:56.876 },
00:13:56.876 "memory_domains": [
00:13:56.876 {
00:13:56.876 "dma_device_id": "system",
00:13:56.876 "dma_device_type": 1
00:13:56.876 },
00:13:56.876 {
00:13:56.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:13:56.876 "dma_device_type": 2
00:13:56.876 }
00:13:56.876 ],
00:13:56.876 "driver_specific": {}
00:13:56.876 }
00:13:56.876 ]
00:13:56.876 00:08:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:13:56.876 00:08:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:13:56.876 00:08:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:13:56.876 00:08:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:13:56.876 00:08:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:13:56.876 00:08:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:13:56.876 00:08:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:13:56.876 00:08:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:13:56.876 00:08:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:13:56.876 00:08:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:13:56.876 00:08:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:13:56.876 00:08:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:56.876 00:08:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:13:57.134 00:08:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:13:57.134 "name": "Existed_Raid",
00:13:57.134 "uuid": "06875a66-4989-4272-9928-b7ee10d5f053",
00:13:57.134 "strip_size_kb": 64,
00:13:57.134 "state": "configuring",
00:13:57.134 "raid_level": "raid0",
00:13:57.134 "superblock": true,
00:13:57.134 "num_base_bdevs": 3,
00:13:57.134 "num_base_bdevs_discovered": 2,
00:13:57.134 "num_base_bdevs_operational": 3,
00:13:57.134 "base_bdevs_list": [
00:13:57.134 {
00:13:57.134 "name": "BaseBdev1",
00:13:57.134 "uuid": "05578707-8593-4f3f-84ec-4e04f00186a8",
00:13:57.134 "is_configured": true,
00:13:57.134 "data_offset": 2048,
00:13:57.134 "data_size": 63488
00:13:57.134 },
00:13:57.134 {
00:13:57.134 "name": null,
00:13:57.134 "uuid": "adeb5c92-aefb-4d4f-9c8c-7d0344037e36",
00:13:57.134 "is_configured": false,
00:13:57.134 "data_offset": 2048,
00:13:57.134 "data_size": 63488
00:13:57.134 },
00:13:57.134 {
00:13:57.134 "name": "BaseBdev3",
00:13:57.134 "uuid": "eb9655b3-e252-4c78-8531-2b8f55d491fa",
00:13:57.134 "is_configured": true,
00:13:57.134 "data_offset": 2048,
00:13:57.134 "data_size": 63488 }
00:13:57.134 ]
00:13:57.134 }'
00:13:57.134 00:08:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:13:57.134 00:08:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:13:57.700 00:08:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:57.700 00:08:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured'
00:13:57.700 00:08:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]]
00:13:57.700 00:08:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3
00:13:57.959 [2024-07-16 00:08:44.801829] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:13:57.959 00:08:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:13:57.959 00:08:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:13:57.959 00:08:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:13:57.959 00:08:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:13:57.959 00:08:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:13:57.959 00:08:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:13:57.959 00:08:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:13:57.959 00:08:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:13:57.959 00:08:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:13:57.959 00:08:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:13:57.959 00:08:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:57.959 00:08:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:13:58.218 00:08:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:13:58.218 "name": "Existed_Raid",
00:13:58.218 "uuid": "06875a66-4989-4272-9928-b7ee10d5f053",
00:13:58.218 "strip_size_kb": 64,
00:13:58.218 "state": "configuring",
00:13:58.218 "raid_level": "raid0",
00:13:58.218 "superblock": true,
00:13:58.218 "num_base_bdevs": 3,
00:13:58.218 "num_base_bdevs_discovered": 1,
00:13:58.218 "num_base_bdevs_operational": 3,
00:13:58.218 "base_bdevs_list": [
00:13:58.218 {
00:13:58.218 "name": "BaseBdev1",
00:13:58.218 "uuid": "05578707-8593-4f3f-84ec-4e04f00186a8",
00:13:58.218 "is_configured": true,
00:13:58.218 "data_offset": 2048,
00:13:58.218 "data_size": 63488
00:13:58.218 },
00:13:58.218 {
00:13:58.218 "name": null,
00:13:58.218 "uuid": "adeb5c92-aefb-4d4f-9c8c-7d0344037e36",
00:13:58.218 "is_configured": false,
00:13:58.218 "data_offset": 2048,
00:13:58.218 "data_size": 63488
00:13:58.218 },
00:13:58.218 {
00:13:58.218 "name": null,
00:13:58.218 "uuid": "eb9655b3-e252-4c78-8531-2b8f55d491fa",
00:13:58.218 "is_configured": false,
00:13:58.218 "data_offset": 2048,
00:13:58.218 "data_size": 63488
00:13:58.218 }
00:13:58.218 ]
00:13:58.218 }'
00:13:58.218 00:08:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:13:58.218 00:08:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:13:58.784 00:08:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:58.784 00:08:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured'
00:13:59.043 00:08:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]]
00:13:59.043 00:08:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3
00:13:59.302 [2024-07-16 00:08:46.137401] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:13:59.302 00:08:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:13:59.302 00:08:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:13:59.302 00:08:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:13:59.302 00:08:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:13:59.302 00:08:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:13:59.302 00:08:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:13:59.302 00:08:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:13:59.302 00:08:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:13:59.302 00:08:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:13:59.302 00:08:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:13:59.302 00:08:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:59.302 00:08:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:13:59.560 00:08:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:13:59.560 "name": "Existed_Raid",
00:13:59.560 "uuid": "06875a66-4989-4272-9928-b7ee10d5f053",
00:13:59.560 "strip_size_kb": 64,
00:13:59.560 "state": "configuring",
00:13:59.560 "raid_level": "raid0",
00:13:59.560 "superblock": true,
00:13:59.560 "num_base_bdevs": 3,
00:13:59.560 "num_base_bdevs_discovered": 2,
00:13:59.560 "num_base_bdevs_operational": 3,
00:13:59.560 "base_bdevs_list": [
00:13:59.560 {
00:13:59.560 "name": "BaseBdev1",
00:13:59.560 "uuid": "05578707-8593-4f3f-84ec-4e04f00186a8",
00:13:59.560 "is_configured": true,
00:13:59.560 "data_offset": 2048,
00:13:59.560 "data_size": 63488
00:13:59.560 },
00:13:59.560 {
00:13:59.560 "name": null,
00:13:59.560 "uuid": "adeb5c92-aefb-4d4f-9c8c-7d0344037e36",
00:13:59.560 "is_configured": false,
00:13:59.560 "data_offset": 2048,
00:13:59.560 "data_size": 63488
00:13:59.560 },
00:13:59.560 {
00:13:59.560 "name": "BaseBdev3",
00:13:59.560 "uuid": "eb9655b3-e252-4c78-8531-2b8f55d491fa",
00:13:59.560 "is_configured": true,
00:13:59.560 "data_offset": 2048,
00:13:59.560 "data_size": 63488
00:13:59.560 }
00:13:59.560 ]
00:13:59.560 }'
00:13:59.560 00:08:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:13:59.560 00:08:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:14:00.127 00:08:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:00.127 00:08:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured'
00:14:00.386 00:08:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]]
00:14:00.386 00:08:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
00:14:00.954 [2024-07-16 00:08:47.717602] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:14:00.954 00:08:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:14:00.954 00:08:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:14:00.954 00:08:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:14:00.954 00:08:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:14:00.954 00:08:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:14:00.954 00:08:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:14:00.954 00:08:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:14:00.954 00:08:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:14:00.954 00:08:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:14:00.954 00:08:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:14:00.954 00:08:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:00.954 00:08:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:14:01.212 00:08:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:14:01.212 "name": "Existed_Raid",
00:14:01.212 "uuid": "06875a66-4989-4272-9928-b7ee10d5f053",
00:14:01.212 "strip_size_kb": 64,
00:14:01.212 "state": "configuring",
00:14:01.212 "raid_level": "raid0",
00:14:01.212 "superblock": true,
00:14:01.212 "num_base_bdevs": 3,
00:14:01.212 "num_base_bdevs_discovered": 1,
00:14:01.212 "num_base_bdevs_operational": 3,
00:14:01.212 "base_bdevs_list": [
00:14:01.212 {
00:14:01.212 "name": null,
00:14:01.212 "uuid": "05578707-8593-4f3f-84ec-4e04f00186a8",
00:14:01.212 "is_configured": false,
00:14:01.212 "data_offset": 2048,
00:14:01.212 "data_size": 63488
00:14:01.212 },
00:14:01.212 {
00:14:01.212 "name": null,
00:14:01.212 "uuid": "adeb5c92-aefb-4d4f-9c8c-7d0344037e36",
00:14:01.212 "is_configured": false,
00:14:01.212 "data_offset": 2048,
00:14:01.212 "data_size": 63488
00:14:01.212 },
00:14:01.212 {
00:14:01.212 "name": "BaseBdev3",
00:14:01.212 "uuid": "eb9655b3-e252-4c78-8531-2b8f55d491fa",
00:14:01.212 "is_configured": true,
00:14:01.212 "data_offset": 2048,
00:14:01.212 "data_size": 63488
00:14:01.212 }
00:14:01.212 ]
00:14:01.212 }'
00:14:01.212 00:08:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:14:01.212 00:08:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:14:01.778 00:08:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:01.778 00:08:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured'
00:14:02.036 00:08:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]]
00:14:02.036 00:08:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2
00:14:02.295 [2024-07-16 00:08:49.091803] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:14:02.295 00:08:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:14:02.296 00:08:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:14:02.296 00:08:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:14:02.296 00:08:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:14:02.296 00:08:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:14:02.296 00:08:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:14:02.296 00:08:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:14:02.296 00:08:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:14:02.296 00:08:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:14:02.296 00:08:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:14:02.296 00:08:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:02.296 00:08:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:14:02.554 00:08:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:14:02.554 "name": "Existed_Raid",
00:14:02.554 "uuid": "06875a66-4989-4272-9928-b7ee10d5f053",
00:14:02.554 "strip_size_kb": 64,
00:14:02.554 "state": "configuring",
00:14:02.554 "raid_level": "raid0",
00:14:02.554 "superblock": true,
00:14:02.554 "num_base_bdevs": 3,
00:14:02.554 "num_base_bdevs_discovered": 2,
00:14:02.554 "num_base_bdevs_operational": 3,
00:14:02.554 "base_bdevs_list": [
00:14:02.554 {
00:14:02.554 "name": null,
00:14:02.555 "uuid": "05578707-8593-4f3f-84ec-4e04f00186a8",
00:14:02.555 "is_configured": false,
00:14:02.555 "data_offset": 2048,
00:14:02.555 "data_size": 63488
00:14:02.555 },
00:14:02.555 {
00:14:02.555 "name": "BaseBdev2",
00:14:02.555 "uuid": "adeb5c92-aefb-4d4f-9c8c-7d0344037e36",
00:14:02.555 "is_configured": true,
00:14:02.555 "data_offset": 2048,
00:14:02.555 "data_size": 63488
00:14:02.555 },
00:14:02.555 {
00:14:02.555 "name": "BaseBdev3",
00:14:02.555 "uuid": "eb9655b3-e252-4c78-8531-2b8f55d491fa",
00:14:02.555 "is_configured": true,
00:14:02.555 "data_offset": 2048,
00:14:02.555 "data_size": 63488
00:14:02.555 }
00:14:02.555 ]
00:14:02.555 }'
00:14:02.555 00:08:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:14:02.555 00:08:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:14:03.121 00:08:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured'
00:14:03.121 00:08:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:03.379 00:08:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]]
00:14:03.379 00:08:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:03.379 00:08:50
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:03.637 00:08:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 05578707-8593-4f3f-84ec-4e04f00186a8 00:14:03.896 [2024-07-16 00:08:50.712695] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:03.896 [2024-07-16 00:08:50.712858] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x888e90 00:14:03.896 [2024-07-16 00:08:50.712872] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:03.896 [2024-07-16 00:08:50.713066] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x58f940 00:14:03.896 [2024-07-16 00:08:50.713189] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x888e90 00:14:03.896 [2024-07-16 00:08:50.713199] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x888e90 00:14:03.896 [2024-07-16 00:08:50.713292] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:03.896 NewBaseBdev 00:14:03.896 00:08:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:03.896 00:08:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:03.896 00:08:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:03.896 00:08:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:03.896 00:08:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:03.896 00:08:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:03.896 00:08:50 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:04.154 00:08:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:04.413 [ 00:14:04.413 { 00:14:04.413 "name": "NewBaseBdev", 00:14:04.413 "aliases": [ 00:14:04.413 "05578707-8593-4f3f-84ec-4e04f00186a8" 00:14:04.413 ], 00:14:04.413 "product_name": "Malloc disk", 00:14:04.413 "block_size": 512, 00:14:04.413 "num_blocks": 65536, 00:14:04.413 "uuid": "05578707-8593-4f3f-84ec-4e04f00186a8", 00:14:04.413 "assigned_rate_limits": { 00:14:04.413 "rw_ios_per_sec": 0, 00:14:04.413 "rw_mbytes_per_sec": 0, 00:14:04.413 "r_mbytes_per_sec": 0, 00:14:04.413 "w_mbytes_per_sec": 0 00:14:04.413 }, 00:14:04.413 "claimed": true, 00:14:04.413 "claim_type": "exclusive_write", 00:14:04.413 "zoned": false, 00:14:04.413 "supported_io_types": { 00:14:04.413 "read": true, 00:14:04.413 "write": true, 00:14:04.413 "unmap": true, 00:14:04.413 "flush": true, 00:14:04.413 "reset": true, 00:14:04.413 "nvme_admin": false, 00:14:04.413 "nvme_io": false, 00:14:04.413 "nvme_io_md": false, 00:14:04.413 "write_zeroes": true, 00:14:04.413 "zcopy": true, 00:14:04.413 "get_zone_info": false, 00:14:04.413 "zone_management": false, 00:14:04.413 "zone_append": false, 00:14:04.413 "compare": false, 00:14:04.413 "compare_and_write": false, 00:14:04.413 "abort": true, 00:14:04.413 "seek_hole": false, 00:14:04.413 "seek_data": false, 00:14:04.413 "copy": true, 00:14:04.413 "nvme_iov_md": false 00:14:04.413 }, 00:14:04.413 "memory_domains": [ 00:14:04.413 { 00:14:04.413 "dma_device_id": "system", 00:14:04.413 "dma_device_type": 1 00:14:04.413 }, 00:14:04.413 { 00:14:04.413 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:04.413 "dma_device_type": 2 00:14:04.413 } 
00:14:04.413 ], 00:14:04.413 "driver_specific": {} 00:14:04.413 } 00:14:04.413 ] 00:14:04.413 00:08:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:04.413 00:08:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:04.413 00:08:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:04.413 00:08:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:04.413 00:08:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:04.413 00:08:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:04.413 00:08:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:04.413 00:08:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:04.413 00:08:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:04.413 00:08:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:04.413 00:08:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:04.413 00:08:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.413 00:08:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:04.671 00:08:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:04.671 "name": "Existed_Raid", 00:14:04.671 "uuid": "06875a66-4989-4272-9928-b7ee10d5f053", 00:14:04.671 "strip_size_kb": 64, 00:14:04.671 "state": "online", 00:14:04.671 
"raid_level": "raid0", 00:14:04.671 "superblock": true, 00:14:04.671 "num_base_bdevs": 3, 00:14:04.671 "num_base_bdevs_discovered": 3, 00:14:04.671 "num_base_bdevs_operational": 3, 00:14:04.671 "base_bdevs_list": [ 00:14:04.671 { 00:14:04.671 "name": "NewBaseBdev", 00:14:04.671 "uuid": "05578707-8593-4f3f-84ec-4e04f00186a8", 00:14:04.671 "is_configured": true, 00:14:04.671 "data_offset": 2048, 00:14:04.671 "data_size": 63488 00:14:04.671 }, 00:14:04.671 { 00:14:04.671 "name": "BaseBdev2", 00:14:04.671 "uuid": "adeb5c92-aefb-4d4f-9c8c-7d0344037e36", 00:14:04.671 "is_configured": true, 00:14:04.671 "data_offset": 2048, 00:14:04.671 "data_size": 63488 00:14:04.671 }, 00:14:04.671 { 00:14:04.671 "name": "BaseBdev3", 00:14:04.671 "uuid": "eb9655b3-e252-4c78-8531-2b8f55d491fa", 00:14:04.671 "is_configured": true, 00:14:04.671 "data_offset": 2048, 00:14:04.671 "data_size": 63488 00:14:04.671 } 00:14:04.671 ] 00:14:04.671 }' 00:14:04.671 00:08:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:04.671 00:08:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:05.237 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:05.237 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:05.237 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:05.237 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:05.237 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:05.237 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:05.237 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:05.237 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:05.495 [2024-07-16 00:08:52.277186] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:05.495 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:05.495 "name": "Existed_Raid", 00:14:05.495 "aliases": [ 00:14:05.495 "06875a66-4989-4272-9928-b7ee10d5f053" 00:14:05.495 ], 00:14:05.495 "product_name": "Raid Volume", 00:14:05.495 "block_size": 512, 00:14:05.495 "num_blocks": 190464, 00:14:05.495 "uuid": "06875a66-4989-4272-9928-b7ee10d5f053", 00:14:05.495 "assigned_rate_limits": { 00:14:05.495 "rw_ios_per_sec": 0, 00:14:05.495 "rw_mbytes_per_sec": 0, 00:14:05.495 "r_mbytes_per_sec": 0, 00:14:05.495 "w_mbytes_per_sec": 0 00:14:05.495 }, 00:14:05.495 "claimed": false, 00:14:05.495 "zoned": false, 00:14:05.495 "supported_io_types": { 00:14:05.495 "read": true, 00:14:05.495 "write": true, 00:14:05.495 "unmap": true, 00:14:05.495 "flush": true, 00:14:05.495 "reset": true, 00:14:05.495 "nvme_admin": false, 00:14:05.495 "nvme_io": false, 00:14:05.495 "nvme_io_md": false, 00:14:05.495 "write_zeroes": true, 00:14:05.495 "zcopy": false, 00:14:05.495 "get_zone_info": false, 00:14:05.495 "zone_management": false, 00:14:05.495 "zone_append": false, 00:14:05.495 "compare": false, 00:14:05.495 "compare_and_write": false, 00:14:05.495 "abort": false, 00:14:05.495 "seek_hole": false, 00:14:05.495 "seek_data": false, 00:14:05.495 "copy": false, 00:14:05.495 "nvme_iov_md": false 00:14:05.495 }, 00:14:05.495 "memory_domains": [ 00:14:05.495 { 00:14:05.495 "dma_device_id": "system", 00:14:05.495 "dma_device_type": 1 00:14:05.495 }, 00:14:05.495 { 00:14:05.495 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.495 "dma_device_type": 2 00:14:05.495 }, 00:14:05.495 { 00:14:05.495 "dma_device_id": "system", 00:14:05.495 "dma_device_type": 1 00:14:05.495 
}, 00:14:05.495 { 00:14:05.495 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.495 "dma_device_type": 2 00:14:05.495 }, 00:14:05.495 { 00:14:05.495 "dma_device_id": "system", 00:14:05.495 "dma_device_type": 1 00:14:05.495 }, 00:14:05.495 { 00:14:05.495 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.495 "dma_device_type": 2 00:14:05.495 } 00:14:05.495 ], 00:14:05.495 "driver_specific": { 00:14:05.495 "raid": { 00:14:05.495 "uuid": "06875a66-4989-4272-9928-b7ee10d5f053", 00:14:05.495 "strip_size_kb": 64, 00:14:05.495 "state": "online", 00:14:05.495 "raid_level": "raid0", 00:14:05.495 "superblock": true, 00:14:05.496 "num_base_bdevs": 3, 00:14:05.496 "num_base_bdevs_discovered": 3, 00:14:05.496 "num_base_bdevs_operational": 3, 00:14:05.496 "base_bdevs_list": [ 00:14:05.496 { 00:14:05.496 "name": "NewBaseBdev", 00:14:05.496 "uuid": "05578707-8593-4f3f-84ec-4e04f00186a8", 00:14:05.496 "is_configured": true, 00:14:05.496 "data_offset": 2048, 00:14:05.496 "data_size": 63488 00:14:05.496 }, 00:14:05.496 { 00:14:05.496 "name": "BaseBdev2", 00:14:05.496 "uuid": "adeb5c92-aefb-4d4f-9c8c-7d0344037e36", 00:14:05.496 "is_configured": true, 00:14:05.496 "data_offset": 2048, 00:14:05.496 "data_size": 63488 00:14:05.496 }, 00:14:05.496 { 00:14:05.496 "name": "BaseBdev3", 00:14:05.496 "uuid": "eb9655b3-e252-4c78-8531-2b8f55d491fa", 00:14:05.496 "is_configured": true, 00:14:05.496 "data_offset": 2048, 00:14:05.496 "data_size": 63488 00:14:05.496 } 00:14:05.496 ] 00:14:05.496 } 00:14:05.496 } 00:14:05.496 }' 00:14:05.496 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:05.496 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:05.496 BaseBdev2 00:14:05.496 BaseBdev3' 00:14:05.496 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:05.496 
00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:05.496 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:05.756 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:05.756 "name": "NewBaseBdev", 00:14:05.756 "aliases": [ 00:14:05.756 "05578707-8593-4f3f-84ec-4e04f00186a8" 00:14:05.756 ], 00:14:05.756 "product_name": "Malloc disk", 00:14:05.756 "block_size": 512, 00:14:05.756 "num_blocks": 65536, 00:14:05.756 "uuid": "05578707-8593-4f3f-84ec-4e04f00186a8", 00:14:05.756 "assigned_rate_limits": { 00:14:05.756 "rw_ios_per_sec": 0, 00:14:05.756 "rw_mbytes_per_sec": 0, 00:14:05.756 "r_mbytes_per_sec": 0, 00:14:05.756 "w_mbytes_per_sec": 0 00:14:05.756 }, 00:14:05.756 "claimed": true, 00:14:05.756 "claim_type": "exclusive_write", 00:14:05.756 "zoned": false, 00:14:05.756 "supported_io_types": { 00:14:05.756 "read": true, 00:14:05.756 "write": true, 00:14:05.756 "unmap": true, 00:14:05.756 "flush": true, 00:14:05.756 "reset": true, 00:14:05.756 "nvme_admin": false, 00:14:05.756 "nvme_io": false, 00:14:05.756 "nvme_io_md": false, 00:14:05.756 "write_zeroes": true, 00:14:05.756 "zcopy": true, 00:14:05.756 "get_zone_info": false, 00:14:05.756 "zone_management": false, 00:14:05.756 "zone_append": false, 00:14:05.756 "compare": false, 00:14:05.756 "compare_and_write": false, 00:14:05.756 "abort": true, 00:14:05.756 "seek_hole": false, 00:14:05.756 "seek_data": false, 00:14:05.756 "copy": true, 00:14:05.756 "nvme_iov_md": false 00:14:05.756 }, 00:14:05.756 "memory_domains": [ 00:14:05.756 { 00:14:05.756 "dma_device_id": "system", 00:14:05.756 "dma_device_type": 1 00:14:05.756 }, 00:14:05.756 { 00:14:05.756 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.756 "dma_device_type": 2 00:14:05.756 } 00:14:05.756 ], 00:14:05.756 
"driver_specific": {} 00:14:05.756 }' 00:14:05.756 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:05.756 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:05.756 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:05.756 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:06.015 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:06.015 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:06.015 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:06.015 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:06.015 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:06.015 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:06.015 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:06.015 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:06.015 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:06.015 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:06.015 00:08:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:06.317 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:06.317 "name": "BaseBdev2", 00:14:06.317 "aliases": [ 00:14:06.317 "adeb5c92-aefb-4d4f-9c8c-7d0344037e36" 00:14:06.317 ], 00:14:06.317 "product_name": 
"Malloc disk", 00:14:06.317 "block_size": 512, 00:14:06.317 "num_blocks": 65536, 00:14:06.317 "uuid": "adeb5c92-aefb-4d4f-9c8c-7d0344037e36", 00:14:06.317 "assigned_rate_limits": { 00:14:06.317 "rw_ios_per_sec": 0, 00:14:06.317 "rw_mbytes_per_sec": 0, 00:14:06.317 "r_mbytes_per_sec": 0, 00:14:06.317 "w_mbytes_per_sec": 0 00:14:06.317 }, 00:14:06.317 "claimed": true, 00:14:06.317 "claim_type": "exclusive_write", 00:14:06.317 "zoned": false, 00:14:06.317 "supported_io_types": { 00:14:06.317 "read": true, 00:14:06.317 "write": true, 00:14:06.317 "unmap": true, 00:14:06.317 "flush": true, 00:14:06.317 "reset": true, 00:14:06.317 "nvme_admin": false, 00:14:06.317 "nvme_io": false, 00:14:06.317 "nvme_io_md": false, 00:14:06.317 "write_zeroes": true, 00:14:06.317 "zcopy": true, 00:14:06.317 "get_zone_info": false, 00:14:06.317 "zone_management": false, 00:14:06.317 "zone_append": false, 00:14:06.317 "compare": false, 00:14:06.317 "compare_and_write": false, 00:14:06.317 "abort": true, 00:14:06.317 "seek_hole": false, 00:14:06.317 "seek_data": false, 00:14:06.317 "copy": true, 00:14:06.317 "nvme_iov_md": false 00:14:06.317 }, 00:14:06.317 "memory_domains": [ 00:14:06.317 { 00:14:06.317 "dma_device_id": "system", 00:14:06.317 "dma_device_type": 1 00:14:06.317 }, 00:14:06.317 { 00:14:06.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.317 "dma_device_type": 2 00:14:06.317 } 00:14:06.317 ], 00:14:06.317 "driver_specific": {} 00:14:06.317 }' 00:14:06.317 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.317 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.585 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:06.585 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:06.585 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:06.585 
00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:06.585 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:06.585 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:06.585 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:06.585 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:06.585 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:06.843 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:06.843 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:06.843 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:06.843 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:07.101 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:07.101 "name": "BaseBdev3", 00:14:07.101 "aliases": [ 00:14:07.101 "eb9655b3-e252-4c78-8531-2b8f55d491fa" 00:14:07.101 ], 00:14:07.101 "product_name": "Malloc disk", 00:14:07.101 "block_size": 512, 00:14:07.101 "num_blocks": 65536, 00:14:07.101 "uuid": "eb9655b3-e252-4c78-8531-2b8f55d491fa", 00:14:07.101 "assigned_rate_limits": { 00:14:07.101 "rw_ios_per_sec": 0, 00:14:07.101 "rw_mbytes_per_sec": 0, 00:14:07.101 "r_mbytes_per_sec": 0, 00:14:07.101 "w_mbytes_per_sec": 0 00:14:07.101 }, 00:14:07.101 "claimed": true, 00:14:07.101 "claim_type": "exclusive_write", 00:14:07.101 "zoned": false, 00:14:07.101 "supported_io_types": { 00:14:07.101 "read": true, 00:14:07.101 "write": true, 00:14:07.101 "unmap": true, 
00:14:07.101 "flush": true, 00:14:07.101 "reset": true, 00:14:07.101 "nvme_admin": false, 00:14:07.101 "nvme_io": false, 00:14:07.101 "nvme_io_md": false, 00:14:07.101 "write_zeroes": true, 00:14:07.101 "zcopy": true, 00:14:07.101 "get_zone_info": false, 00:14:07.101 "zone_management": false, 00:14:07.101 "zone_append": false, 00:14:07.101 "compare": false, 00:14:07.101 "compare_and_write": false, 00:14:07.101 "abort": true, 00:14:07.101 "seek_hole": false, 00:14:07.101 "seek_data": false, 00:14:07.101 "copy": true, 00:14:07.101 "nvme_iov_md": false 00:14:07.101 }, 00:14:07.101 "memory_domains": [ 00:14:07.101 { 00:14:07.101 "dma_device_id": "system", 00:14:07.101 "dma_device_type": 1 00:14:07.101 }, 00:14:07.101 { 00:14:07.101 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.101 "dma_device_type": 2 00:14:07.101 } 00:14:07.101 ], 00:14:07.101 "driver_specific": {} 00:14:07.101 }' 00:14:07.101 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.101 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.101 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:07.101 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.101 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.101 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:07.101 00:08:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.101 00:08:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.359 00:08:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:07.359 00:08:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.359 00:08:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.359 00:08:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:07.359 00:08:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:07.617 [2024-07-16 00:08:54.374456] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:07.617 [2024-07-16 00:08:54.374488] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:07.617 [2024-07-16 00:08:54.374543] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:07.617 [2024-07-16 00:08:54.374591] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:07.617 [2024-07-16 00:08:54.374603] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x888e90 name Existed_Raid, state offline 00:14:07.617 00:08:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 3513341 00:14:07.617 00:08:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3513341 ']' 00:14:07.617 00:08:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 3513341 00:14:07.617 00:08:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:14:07.617 00:08:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:07.617 00:08:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3513341 00:14:07.617 00:08:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:07.617 00:08:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:14:07.617 00:08:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3513341' 00:14:07.617 killing process with pid 3513341 00:14:07.617 00:08:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 3513341 00:14:07.617 [2024-07-16 00:08:54.458241] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:07.617 00:08:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 3513341 00:14:07.617 [2024-07-16 00:08:54.485543] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:07.877 00:08:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:07.877 00:14:07.877 real 0m28.795s 00:14:07.877 user 0m52.759s 00:14:07.877 sys 0m5.191s 00:14:07.877 00:08:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:07.877 00:08:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:07.877 ************************************ 00:14:07.877 END TEST raid_state_function_test_sb 00:14:07.877 ************************************ 00:14:07.877 00:08:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:07.877 00:08:54 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:14:07.877 00:08:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:14:07.877 00:08:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:07.877 00:08:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:07.877 ************************************ 00:14:07.877 START TEST raid_superblock_test 00:14:07.877 ************************************ 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # 
local raid_level=raid0 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=3517755 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 3517755 /var/tmp/spdk-raid.sock 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 3517755 ']' 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:07.877 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:07.877 00:08:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:08.136 [2024-07-16 00:08:54.861570] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:14:08.136 [2024-07-16 00:08:54.861641] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3517755 ] 00:14:08.136 [2024-07-16 00:08:54.992400] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:08.394 [2024-07-16 00:08:55.094089] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:08.394 [2024-07-16 00:08:55.157647] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:08.394 [2024-07-16 00:08:55.157687] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:08.959 00:08:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:08.959 00:08:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:14:08.959 00:08:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:08.959 00:08:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:08.959 00:08:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:08.959 00:08:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:08.959 00:08:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:08.959 00:08:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:08.959 00:08:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:08.959 00:08:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:08.959 00:08:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:09.217 malloc1 00:14:09.217 00:08:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:09.476 [2024-07-16 00:08:56.284330] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:09.476 [2024-07-16 00:08:56.284381] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:09.476 [2024-07-16 00:08:56.284400] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc2e570 00:14:09.476 [2024-07-16 00:08:56.284412] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:09.476 [2024-07-16 00:08:56.285977] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:09.476 [2024-07-16 00:08:56.286009] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:09.476 pt1 00:14:09.476 00:08:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:09.476 00:08:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:09.476 00:08:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:09.476 00:08:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:09.476 00:08:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:09.476 00:08:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:09.476 00:08:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:09.476 00:08:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:09.476 00:08:56 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:09.733 malloc2 00:14:09.733 00:08:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:09.991 [2024-07-16 00:08:56.786355] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:09.991 [2024-07-16 00:08:56.786403] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:09.991 [2024-07-16 00:08:56.786419] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc2f970 00:14:09.991 [2024-07-16 00:08:56.786432] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:09.991 [2024-07-16 00:08:56.787873] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:09.991 [2024-07-16 00:08:56.787902] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:09.991 pt2 00:14:09.991 00:08:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:09.991 00:08:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:09.991 00:08:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:14:09.991 00:08:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:14:09.991 00:08:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:09.992 00:08:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:09.992 00:08:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:09.992 00:08:56 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:09.992 00:08:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:10.250 malloc3 00:14:10.250 00:08:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:10.508 [2024-07-16 00:08:57.304434] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:10.508 [2024-07-16 00:08:57.304482] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:10.508 [2024-07-16 00:08:57.304500] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdc6340 00:14:10.508 [2024-07-16 00:08:57.304513] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:10.508 [2024-07-16 00:08:57.306042] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:10.508 [2024-07-16 00:08:57.306071] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:10.508 pt3 00:14:10.508 00:08:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:10.508 00:08:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:10.508 00:08:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:10.766 [2024-07-16 00:08:57.565141] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:10.766 [2024-07-16 00:08:57.566396] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:14:10.766 [2024-07-16 00:08:57.566449] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:10.766 [2024-07-16 00:08:57.566601] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc26ea0 00:14:10.766 [2024-07-16 00:08:57.566612] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:10.766 [2024-07-16 00:08:57.566811] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc2e240 00:14:10.766 [2024-07-16 00:08:57.566961] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc26ea0 00:14:10.766 [2024-07-16 00:08:57.566972] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc26ea0 00:14:10.766 [2024-07-16 00:08:57.567068] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:10.766 00:08:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:10.766 00:08:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:10.766 00:08:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:10.766 00:08:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:10.766 00:08:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:10.766 00:08:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:10.766 00:08:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:10.766 00:08:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:10.766 00:08:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:10.766 00:08:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:10.766 00:08:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:10.766 00:08:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.026 00:08:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:11.026 "name": "raid_bdev1", 00:14:11.026 "uuid": "9dff35d8-c1ed-4f45-b278-9e0f6ae3c809", 00:14:11.026 "strip_size_kb": 64, 00:14:11.026 "state": "online", 00:14:11.026 "raid_level": "raid0", 00:14:11.026 "superblock": true, 00:14:11.026 "num_base_bdevs": 3, 00:14:11.026 "num_base_bdevs_discovered": 3, 00:14:11.026 "num_base_bdevs_operational": 3, 00:14:11.026 "base_bdevs_list": [ 00:14:11.026 { 00:14:11.026 "name": "pt1", 00:14:11.026 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:11.026 "is_configured": true, 00:14:11.026 "data_offset": 2048, 00:14:11.026 "data_size": 63488 00:14:11.026 }, 00:14:11.026 { 00:14:11.026 "name": "pt2", 00:14:11.026 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:11.026 "is_configured": true, 00:14:11.026 "data_offset": 2048, 00:14:11.026 "data_size": 63488 00:14:11.026 }, 00:14:11.026 { 00:14:11.026 "name": "pt3", 00:14:11.026 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:11.026 "is_configured": true, 00:14:11.026 "data_offset": 2048, 00:14:11.026 "data_size": 63488 00:14:11.026 } 00:14:11.026 ] 00:14:11.026 }' 00:14:11.026 00:08:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:11.026 00:08:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:11.593 00:08:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:11.593 00:08:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:11.593 00:08:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:14:11.593 00:08:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:11.593 00:08:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:11.593 00:08:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:11.593 00:08:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:11.593 00:08:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:11.850 [2024-07-16 00:08:58.668316] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:11.850 00:08:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:11.850 "name": "raid_bdev1", 00:14:11.850 "aliases": [ 00:14:11.850 "9dff35d8-c1ed-4f45-b278-9e0f6ae3c809" 00:14:11.850 ], 00:14:11.850 "product_name": "Raid Volume", 00:14:11.850 "block_size": 512, 00:14:11.850 "num_blocks": 190464, 00:14:11.850 "uuid": "9dff35d8-c1ed-4f45-b278-9e0f6ae3c809", 00:14:11.850 "assigned_rate_limits": { 00:14:11.850 "rw_ios_per_sec": 0, 00:14:11.850 "rw_mbytes_per_sec": 0, 00:14:11.850 "r_mbytes_per_sec": 0, 00:14:11.850 "w_mbytes_per_sec": 0 00:14:11.850 }, 00:14:11.850 "claimed": false, 00:14:11.850 "zoned": false, 00:14:11.850 "supported_io_types": { 00:14:11.850 "read": true, 00:14:11.850 "write": true, 00:14:11.851 "unmap": true, 00:14:11.851 "flush": true, 00:14:11.851 "reset": true, 00:14:11.851 "nvme_admin": false, 00:14:11.851 "nvme_io": false, 00:14:11.851 "nvme_io_md": false, 00:14:11.851 "write_zeroes": true, 00:14:11.851 "zcopy": false, 00:14:11.851 "get_zone_info": false, 00:14:11.851 "zone_management": false, 00:14:11.851 "zone_append": false, 00:14:11.851 "compare": false, 00:14:11.851 "compare_and_write": false, 00:14:11.851 "abort": false, 00:14:11.851 "seek_hole": false, 00:14:11.851 
"seek_data": false, 00:14:11.851 "copy": false, 00:14:11.851 "nvme_iov_md": false 00:14:11.851 }, 00:14:11.851 "memory_domains": [ 00:14:11.851 { 00:14:11.851 "dma_device_id": "system", 00:14:11.851 "dma_device_type": 1 00:14:11.851 }, 00:14:11.851 { 00:14:11.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.851 "dma_device_type": 2 00:14:11.851 }, 00:14:11.851 { 00:14:11.851 "dma_device_id": "system", 00:14:11.851 "dma_device_type": 1 00:14:11.851 }, 00:14:11.851 { 00:14:11.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.851 "dma_device_type": 2 00:14:11.851 }, 00:14:11.851 { 00:14:11.851 "dma_device_id": "system", 00:14:11.851 "dma_device_type": 1 00:14:11.851 }, 00:14:11.851 { 00:14:11.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.851 "dma_device_type": 2 00:14:11.851 } 00:14:11.851 ], 00:14:11.851 "driver_specific": { 00:14:11.851 "raid": { 00:14:11.851 "uuid": "9dff35d8-c1ed-4f45-b278-9e0f6ae3c809", 00:14:11.851 "strip_size_kb": 64, 00:14:11.851 "state": "online", 00:14:11.851 "raid_level": "raid0", 00:14:11.851 "superblock": true, 00:14:11.851 "num_base_bdevs": 3, 00:14:11.851 "num_base_bdevs_discovered": 3, 00:14:11.851 "num_base_bdevs_operational": 3, 00:14:11.851 "base_bdevs_list": [ 00:14:11.851 { 00:14:11.851 "name": "pt1", 00:14:11.851 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:11.851 "is_configured": true, 00:14:11.851 "data_offset": 2048, 00:14:11.851 "data_size": 63488 00:14:11.851 }, 00:14:11.851 { 00:14:11.851 "name": "pt2", 00:14:11.851 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:11.851 "is_configured": true, 00:14:11.851 "data_offset": 2048, 00:14:11.851 "data_size": 63488 00:14:11.851 }, 00:14:11.851 { 00:14:11.851 "name": "pt3", 00:14:11.851 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:11.851 "is_configured": true, 00:14:11.851 "data_offset": 2048, 00:14:11.851 "data_size": 63488 00:14:11.851 } 00:14:11.851 ] 00:14:11.851 } 00:14:11.851 } 00:14:11.851 }' 00:14:11.851 00:08:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:11.851 00:08:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:11.851 pt2 00:14:11.851 pt3' 00:14:11.851 00:08:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:11.851 00:08:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:11.851 00:08:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:12.108 00:08:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:12.108 "name": "pt1", 00:14:12.108 "aliases": [ 00:14:12.108 "00000000-0000-0000-0000-000000000001" 00:14:12.108 ], 00:14:12.108 "product_name": "passthru", 00:14:12.108 "block_size": 512, 00:14:12.108 "num_blocks": 65536, 00:14:12.108 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:12.108 "assigned_rate_limits": { 00:14:12.108 "rw_ios_per_sec": 0, 00:14:12.108 "rw_mbytes_per_sec": 0, 00:14:12.108 "r_mbytes_per_sec": 0, 00:14:12.108 "w_mbytes_per_sec": 0 00:14:12.108 }, 00:14:12.108 "claimed": true, 00:14:12.108 "claim_type": "exclusive_write", 00:14:12.108 "zoned": false, 00:14:12.108 "supported_io_types": { 00:14:12.108 "read": true, 00:14:12.108 "write": true, 00:14:12.108 "unmap": true, 00:14:12.108 "flush": true, 00:14:12.108 "reset": true, 00:14:12.108 "nvme_admin": false, 00:14:12.108 "nvme_io": false, 00:14:12.108 "nvme_io_md": false, 00:14:12.108 "write_zeroes": true, 00:14:12.108 "zcopy": true, 00:14:12.108 "get_zone_info": false, 00:14:12.108 "zone_management": false, 00:14:12.108 "zone_append": false, 00:14:12.108 "compare": false, 00:14:12.108 "compare_and_write": false, 00:14:12.108 "abort": true, 00:14:12.108 "seek_hole": false, 00:14:12.108 "seek_data": false, 
00:14:12.108 "copy": true, 00:14:12.108 "nvme_iov_md": false 00:14:12.108 }, 00:14:12.108 "memory_domains": [ 00:14:12.108 { 00:14:12.108 "dma_device_id": "system", 00:14:12.108 "dma_device_type": 1 00:14:12.108 }, 00:14:12.108 { 00:14:12.108 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.108 "dma_device_type": 2 00:14:12.108 } 00:14:12.108 ], 00:14:12.108 "driver_specific": { 00:14:12.108 "passthru": { 00:14:12.108 "name": "pt1", 00:14:12.108 "base_bdev_name": "malloc1" 00:14:12.108 } 00:14:12.108 } 00:14:12.108 }' 00:14:12.108 00:08:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.108 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.366 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:12.366 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.366 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.366 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:12.366 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.366 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.366 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:12.366 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.366 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.624 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:12.624 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:12.624 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt2 00:14:12.624 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:12.882 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:12.882 "name": "pt2", 00:14:12.882 "aliases": [ 00:14:12.882 "00000000-0000-0000-0000-000000000002" 00:14:12.882 ], 00:14:12.882 "product_name": "passthru", 00:14:12.882 "block_size": 512, 00:14:12.882 "num_blocks": 65536, 00:14:12.882 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:12.882 "assigned_rate_limits": { 00:14:12.882 "rw_ios_per_sec": 0, 00:14:12.882 "rw_mbytes_per_sec": 0, 00:14:12.882 "r_mbytes_per_sec": 0, 00:14:12.882 "w_mbytes_per_sec": 0 00:14:12.882 }, 00:14:12.882 "claimed": true, 00:14:12.882 "claim_type": "exclusive_write", 00:14:12.882 "zoned": false, 00:14:12.882 "supported_io_types": { 00:14:12.882 "read": true, 00:14:12.882 "write": true, 00:14:12.882 "unmap": true, 00:14:12.882 "flush": true, 00:14:12.882 "reset": true, 00:14:12.882 "nvme_admin": false, 00:14:12.882 "nvme_io": false, 00:14:12.882 "nvme_io_md": false, 00:14:12.882 "write_zeroes": true, 00:14:12.882 "zcopy": true, 00:14:12.882 "get_zone_info": false, 00:14:12.882 "zone_management": false, 00:14:12.882 "zone_append": false, 00:14:12.882 "compare": false, 00:14:12.882 "compare_and_write": false, 00:14:12.882 "abort": true, 00:14:12.882 "seek_hole": false, 00:14:12.882 "seek_data": false, 00:14:12.882 "copy": true, 00:14:12.882 "nvme_iov_md": false 00:14:12.882 }, 00:14:12.882 "memory_domains": [ 00:14:12.882 { 00:14:12.882 "dma_device_id": "system", 00:14:12.882 "dma_device_type": 1 00:14:12.882 }, 00:14:12.882 { 00:14:12.882 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.882 "dma_device_type": 2 00:14:12.882 } 00:14:12.882 ], 00:14:12.882 "driver_specific": { 00:14:12.882 "passthru": { 00:14:12.882 "name": "pt2", 00:14:12.882 "base_bdev_name": "malloc2" 00:14:12.882 } 00:14:12.882 } 00:14:12.882 }' 00:14:12.882 00:08:59 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.882 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.882 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:12.882 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.882 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.882 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:12.882 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.882 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.140 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:13.140 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.140 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.140 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:13.140 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:13.140 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:13.140 00:08:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:13.398 00:09:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:13.398 "name": "pt3", 00:14:13.398 "aliases": [ 00:14:13.398 "00000000-0000-0000-0000-000000000003" 00:14:13.398 ], 00:14:13.399 "product_name": "passthru", 00:14:13.399 "block_size": 512, 00:14:13.399 "num_blocks": 65536, 00:14:13.399 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:13.399 "assigned_rate_limits": { 
00:14:13.399 "rw_ios_per_sec": 0, 00:14:13.399 "rw_mbytes_per_sec": 0, 00:14:13.399 "r_mbytes_per_sec": 0, 00:14:13.399 "w_mbytes_per_sec": 0 00:14:13.399 }, 00:14:13.399 "claimed": true, 00:14:13.399 "claim_type": "exclusive_write", 00:14:13.399 "zoned": false, 00:14:13.399 "supported_io_types": { 00:14:13.399 "read": true, 00:14:13.399 "write": true, 00:14:13.399 "unmap": true, 00:14:13.399 "flush": true, 00:14:13.399 "reset": true, 00:14:13.399 "nvme_admin": false, 00:14:13.399 "nvme_io": false, 00:14:13.399 "nvme_io_md": false, 00:14:13.399 "write_zeroes": true, 00:14:13.399 "zcopy": true, 00:14:13.399 "get_zone_info": false, 00:14:13.399 "zone_management": false, 00:14:13.399 "zone_append": false, 00:14:13.399 "compare": false, 00:14:13.399 "compare_and_write": false, 00:14:13.399 "abort": true, 00:14:13.399 "seek_hole": false, 00:14:13.399 "seek_data": false, 00:14:13.399 "copy": true, 00:14:13.399 "nvme_iov_md": false 00:14:13.399 }, 00:14:13.399 "memory_domains": [ 00:14:13.399 { 00:14:13.399 "dma_device_id": "system", 00:14:13.399 "dma_device_type": 1 00:14:13.399 }, 00:14:13.399 { 00:14:13.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.399 "dma_device_type": 2 00:14:13.399 } 00:14:13.399 ], 00:14:13.399 "driver_specific": { 00:14:13.399 "passthru": { 00:14:13.399 "name": "pt3", 00:14:13.399 "base_bdev_name": "malloc3" 00:14:13.399 } 00:14:13.399 } 00:14:13.399 }' 00:14:13.399 00:09:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.399 00:09:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.399 00:09:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:13.399 00:09:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.399 00:09:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.658 00:09:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:14:13.658 00:09:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.658 00:09:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.658 00:09:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:13.658 00:09:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.658 00:09:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.658 00:09:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:13.658 00:09:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:13.658 00:09:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:13.916 [2024-07-16 00:09:00.797969] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:13.916 00:09:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=9dff35d8-c1ed-4f45-b278-9e0f6ae3c809 00:14:13.916 00:09:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 9dff35d8-c1ed-4f45-b278-9e0f6ae3c809 ']' 00:14:13.916 00:09:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:14.176 [2024-07-16 00:09:01.062377] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:14.177 [2024-07-16 00:09:01.062399] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:14.177 [2024-07-16 00:09:01.062450] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:14.177 [2024-07-16 00:09:01.062501] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:14:14.177 [2024-07-16 00:09:01.062513] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc26ea0 name raid_bdev1, state offline 00:14:14.177 00:09:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.177 00:09:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:14.434 00:09:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:14.434 00:09:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:14.434 00:09:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:14.434 00:09:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:14.692 00:09:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:14.692 00:09:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:14.950 00:09:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:14.950 00:09:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:15.208 00:09:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:15.208 00:09:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:15.470 00:09:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:14:15.470 00:09:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:15.470 00:09:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:14:15.470 00:09:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:15.470 00:09:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:15.470 00:09:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:15.470 00:09:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:15.470 00:09:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:15.470 00:09:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:15.470 00:09:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:15.470 00:09:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:15.470 00:09:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:15.470 00:09:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:15.785 [2024-07-16 00:09:02.574313] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:15.785 [2024-07-16 00:09:02.575655] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:15.785 [2024-07-16 00:09:02.575699] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:15.785 [2024-07-16 00:09:02.575744] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:15.785 [2024-07-16 00:09:02.575784] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:15.785 [2024-07-16 00:09:02.575806] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:15.785 [2024-07-16 00:09:02.575824] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:15.785 [2024-07-16 00:09:02.575834] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdd1ff0 name raid_bdev1, state configuring 00:14:15.785 request: 00:14:15.785 { 00:14:15.785 "name": "raid_bdev1", 00:14:15.785 "raid_level": "raid0", 00:14:15.785 "base_bdevs": [ 00:14:15.785 "malloc1", 00:14:15.785 "malloc2", 00:14:15.785 "malloc3" 00:14:15.785 ], 00:14:15.785 "strip_size_kb": 64, 00:14:15.785 "superblock": false, 00:14:15.785 "method": "bdev_raid_create", 00:14:15.785 "req_id": 1 00:14:15.785 } 00:14:15.785 Got JSON-RPC error response 00:14:15.785 response: 00:14:15.785 { 00:14:15.785 "code": -17, 00:14:15.785 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:15.785 } 00:14:15.785 00:09:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:14:15.785 00:09:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:14:15.785 00:09:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:14:15.785 00:09:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:14:15.786 00:09:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.786 00:09:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:14:16.044 00:09:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:14:16.044 00:09:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:14:16.044 00:09:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:16.303 [2024-07-16 00:09:03.087605] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:16.303 [2024-07-16 00:09:03.087649] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:16.303 [2024-07-16 00:09:03.087668] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc2e7a0 00:14:16.303 [2024-07-16 00:09:03.087681] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:16.303 [2024-07-16 00:09:03.089255] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:16.303 [2024-07-16 00:09:03.089285] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:16.303 [2024-07-16 00:09:03.089352] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:16.303 [2024-07-16 00:09:03.089376] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:16.303 pt1 00:14:16.303 00:09:03 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:16.303 00:09:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:16.303 00:09:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:16.303 00:09:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:16.303 00:09:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:16.303 00:09:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:16.303 00:09:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:16.303 00:09:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:16.303 00:09:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:16.303 00:09:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:16.303 00:09:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.303 00:09:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:16.561 00:09:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:16.561 "name": "raid_bdev1", 00:14:16.561 "uuid": "9dff35d8-c1ed-4f45-b278-9e0f6ae3c809", 00:14:16.561 "strip_size_kb": 64, 00:14:16.561 "state": "configuring", 00:14:16.561 "raid_level": "raid0", 00:14:16.561 "superblock": true, 00:14:16.561 "num_base_bdevs": 3, 00:14:16.561 "num_base_bdevs_discovered": 1, 00:14:16.561 "num_base_bdevs_operational": 3, 00:14:16.561 "base_bdevs_list": [ 00:14:16.561 { 00:14:16.561 "name": "pt1", 00:14:16.561 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:16.561 
"is_configured": true, 00:14:16.561 "data_offset": 2048, 00:14:16.561 "data_size": 63488 00:14:16.561 }, 00:14:16.561 { 00:14:16.561 "name": null, 00:14:16.561 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:16.561 "is_configured": false, 00:14:16.561 "data_offset": 2048, 00:14:16.561 "data_size": 63488 00:14:16.561 }, 00:14:16.561 { 00:14:16.561 "name": null, 00:14:16.561 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:16.561 "is_configured": false, 00:14:16.561 "data_offset": 2048, 00:14:16.561 "data_size": 63488 00:14:16.561 } 00:14:16.561 ] 00:14:16.561 }' 00:14:16.561 00:09:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:16.561 00:09:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:17.127 00:09:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:14:17.127 00:09:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:17.386 [2024-07-16 00:09:04.182529] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:17.386 [2024-07-16 00:09:04.182582] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:17.386 [2024-07-16 00:09:04.182601] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc25c70 00:14:17.386 [2024-07-16 00:09:04.182614] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:17.386 [2024-07-16 00:09:04.182975] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:17.386 [2024-07-16 00:09:04.182996] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:17.386 [2024-07-16 00:09:04.183060] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:17.386 [2024-07-16 
00:09:04.183078] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:17.386 pt2 00:14:17.386 00:09:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:17.646 [2024-07-16 00:09:04.439213] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:17.646 00:09:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:17.646 00:09:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:17.646 00:09:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:17.646 00:09:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:17.646 00:09:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:17.646 00:09:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:17.646 00:09:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:17.646 00:09:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:17.646 00:09:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:17.646 00:09:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:17.646 00:09:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:17.646 00:09:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:17.906 00:09:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:17.906 "name": "raid_bdev1", 00:14:17.906 
"uuid": "9dff35d8-c1ed-4f45-b278-9e0f6ae3c809", 00:14:17.906 "strip_size_kb": 64, 00:14:17.906 "state": "configuring", 00:14:17.906 "raid_level": "raid0", 00:14:17.906 "superblock": true, 00:14:17.906 "num_base_bdevs": 3, 00:14:17.906 "num_base_bdevs_discovered": 1, 00:14:17.906 "num_base_bdevs_operational": 3, 00:14:17.906 "base_bdevs_list": [ 00:14:17.906 { 00:14:17.906 "name": "pt1", 00:14:17.906 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:17.906 "is_configured": true, 00:14:17.906 "data_offset": 2048, 00:14:17.906 "data_size": 63488 00:14:17.906 }, 00:14:17.906 { 00:14:17.906 "name": null, 00:14:17.906 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:17.906 "is_configured": false, 00:14:17.906 "data_offset": 2048, 00:14:17.906 "data_size": 63488 00:14:17.906 }, 00:14:17.906 { 00:14:17.906 "name": null, 00:14:17.906 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:17.906 "is_configured": false, 00:14:17.906 "data_offset": 2048, 00:14:17.906 "data_size": 63488 00:14:17.906 } 00:14:17.906 ] 00:14:17.906 }' 00:14:17.906 00:09:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:17.906 00:09:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:18.474 00:09:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:14:18.474 00:09:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:18.474 00:09:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:19.041 [2024-07-16 00:09:05.814860] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:19.041 [2024-07-16 00:09:05.814917] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:19.041 [2024-07-16 00:09:05.814943] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdc6fa0 00:14:19.041 [2024-07-16 00:09:05.814957] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:19.041 [2024-07-16 00:09:05.815322] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:19.041 [2024-07-16 00:09:05.815342] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:19.041 [2024-07-16 00:09:05.815410] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:19.041 [2024-07-16 00:09:05.815429] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:19.041 pt2 00:14:19.041 00:09:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:19.041 00:09:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:19.041 00:09:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:19.300 [2024-07-16 00:09:06.075563] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:19.300 [2024-07-16 00:09:06.075603] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:19.300 [2024-07-16 00:09:06.075620] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdc7b30 00:14:19.300 [2024-07-16 00:09:06.075633] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:19.300 [2024-07-16 00:09:06.075959] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:19.300 [2024-07-16 00:09:06.075977] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:19.300 [2024-07-16 00:09:06.076035] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:19.300 
[2024-07-16 00:09:06.076053] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:19.300 [2024-07-16 00:09:06.076159] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdc8c00 00:14:19.300 [2024-07-16 00:09:06.076169] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:19.300 [2024-07-16 00:09:06.076335] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdd19b0 00:14:19.300 [2024-07-16 00:09:06.076456] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdc8c00 00:14:19.300 [2024-07-16 00:09:06.076465] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdc8c00 00:14:19.300 [2024-07-16 00:09:06.076560] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:19.301 pt3 00:14:19.301 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:19.301 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:19.301 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:19.301 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:19.301 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:19.301 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:19.301 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:19.301 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:19.301 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:19.301 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:19.301 00:09:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:19.301 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:19.301 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:19.301 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:19.560 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:19.560 "name": "raid_bdev1", 00:14:19.560 "uuid": "9dff35d8-c1ed-4f45-b278-9e0f6ae3c809", 00:14:19.560 "strip_size_kb": 64, 00:14:19.560 "state": "online", 00:14:19.560 "raid_level": "raid0", 00:14:19.560 "superblock": true, 00:14:19.560 "num_base_bdevs": 3, 00:14:19.560 "num_base_bdevs_discovered": 3, 00:14:19.560 "num_base_bdevs_operational": 3, 00:14:19.560 "base_bdevs_list": [ 00:14:19.560 { 00:14:19.560 "name": "pt1", 00:14:19.560 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:19.560 "is_configured": true, 00:14:19.560 "data_offset": 2048, 00:14:19.560 "data_size": 63488 00:14:19.560 }, 00:14:19.560 { 00:14:19.560 "name": "pt2", 00:14:19.560 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:19.560 "is_configured": true, 00:14:19.560 "data_offset": 2048, 00:14:19.560 "data_size": 63488 00:14:19.560 }, 00:14:19.560 { 00:14:19.560 "name": "pt3", 00:14:19.560 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:19.560 "is_configured": true, 00:14:19.560 "data_offset": 2048, 00:14:19.560 "data_size": 63488 00:14:19.560 } 00:14:19.560 ] 00:14:19.560 }' 00:14:19.560 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:19.560 00:09:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:20.128 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:14:20.128 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:20.128 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:20.128 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:20.128 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:20.128 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:20.128 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:20.128 00:09:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:20.387 [2024-07-16 00:09:07.198821] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:20.387 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:20.387 "name": "raid_bdev1", 00:14:20.387 "aliases": [ 00:14:20.387 "9dff35d8-c1ed-4f45-b278-9e0f6ae3c809" 00:14:20.387 ], 00:14:20.387 "product_name": "Raid Volume", 00:14:20.387 "block_size": 512, 00:14:20.387 "num_blocks": 190464, 00:14:20.387 "uuid": "9dff35d8-c1ed-4f45-b278-9e0f6ae3c809", 00:14:20.387 "assigned_rate_limits": { 00:14:20.387 "rw_ios_per_sec": 0, 00:14:20.387 "rw_mbytes_per_sec": 0, 00:14:20.387 "r_mbytes_per_sec": 0, 00:14:20.387 "w_mbytes_per_sec": 0 00:14:20.387 }, 00:14:20.387 "claimed": false, 00:14:20.387 "zoned": false, 00:14:20.387 "supported_io_types": { 00:14:20.387 "read": true, 00:14:20.387 "write": true, 00:14:20.387 "unmap": true, 00:14:20.387 "flush": true, 00:14:20.387 "reset": true, 00:14:20.387 "nvme_admin": false, 00:14:20.387 "nvme_io": false, 00:14:20.387 "nvme_io_md": false, 00:14:20.387 "write_zeroes": true, 00:14:20.387 "zcopy": false, 00:14:20.387 
"get_zone_info": false, 00:14:20.387 "zone_management": false, 00:14:20.387 "zone_append": false, 00:14:20.387 "compare": false, 00:14:20.387 "compare_and_write": false, 00:14:20.387 "abort": false, 00:14:20.387 "seek_hole": false, 00:14:20.387 "seek_data": false, 00:14:20.387 "copy": false, 00:14:20.387 "nvme_iov_md": false 00:14:20.387 }, 00:14:20.387 "memory_domains": [ 00:14:20.387 { 00:14:20.387 "dma_device_id": "system", 00:14:20.387 "dma_device_type": 1 00:14:20.387 }, 00:14:20.387 { 00:14:20.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.387 "dma_device_type": 2 00:14:20.387 }, 00:14:20.387 { 00:14:20.387 "dma_device_id": "system", 00:14:20.387 "dma_device_type": 1 00:14:20.387 }, 00:14:20.387 { 00:14:20.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.387 "dma_device_type": 2 00:14:20.387 }, 00:14:20.387 { 00:14:20.387 "dma_device_id": "system", 00:14:20.387 "dma_device_type": 1 00:14:20.387 }, 00:14:20.387 { 00:14:20.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.387 "dma_device_type": 2 00:14:20.387 } 00:14:20.387 ], 00:14:20.387 "driver_specific": { 00:14:20.387 "raid": { 00:14:20.387 "uuid": "9dff35d8-c1ed-4f45-b278-9e0f6ae3c809", 00:14:20.387 "strip_size_kb": 64, 00:14:20.387 "state": "online", 00:14:20.388 "raid_level": "raid0", 00:14:20.388 "superblock": true, 00:14:20.388 "num_base_bdevs": 3, 00:14:20.388 "num_base_bdevs_discovered": 3, 00:14:20.388 "num_base_bdevs_operational": 3, 00:14:20.388 "base_bdevs_list": [ 00:14:20.388 { 00:14:20.388 "name": "pt1", 00:14:20.388 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:20.388 "is_configured": true, 00:14:20.388 "data_offset": 2048, 00:14:20.388 "data_size": 63488 00:14:20.388 }, 00:14:20.388 { 00:14:20.388 "name": "pt2", 00:14:20.388 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:20.388 "is_configured": true, 00:14:20.388 "data_offset": 2048, 00:14:20.388 "data_size": 63488 00:14:20.388 }, 00:14:20.388 { 00:14:20.388 "name": "pt3", 00:14:20.388 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:14:20.388 "is_configured": true, 00:14:20.388 "data_offset": 2048, 00:14:20.388 "data_size": 63488 00:14:20.388 } 00:14:20.388 ] 00:14:20.388 } 00:14:20.388 } 00:14:20.388 }' 00:14:20.388 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:20.388 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:20.388 pt2 00:14:20.388 pt3' 00:14:20.388 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:20.388 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:20.388 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:20.648 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:20.648 "name": "pt1", 00:14:20.648 "aliases": [ 00:14:20.648 "00000000-0000-0000-0000-000000000001" 00:14:20.648 ], 00:14:20.648 "product_name": "passthru", 00:14:20.648 "block_size": 512, 00:14:20.648 "num_blocks": 65536, 00:14:20.648 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:20.648 "assigned_rate_limits": { 00:14:20.648 "rw_ios_per_sec": 0, 00:14:20.648 "rw_mbytes_per_sec": 0, 00:14:20.648 "r_mbytes_per_sec": 0, 00:14:20.648 "w_mbytes_per_sec": 0 00:14:20.648 }, 00:14:20.648 "claimed": true, 00:14:20.648 "claim_type": "exclusive_write", 00:14:20.648 "zoned": false, 00:14:20.648 "supported_io_types": { 00:14:20.648 "read": true, 00:14:20.648 "write": true, 00:14:20.648 "unmap": true, 00:14:20.648 "flush": true, 00:14:20.648 "reset": true, 00:14:20.648 "nvme_admin": false, 00:14:20.648 "nvme_io": false, 00:14:20.648 "nvme_io_md": false, 00:14:20.648 "write_zeroes": true, 00:14:20.648 "zcopy": true, 00:14:20.648 "get_zone_info": false, 
00:14:20.648 "zone_management": false, 00:14:20.648 "zone_append": false, 00:14:20.648 "compare": false, 00:14:20.648 "compare_and_write": false, 00:14:20.648 "abort": true, 00:14:20.648 "seek_hole": false, 00:14:20.648 "seek_data": false, 00:14:20.648 "copy": true, 00:14:20.648 "nvme_iov_md": false 00:14:20.648 }, 00:14:20.648 "memory_domains": [ 00:14:20.648 { 00:14:20.648 "dma_device_id": "system", 00:14:20.648 "dma_device_type": 1 00:14:20.648 }, 00:14:20.648 { 00:14:20.648 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.648 "dma_device_type": 2 00:14:20.648 } 00:14:20.648 ], 00:14:20.648 "driver_specific": { 00:14:20.648 "passthru": { 00:14:20.648 "name": "pt1", 00:14:20.648 "base_bdev_name": "malloc1" 00:14:20.648 } 00:14:20.648 } 00:14:20.648 }' 00:14:20.648 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:20.648 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:20.907 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:20.907 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:20.907 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:20.907 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:20.907 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:20.907 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:20.907 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:20.907 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:20.907 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:20.907 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:20.908 00:09:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:20.908 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:20.908 00:09:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:21.167 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:21.167 "name": "pt2", 00:14:21.167 "aliases": [ 00:14:21.167 "00000000-0000-0000-0000-000000000002" 00:14:21.167 ], 00:14:21.167 "product_name": "passthru", 00:14:21.167 "block_size": 512, 00:14:21.167 "num_blocks": 65536, 00:14:21.167 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:21.167 "assigned_rate_limits": { 00:14:21.167 "rw_ios_per_sec": 0, 00:14:21.167 "rw_mbytes_per_sec": 0, 00:14:21.167 "r_mbytes_per_sec": 0, 00:14:21.167 "w_mbytes_per_sec": 0 00:14:21.167 }, 00:14:21.167 "claimed": true, 00:14:21.167 "claim_type": "exclusive_write", 00:14:21.167 "zoned": false, 00:14:21.167 "supported_io_types": { 00:14:21.167 "read": true, 00:14:21.167 "write": true, 00:14:21.167 "unmap": true, 00:14:21.167 "flush": true, 00:14:21.167 "reset": true, 00:14:21.167 "nvme_admin": false, 00:14:21.167 "nvme_io": false, 00:14:21.167 "nvme_io_md": false, 00:14:21.167 "write_zeroes": true, 00:14:21.167 "zcopy": true, 00:14:21.167 "get_zone_info": false, 00:14:21.167 "zone_management": false, 00:14:21.167 "zone_append": false, 00:14:21.167 "compare": false, 00:14:21.167 "compare_and_write": false, 00:14:21.167 "abort": true, 00:14:21.167 "seek_hole": false, 00:14:21.167 "seek_data": false, 00:14:21.167 "copy": true, 00:14:21.167 "nvme_iov_md": false 00:14:21.167 }, 00:14:21.167 "memory_domains": [ 00:14:21.167 { 00:14:21.167 "dma_device_id": "system", 00:14:21.167 "dma_device_type": 1 00:14:21.167 }, 00:14:21.167 { 00:14:21.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:21.167 
"dma_device_type": 2 00:14:21.167 } 00:14:21.167 ], 00:14:21.167 "driver_specific": { 00:14:21.167 "passthru": { 00:14:21.167 "name": "pt2", 00:14:21.167 "base_bdev_name": "malloc2" 00:14:21.167 } 00:14:21.167 } 00:14:21.167 }' 00:14:21.167 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:21.426 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:21.426 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:21.426 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:21.426 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:21.426 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:21.426 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:21.426 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:21.426 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:21.685 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:21.685 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:21.685 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:21.685 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:21.685 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:21.685 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:21.944 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:21.944 "name": "pt3", 00:14:21.944 "aliases": [ 00:14:21.944 
"00000000-0000-0000-0000-000000000003" 00:14:21.944 ], 00:14:21.944 "product_name": "passthru", 00:14:21.944 "block_size": 512, 00:14:21.944 "num_blocks": 65536, 00:14:21.944 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:21.944 "assigned_rate_limits": { 00:14:21.944 "rw_ios_per_sec": 0, 00:14:21.944 "rw_mbytes_per_sec": 0, 00:14:21.944 "r_mbytes_per_sec": 0, 00:14:21.944 "w_mbytes_per_sec": 0 00:14:21.944 }, 00:14:21.944 "claimed": true, 00:14:21.944 "claim_type": "exclusive_write", 00:14:21.944 "zoned": false, 00:14:21.944 "supported_io_types": { 00:14:21.944 "read": true, 00:14:21.944 "write": true, 00:14:21.944 "unmap": true, 00:14:21.944 "flush": true, 00:14:21.944 "reset": true, 00:14:21.944 "nvme_admin": false, 00:14:21.944 "nvme_io": false, 00:14:21.944 "nvme_io_md": false, 00:14:21.944 "write_zeroes": true, 00:14:21.944 "zcopy": true, 00:14:21.944 "get_zone_info": false, 00:14:21.944 "zone_management": false, 00:14:21.944 "zone_append": false, 00:14:21.944 "compare": false, 00:14:21.944 "compare_and_write": false, 00:14:21.944 "abort": true, 00:14:21.944 "seek_hole": false, 00:14:21.944 "seek_data": false, 00:14:21.944 "copy": true, 00:14:21.944 "nvme_iov_md": false 00:14:21.944 }, 00:14:21.944 "memory_domains": [ 00:14:21.944 { 00:14:21.944 "dma_device_id": "system", 00:14:21.944 "dma_device_type": 1 00:14:21.944 }, 00:14:21.944 { 00:14:21.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:21.944 "dma_device_type": 2 00:14:21.944 } 00:14:21.944 ], 00:14:21.944 "driver_specific": { 00:14:21.944 "passthru": { 00:14:21.944 "name": "pt3", 00:14:21.944 "base_bdev_name": "malloc3" 00:14:21.944 } 00:14:21.944 } 00:14:21.944 }' 00:14:21.944 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:21.944 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:21.944 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:21.944 00:09:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:21.944 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:22.203 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:22.203 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:22.203 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:22.203 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:22.203 00:09:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:22.203 00:09:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:22.203 00:09:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:22.203 00:09:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:22.203 00:09:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:14:22.494 [2024-07-16 00:09:09.304386] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:22.494 00:09:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 9dff35d8-c1ed-4f45-b278-9e0f6ae3c809 '!=' 9dff35d8-c1ed-4f45-b278-9e0f6ae3c809 ']' 00:14:22.494 00:09:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:14:22.494 00:09:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:22.494 00:09:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:22.494 00:09:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 3517755 00:14:22.494 00:09:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 3517755 ']' 00:14:22.494 00:09:09 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 3517755 00:14:22.494 00:09:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:14:22.494 00:09:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:22.494 00:09:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3517755 00:14:22.494 00:09:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:22.494 00:09:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:22.494 00:09:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3517755' 00:14:22.494 killing process with pid 3517755 00:14:22.494 00:09:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 3517755 00:14:22.494 [2024-07-16 00:09:09.391777] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:22.494 [2024-07-16 00:09:09.391831] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:22.494 [2024-07-16 00:09:09.391880] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:22.494 [2024-07-16 00:09:09.391892] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdc8c00 name raid_bdev1, state offline 00:14:22.494 00:09:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 3517755 00:14:22.754 [2024-07-16 00:09:09.420298] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:22.754 00:09:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:14:22.754 00:14:22.754 real 0m14.846s 00:14:22.754 user 0m26.719s 00:14:22.754 sys 0m2.699s 00:14:22.754 00:09:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:22.754 00:09:09 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:22.754 ************************************ 00:14:22.754 END TEST raid_superblock_test 00:14:22.754 ************************************ 00:14:22.754 00:09:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:22.754 00:09:09 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:14:22.754 00:09:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:22.754 00:09:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:22.754 00:09:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:23.014 ************************************ 00:14:23.014 START TEST raid_read_error_test 00:14:23.014 ************************************ 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:23.014 00:09:09 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.VQcbif1lez 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3520485 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3520485 /var/tmp/spdk-raid.sock 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 3520485 ']' 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:23.014 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:23.014 00:09:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:23.014 [2024-07-16 00:09:09.797295] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:14:23.014 [2024-07-16 00:09:09.797346] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3520485 ] 00:14:23.014 [2024-07-16 00:09:09.909232] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:23.273 [2024-07-16 00:09:10.012827] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:23.273 [2024-07-16 00:09:10.077772] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:23.273 [2024-07-16 00:09:10.077811] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:23.840 00:09:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:23.840 00:09:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:23.840 00:09:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:23.840 00:09:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:24.098 BaseBdev1_malloc 00:14:24.098 00:09:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:24.356 true 00:14:24.356 00:09:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:24.615 [2024-07-16 00:09:11.487380] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:24.615 [2024-07-16 00:09:11.487430] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:14:24.615 [2024-07-16 00:09:11.487450] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfd60d0 00:14:24.615 [2024-07-16 00:09:11.487463] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:24.615 [2024-07-16 00:09:11.489245] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:24.615 [2024-07-16 00:09:11.489277] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:24.615 BaseBdev1 00:14:24.615 00:09:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:24.615 00:09:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:24.874 BaseBdev2_malloc 00:14:24.874 00:09:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:25.132 true 00:14:25.132 00:09:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:25.390 [2024-07-16 00:09:12.241956] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:25.390 [2024-07-16 00:09:12.242002] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:25.390 [2024-07-16 00:09:12.242022] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfda910 00:14:25.390 [2024-07-16 00:09:12.242035] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:25.390 [2024-07-16 00:09:12.243395] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:25.390 [2024-07-16 00:09:12.243423] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:25.390 BaseBdev2 00:14:25.390 00:09:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:25.390 00:09:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:25.648 BaseBdev3_malloc 00:14:25.648 00:09:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:25.906 true 00:14:25.906 00:09:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:26.164 [2024-07-16 00:09:13.004652] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:26.164 [2024-07-16 00:09:13.004699] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:26.164 [2024-07-16 00:09:13.004719] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfdcbd0 00:14:26.164 [2024-07-16 00:09:13.004731] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:26.164 [2024-07-16 00:09:13.006210] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:26.164 [2024-07-16 00:09:13.006241] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:26.164 BaseBdev3 00:14:26.164 00:09:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:26.425 [2024-07-16 00:09:13.265375] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:26.425 [2024-07-16 00:09:13.266625] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:26.425 [2024-07-16 00:09:13.266696] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:26.425 [2024-07-16 00:09:13.266905] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfde280 00:14:26.425 [2024-07-16 00:09:13.266916] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:26.425 [2024-07-16 00:09:13.267115] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfdde20 00:14:26.425 [2024-07-16 00:09:13.267260] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfde280 00:14:26.425 [2024-07-16 00:09:13.267270] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfde280 00:14:26.425 [2024-07-16 00:09:13.267369] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:26.425 00:09:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:26.425 00:09:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:26.425 00:09:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:26.425 00:09:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:26.425 00:09:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:26.425 00:09:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:26.425 00:09:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:26.425 00:09:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:26.425 
00:09:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:26.425 00:09:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:26.425 00:09:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.425 00:09:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:26.684 00:09:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:26.684 "name": "raid_bdev1", 00:14:26.684 "uuid": "a645e532-3287-48a8-9a80-842a30a75b98", 00:14:26.684 "strip_size_kb": 64, 00:14:26.684 "state": "online", 00:14:26.684 "raid_level": "raid0", 00:14:26.684 "superblock": true, 00:14:26.684 "num_base_bdevs": 3, 00:14:26.684 "num_base_bdevs_discovered": 3, 00:14:26.684 "num_base_bdevs_operational": 3, 00:14:26.684 "base_bdevs_list": [ 00:14:26.684 { 00:14:26.684 "name": "BaseBdev1", 00:14:26.684 "uuid": "f62eea4f-374d-53e9-8442-cdbd4c1771c8", 00:14:26.684 "is_configured": true, 00:14:26.684 "data_offset": 2048, 00:14:26.684 "data_size": 63488 00:14:26.684 }, 00:14:26.684 { 00:14:26.684 "name": "BaseBdev2", 00:14:26.684 "uuid": "11be9d3d-c1b6-5ce7-8457-33113de6cadf", 00:14:26.684 "is_configured": true, 00:14:26.684 "data_offset": 2048, 00:14:26.684 "data_size": 63488 00:14:26.684 }, 00:14:26.684 { 00:14:26.684 "name": "BaseBdev3", 00:14:26.684 "uuid": "13f3edd5-6c35-5985-9fb4-f628a0343b9a", 00:14:26.684 "is_configured": true, 00:14:26.684 "data_offset": 2048, 00:14:26.684 "data_size": 63488 00:14:26.684 } 00:14:26.684 ] 00:14:26.684 }' 00:14:26.684 00:09:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:26.684 00:09:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:27.250 00:09:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- 
# sleep 1 00:14:27.250 00:09:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:27.509 [2024-07-16 00:09:14.240242] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe2c5b0 00:14:28.446 00:09:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:28.446 00:09:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:28.446 00:09:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:14:28.446 00:09:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:28.446 00:09:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:28.446 00:09:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:28.446 00:09:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:28.446 00:09:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:28.446 00:09:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:28.446 00:09:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:28.446 00:09:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:28.446 00:09:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:28.446 00:09:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:28.446 00:09:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:14:28.706 00:09:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:28.706 00:09:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:28.706 00:09:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:28.706 "name": "raid_bdev1", 00:14:28.706 "uuid": "a645e532-3287-48a8-9a80-842a30a75b98", 00:14:28.706 "strip_size_kb": 64, 00:14:28.706 "state": "online", 00:14:28.706 "raid_level": "raid0", 00:14:28.706 "superblock": true, 00:14:28.706 "num_base_bdevs": 3, 00:14:28.706 "num_base_bdevs_discovered": 3, 00:14:28.706 "num_base_bdevs_operational": 3, 00:14:28.706 "base_bdevs_list": [ 00:14:28.706 { 00:14:28.706 "name": "BaseBdev1", 00:14:28.706 "uuid": "f62eea4f-374d-53e9-8442-cdbd4c1771c8", 00:14:28.706 "is_configured": true, 00:14:28.706 "data_offset": 2048, 00:14:28.706 "data_size": 63488 00:14:28.706 }, 00:14:28.706 { 00:14:28.706 "name": "BaseBdev2", 00:14:28.706 "uuid": "11be9d3d-c1b6-5ce7-8457-33113de6cadf", 00:14:28.706 "is_configured": true, 00:14:28.706 "data_offset": 2048, 00:14:28.706 "data_size": 63488 00:14:28.706 }, 00:14:28.706 { 00:14:28.706 "name": "BaseBdev3", 00:14:28.706 "uuid": "13f3edd5-6c35-5985-9fb4-f628a0343b9a", 00:14:28.706 "is_configured": true, 00:14:28.706 "data_offset": 2048, 00:14:28.706 "data_size": 63488 00:14:28.706 } 00:14:28.706 ] 00:14:28.706 }' 00:14:28.706 00:09:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:28.706 00:09:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:29.645 00:09:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:29.645 [2024-07-16 00:09:16.448616] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:29.645 [2024-07-16 00:09:16.448660] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:29.645 [2024-07-16 00:09:16.451830] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:29.645 [2024-07-16 00:09:16.451867] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:29.645 [2024-07-16 00:09:16.451901] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:29.645 [2024-07-16 00:09:16.451912] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfde280 name raid_bdev1, state offline 00:14:29.645 0 00:14:29.645 00:09:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3520485 00:14:29.645 00:09:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 3520485 ']' 00:14:29.645 00:09:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 3520485 00:14:29.645 00:09:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:14:29.645 00:09:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:29.645 00:09:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3520485 00:14:29.645 00:09:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:29.645 00:09:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:29.645 00:09:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3520485' 00:14:29.645 killing process with pid 3520485 00:14:29.645 00:09:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 3520485 00:14:29.645 [2024-07-16 00:09:16.531832] bdev_raid.c:1358:raid_bdev_fini_start: 
*DEBUG*: raid_bdev_fini_start 00:14:29.645 00:09:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 3520485 00:14:29.645 [2024-07-16 00:09:16.552800] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:29.904 00:09:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.VQcbif1lez 00:14:29.904 00:09:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:29.904 00:09:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:29.904 00:09:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:14:29.904 00:09:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:14:29.904 00:09:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:29.904 00:09:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:29.904 00:09:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:14:29.904 00:14:29.904 real 0m7.067s 00:14:29.904 user 0m11.236s 00:14:29.904 sys 0m1.226s 00:14:29.904 00:09:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:29.904 00:09:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:29.904 ************************************ 00:14:29.904 END TEST raid_read_error_test 00:14:29.904 ************************************ 00:14:29.904 00:09:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:29.904 00:09:16 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:14:29.904 00:09:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:29.904 00:09:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:29.904 00:09:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:30.164 ************************************ 
00:14:30.164 START TEST raid_write_error_test 00:14:30.164 ************************************ 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local 
raid_bdev_name=raid_bdev1 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.s7C14iXwhp 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3521464 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3521464 /var/tmp/spdk-raid.sock 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 3521464 ']' 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:14:30.164 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:30.164 00:09:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:30.164 [2024-07-16 00:09:16.962331] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:14:30.164 [2024-07-16 00:09:16.962405] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3521464 ] 00:14:30.164 [2024-07-16 00:09:17.093153] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:30.424 [2024-07-16 00:09:17.195380] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:30.424 [2024-07-16 00:09:17.254965] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:30.424 [2024-07-16 00:09:17.254997] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:30.993 00:09:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:30.993 00:09:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:30.993 00:09:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:30.993 00:09:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:31.252 BaseBdev1_malloc 00:14:31.252 00:09:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:31.511 true 00:14:31.511 00:09:18 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:31.511 [2024-07-16 00:09:18.455306] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:31.511 [2024-07-16 00:09:18.455355] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:31.511 [2024-07-16 00:09:18.455375] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28ad0d0 00:14:31.511 [2024-07-16 00:09:18.455388] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:31.511 [2024-07-16 00:09:18.457190] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:31.511 [2024-07-16 00:09:18.457222] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:31.511 BaseBdev1 00:14:31.771 00:09:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:31.771 00:09:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:31.771 BaseBdev2_malloc 00:14:31.771 00:09:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:32.029 true 00:14:32.029 00:09:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:32.289 [2024-07-16 00:09:19.005401] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:32.289 [2024-07-16 00:09:19.005444] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:32.289 [2024-07-16 00:09:19.005465] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28b1910 00:14:32.289 [2024-07-16 00:09:19.005477] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:32.289 [2024-07-16 00:09:19.006953] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:32.289 [2024-07-16 00:09:19.006982] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:32.289 BaseBdev2 00:14:32.289 00:09:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:32.289 00:09:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:32.289 BaseBdev3_malloc 00:14:32.289 00:09:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:32.548 true 00:14:32.548 00:09:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:32.807 [2024-07-16 00:09:19.543433] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:32.808 [2024-07-16 00:09:19.543478] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:32.808 [2024-07-16 00:09:19.543496] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28b3bd0 00:14:32.808 [2024-07-16 00:09:19.543509] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:32.808 [2024-07-16 00:09:19.544972] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:14:32.808 [2024-07-16 00:09:19.545003] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:32.808 BaseBdev3 00:14:32.808 00:09:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:32.808 [2024-07-16 00:09:19.727981] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:32.808 [2024-07-16 00:09:19.729214] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:32.808 [2024-07-16 00:09:19.729282] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:32.808 [2024-07-16 00:09:19.729486] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x28b5280 00:14:32.808 [2024-07-16 00:09:19.729497] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:32.808 [2024-07-16 00:09:19.729687] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28b4e20 00:14:32.808 [2024-07-16 00:09:19.729831] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28b5280 00:14:32.808 [2024-07-16 00:09:19.729841] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x28b5280 00:14:32.808 [2024-07-16 00:09:19.729947] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:33.067 00:09:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:33.067 00:09:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:33.067 00:09:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:33.067 00:09:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- 
# local raid_level=raid0 00:14:33.067 00:09:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:33.067 00:09:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:33.067 00:09:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:33.067 00:09:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:33.067 00:09:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:33.067 00:09:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:33.067 00:09:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:33.067 00:09:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:33.067 00:09:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:33.067 "name": "raid_bdev1", 00:14:33.067 "uuid": "ad55d034-0278-40e1-a81a-5a9325570943", 00:14:33.067 "strip_size_kb": 64, 00:14:33.067 "state": "online", 00:14:33.067 "raid_level": "raid0", 00:14:33.067 "superblock": true, 00:14:33.067 "num_base_bdevs": 3, 00:14:33.067 "num_base_bdevs_discovered": 3, 00:14:33.067 "num_base_bdevs_operational": 3, 00:14:33.067 "base_bdevs_list": [ 00:14:33.067 { 00:14:33.067 "name": "BaseBdev1", 00:14:33.067 "uuid": "b6e13c1a-c05e-5631-b056-211a6b56520e", 00:14:33.067 "is_configured": true, 00:14:33.067 "data_offset": 2048, 00:14:33.067 "data_size": 63488 00:14:33.067 }, 00:14:33.067 { 00:14:33.067 "name": "BaseBdev2", 00:14:33.067 "uuid": "feb79fc2-431d-57b3-a3f5-d29776e77fa4", 00:14:33.067 "is_configured": true, 00:14:33.067 "data_offset": 2048, 00:14:33.067 "data_size": 63488 00:14:33.067 }, 00:14:33.067 { 00:14:33.067 "name": "BaseBdev3", 
00:14:33.067 "uuid": "93266ece-0793-594e-9371-a31eeb384b43", 00:14:33.067 "is_configured": true, 00:14:33.067 "data_offset": 2048, 00:14:33.067 "data_size": 63488 00:14:33.067 } 00:14:33.067 ] 00:14:33.067 }' 00:14:33.067 00:09:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:33.067 00:09:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:33.635 00:09:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:33.635 00:09:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:33.894 [2024-07-16 00:09:20.690806] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27035b0 00:14:34.832 00:09:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:35.092 00:09:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:35.092 00:09:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:14:35.092 00:09:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:35.092 00:09:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:35.092 00:09:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:35.092 00:09:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:35.092 00:09:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:35.092 00:09:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:35.092 00:09:21 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:35.092 00:09:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:35.092 00:09:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:35.092 00:09:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:35.092 00:09:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:35.092 00:09:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.092 00:09:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:35.092 00:09:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:35.092 "name": "raid_bdev1", 00:14:35.092 "uuid": "ad55d034-0278-40e1-a81a-5a9325570943", 00:14:35.092 "strip_size_kb": 64, 00:14:35.092 "state": "online", 00:14:35.092 "raid_level": "raid0", 00:14:35.092 "superblock": true, 00:14:35.092 "num_base_bdevs": 3, 00:14:35.092 "num_base_bdevs_discovered": 3, 00:14:35.092 "num_base_bdevs_operational": 3, 00:14:35.092 "base_bdevs_list": [ 00:14:35.092 { 00:14:35.092 "name": "BaseBdev1", 00:14:35.092 "uuid": "b6e13c1a-c05e-5631-b056-211a6b56520e", 00:14:35.092 "is_configured": true, 00:14:35.092 "data_offset": 2048, 00:14:35.092 "data_size": 63488 00:14:35.092 }, 00:14:35.092 { 00:14:35.092 "name": "BaseBdev2", 00:14:35.092 "uuid": "feb79fc2-431d-57b3-a3f5-d29776e77fa4", 00:14:35.092 "is_configured": true, 00:14:35.092 "data_offset": 2048, 00:14:35.092 "data_size": 63488 00:14:35.092 }, 00:14:35.092 { 00:14:35.092 "name": "BaseBdev3", 00:14:35.092 "uuid": "93266ece-0793-594e-9371-a31eeb384b43", 00:14:35.092 "is_configured": true, 00:14:35.092 "data_offset": 2048, 00:14:35.092 "data_size": 
63488 00:14:35.092 } 00:14:35.092 ] 00:14:35.092 }' 00:14:35.092 00:09:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:35.092 00:09:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:35.661 00:09:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:35.920 [2024-07-16 00:09:22.734724] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:35.920 [2024-07-16 00:09:22.734765] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:35.920 [2024-07-16 00:09:22.737959] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:35.920 [2024-07-16 00:09:22.737996] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:35.920 [2024-07-16 00:09:22.738032] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:35.920 [2024-07-16 00:09:22.738043] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28b5280 name raid_bdev1, state offline 00:14:35.920 0 00:14:35.920 00:09:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3521464 00:14:35.920 00:09:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 3521464 ']' 00:14:35.920 00:09:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 3521464 00:14:35.920 00:09:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:14:35.920 00:09:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:35.920 00:09:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3521464 00:14:35.920 00:09:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:14:35.920 00:09:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:35.920 00:09:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3521464' 00:14:35.920 killing process with pid 3521464 00:14:35.920 00:09:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 3521464 00:14:35.920 [2024-07-16 00:09:22.809466] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:35.920 00:09:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 3521464 00:14:35.920 [2024-07-16 00:09:22.831000] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:36.179 00:09:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:36.179 00:09:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.s7C14iXwhp 00:14:36.179 00:09:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:36.179 00:09:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:14:36.179 00:09:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:14:36.179 00:09:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:36.179 00:09:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:36.179 00:09:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:14:36.179 00:14:36.179 real 0m6.192s 00:14:36.179 user 0m9.591s 00:14:36.179 sys 0m1.118s 00:14:36.179 00:09:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:36.179 00:09:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:36.179 ************************************ 00:14:36.179 END TEST raid_write_error_test 00:14:36.179 
************************************ 00:14:36.179 00:09:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:36.179 00:09:23 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:14:36.179 00:09:23 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:14:36.179 00:09:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:36.179 00:09:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:36.179 00:09:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:36.438 ************************************ 00:14:36.438 START TEST raid_state_function_test 00:14:36.438 ************************************ 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:36.438 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:36.439 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:14:36.439 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:36.439 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:36.439 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:36.439 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:36.439 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=3522433 00:14:36.439 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3522433' 00:14:36.439 Process raid pid: 3522433 00:14:36.439 00:09:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:36.439 00:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 3522433 /var/tmp/spdk-raid.sock 00:14:36.439 00:09:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 3522433 ']' 00:14:36.439 00:09:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:36.439 00:09:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:36.439 00:09:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:36.439 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:36.439 00:09:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:36.439 00:09:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:36.439 [2024-07-16 00:09:23.233443] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:14:36.439 [2024-07-16 00:09:23.233512] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:36.439 [2024-07-16 00:09:23.362585] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:36.697 [2024-07-16 00:09:23.466452] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:36.697 [2024-07-16 00:09:23.532644] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:36.697 [2024-07-16 00:09:23.532672] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:37.265 00:09:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:37.265 00:09:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:14:37.265 00:09:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:37.833 [2024-07-16 00:09:24.616733] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:37.833 [2024-07-16 00:09:24.616776] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:37.833 [2024-07-16 00:09:24.616792] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:37.833 [2024-07-16 00:09:24.616804] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:37.833 [2024-07-16 00:09:24.616813] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:37.833 [2024-07-16 00:09:24.616824] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:37.833 
00:09:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:37.833 00:09:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:37.833 00:09:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:37.833 00:09:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:37.833 00:09:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:37.833 00:09:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:37.833 00:09:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:37.833 00:09:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:37.833 00:09:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:37.833 00:09:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:37.833 00:09:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.833 00:09:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:38.092 00:09:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:38.092 "name": "Existed_Raid", 00:14:38.092 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.092 "strip_size_kb": 64, 00:14:38.092 "state": "configuring", 00:14:38.092 "raid_level": "concat", 00:14:38.092 "superblock": false, 00:14:38.092 "num_base_bdevs": 3, 00:14:38.092 "num_base_bdevs_discovered": 0, 00:14:38.092 "num_base_bdevs_operational": 3, 00:14:38.092 "base_bdevs_list": [ 00:14:38.092 { 
00:14:38.092 "name": "BaseBdev1", 00:14:38.092 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.092 "is_configured": false, 00:14:38.092 "data_offset": 0, 00:14:38.092 "data_size": 0 00:14:38.092 }, 00:14:38.092 { 00:14:38.092 "name": "BaseBdev2", 00:14:38.092 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.092 "is_configured": false, 00:14:38.092 "data_offset": 0, 00:14:38.092 "data_size": 0 00:14:38.092 }, 00:14:38.092 { 00:14:38.092 "name": "BaseBdev3", 00:14:38.092 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.092 "is_configured": false, 00:14:38.092 "data_offset": 0, 00:14:38.092 "data_size": 0 00:14:38.092 } 00:14:38.092 ] 00:14:38.092 }' 00:14:38.092 00:09:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:38.092 00:09:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:38.660 00:09:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:38.932 [2024-07-16 00:09:25.711492] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:38.932 [2024-07-16 00:09:25.711525] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21a9a80 name Existed_Raid, state configuring 00:14:38.932 00:09:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:39.204 [2024-07-16 00:09:25.960159] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:39.204 [2024-07-16 00:09:25.960187] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:39.204 [2024-07-16 00:09:25.960197] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:14:39.204 [2024-07-16 00:09:25.960208] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:39.204 [2024-07-16 00:09:25.960226] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:39.204 [2024-07-16 00:09:25.960237] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:39.204 00:09:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:39.463 [2024-07-16 00:09:26.218730] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:39.463 BaseBdev1 00:14:39.463 00:09:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:39.463 00:09:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:39.463 00:09:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:39.463 00:09:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:39.463 00:09:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:39.463 00:09:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:39.463 00:09:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:39.722 00:09:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:39.981 [ 00:14:39.981 { 00:14:39.981 "name": "BaseBdev1", 00:14:39.981 "aliases": [ 00:14:39.981 
"4321a7d5-bcbc-4c81-9010-c3f3806ce261" 00:14:39.981 ], 00:14:39.981 "product_name": "Malloc disk", 00:14:39.981 "block_size": 512, 00:14:39.981 "num_blocks": 65536, 00:14:39.981 "uuid": "4321a7d5-bcbc-4c81-9010-c3f3806ce261", 00:14:39.981 "assigned_rate_limits": { 00:14:39.981 "rw_ios_per_sec": 0, 00:14:39.981 "rw_mbytes_per_sec": 0, 00:14:39.981 "r_mbytes_per_sec": 0, 00:14:39.981 "w_mbytes_per_sec": 0 00:14:39.981 }, 00:14:39.981 "claimed": true, 00:14:39.981 "claim_type": "exclusive_write", 00:14:39.981 "zoned": false, 00:14:39.981 "supported_io_types": { 00:14:39.981 "read": true, 00:14:39.981 "write": true, 00:14:39.981 "unmap": true, 00:14:39.981 "flush": true, 00:14:39.981 "reset": true, 00:14:39.981 "nvme_admin": false, 00:14:39.981 "nvme_io": false, 00:14:39.981 "nvme_io_md": false, 00:14:39.981 "write_zeroes": true, 00:14:39.981 "zcopy": true, 00:14:39.981 "get_zone_info": false, 00:14:39.981 "zone_management": false, 00:14:39.981 "zone_append": false, 00:14:39.981 "compare": false, 00:14:39.981 "compare_and_write": false, 00:14:39.981 "abort": true, 00:14:39.981 "seek_hole": false, 00:14:39.981 "seek_data": false, 00:14:39.981 "copy": true, 00:14:39.981 "nvme_iov_md": false 00:14:39.981 }, 00:14:39.981 "memory_domains": [ 00:14:39.981 { 00:14:39.981 "dma_device_id": "system", 00:14:39.981 "dma_device_type": 1 00:14:39.981 }, 00:14:39.981 { 00:14:39.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.981 "dma_device_type": 2 00:14:39.981 } 00:14:39.981 ], 00:14:39.981 "driver_specific": {} 00:14:39.981 } 00:14:39.981 ] 00:14:39.981 00:09:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:39.981 00:09:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:39.981 00:09:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:39.981 00:09:26 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:39.981 00:09:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:39.981 00:09:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:39.981 00:09:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:39.981 00:09:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:39.981 00:09:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:39.981 00:09:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:39.981 00:09:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:39.981 00:09:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.981 00:09:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:40.240 00:09:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:40.240 "name": "Existed_Raid", 00:14:40.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:40.240 "strip_size_kb": 64, 00:14:40.240 "state": "configuring", 00:14:40.240 "raid_level": "concat", 00:14:40.240 "superblock": false, 00:14:40.240 "num_base_bdevs": 3, 00:14:40.240 "num_base_bdevs_discovered": 1, 00:14:40.240 "num_base_bdevs_operational": 3, 00:14:40.240 "base_bdevs_list": [ 00:14:40.240 { 00:14:40.240 "name": "BaseBdev1", 00:14:40.240 "uuid": "4321a7d5-bcbc-4c81-9010-c3f3806ce261", 00:14:40.240 "is_configured": true, 00:14:40.240 "data_offset": 0, 00:14:40.240 "data_size": 65536 00:14:40.240 }, 00:14:40.240 { 00:14:40.240 "name": "BaseBdev2", 00:14:40.240 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:14:40.240 "is_configured": false, 00:14:40.240 "data_offset": 0, 00:14:40.240 "data_size": 0 00:14:40.240 }, 00:14:40.240 { 00:14:40.240 "name": "BaseBdev3", 00:14:40.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:40.240 "is_configured": false, 00:14:40.240 "data_offset": 0, 00:14:40.240 "data_size": 0 00:14:40.240 } 00:14:40.240 ] 00:14:40.240 }' 00:14:40.240 00:09:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:40.240 00:09:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:40.808 00:09:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:41.067 [2024-07-16 00:09:27.843037] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:41.067 [2024-07-16 00:09:27.843074] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21a9310 name Existed_Raid, state configuring 00:14:41.067 00:09:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:41.326 [2024-07-16 00:09:28.019539] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:41.326 [2024-07-16 00:09:28.020967] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:41.326 [2024-07-16 00:09:28.020998] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:41.326 [2024-07-16 00:09:28.021008] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:41.326 [2024-07-16 00:09:28.021019] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:14:41.326 00:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:41.326 00:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:41.326 00:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:41.326 00:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:41.326 00:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:41.326 00:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:41.326 00:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:41.326 00:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:41.326 00:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:41.326 00:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:41.326 00:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:41.326 00:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:41.326 00:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:41.326 00:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:41.326 00:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:41.326 "name": "Existed_Raid", 00:14:41.326 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.326 "strip_size_kb": 64, 00:14:41.326 "state": "configuring", 00:14:41.326 
"raid_level": "concat", 00:14:41.326 "superblock": false, 00:14:41.326 "num_base_bdevs": 3, 00:14:41.326 "num_base_bdevs_discovered": 1, 00:14:41.326 "num_base_bdevs_operational": 3, 00:14:41.326 "base_bdevs_list": [ 00:14:41.326 { 00:14:41.326 "name": "BaseBdev1", 00:14:41.326 "uuid": "4321a7d5-bcbc-4c81-9010-c3f3806ce261", 00:14:41.326 "is_configured": true, 00:14:41.326 "data_offset": 0, 00:14:41.326 "data_size": 65536 00:14:41.326 }, 00:14:41.326 { 00:14:41.326 "name": "BaseBdev2", 00:14:41.326 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.326 "is_configured": false, 00:14:41.326 "data_offset": 0, 00:14:41.326 "data_size": 0 00:14:41.327 }, 00:14:41.327 { 00:14:41.327 "name": "BaseBdev3", 00:14:41.327 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.327 "is_configured": false, 00:14:41.327 "data_offset": 0, 00:14:41.327 "data_size": 0 00:14:41.327 } 00:14:41.327 ] 00:14:41.327 }' 00:14:41.327 00:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:41.327 00:09:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:42.264 00:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:42.264 [2024-07-16 00:09:29.069687] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:42.264 BaseBdev2 00:14:42.264 00:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:42.264 00:09:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:42.264 00:09:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:42.264 00:09:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:42.264 00:09:29 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:42.264 00:09:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:42.264 00:09:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:42.523 00:09:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:42.523 [ 00:14:42.523 { 00:14:42.523 "name": "BaseBdev2", 00:14:42.523 "aliases": [ 00:14:42.523 "02e76c15-68ea-4b67-b86e-93ed9b5cb986" 00:14:42.523 ], 00:14:42.523 "product_name": "Malloc disk", 00:14:42.523 "block_size": 512, 00:14:42.523 "num_blocks": 65536, 00:14:42.523 "uuid": "02e76c15-68ea-4b67-b86e-93ed9b5cb986", 00:14:42.523 "assigned_rate_limits": { 00:14:42.523 "rw_ios_per_sec": 0, 00:14:42.523 "rw_mbytes_per_sec": 0, 00:14:42.523 "r_mbytes_per_sec": 0, 00:14:42.523 "w_mbytes_per_sec": 0 00:14:42.523 }, 00:14:42.523 "claimed": true, 00:14:42.523 "claim_type": "exclusive_write", 00:14:42.523 "zoned": false, 00:14:42.523 "supported_io_types": { 00:14:42.523 "read": true, 00:14:42.523 "write": true, 00:14:42.523 "unmap": true, 00:14:42.523 "flush": true, 00:14:42.523 "reset": true, 00:14:42.523 "nvme_admin": false, 00:14:42.523 "nvme_io": false, 00:14:42.523 "nvme_io_md": false, 00:14:42.523 "write_zeroes": true, 00:14:42.523 "zcopy": true, 00:14:42.523 "get_zone_info": false, 00:14:42.523 "zone_management": false, 00:14:42.523 "zone_append": false, 00:14:42.523 "compare": false, 00:14:42.523 "compare_and_write": false, 00:14:42.523 "abort": true, 00:14:42.523 "seek_hole": false, 00:14:42.523 "seek_data": false, 00:14:42.523 "copy": true, 00:14:42.523 "nvme_iov_md": false 00:14:42.523 }, 00:14:42.523 "memory_domains": [ 00:14:42.523 { 00:14:42.523 "dma_device_id": "system", 
00:14:42.523 "dma_device_type": 1 00:14:42.523 }, 00:14:42.523 { 00:14:42.523 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:42.523 "dma_device_type": 2 00:14:42.523 } 00:14:42.523 ], 00:14:42.523 "driver_specific": {} 00:14:42.523 } 00:14:42.523 ] 00:14:42.523 00:09:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:42.523 00:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:42.523 00:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:42.523 00:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:42.523 00:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:42.523 00:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:42.523 00:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:42.523 00:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:42.523 00:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:42.524 00:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:42.524 00:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:42.524 00:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:42.524 00:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:42.524 00:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:42.524 00:09:29 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:42.782 00:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:42.782 "name": "Existed_Raid", 00:14:42.782 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:42.782 "strip_size_kb": 64, 00:14:42.782 "state": "configuring", 00:14:42.782 "raid_level": "concat", 00:14:42.782 "superblock": false, 00:14:42.782 "num_base_bdevs": 3, 00:14:42.782 "num_base_bdevs_discovered": 2, 00:14:42.782 "num_base_bdevs_operational": 3, 00:14:42.782 "base_bdevs_list": [ 00:14:42.782 { 00:14:42.782 "name": "BaseBdev1", 00:14:42.783 "uuid": "4321a7d5-bcbc-4c81-9010-c3f3806ce261", 00:14:42.783 "is_configured": true, 00:14:42.783 "data_offset": 0, 00:14:42.783 "data_size": 65536 00:14:42.783 }, 00:14:42.783 { 00:14:42.783 "name": "BaseBdev2", 00:14:42.783 "uuid": "02e76c15-68ea-4b67-b86e-93ed9b5cb986", 00:14:42.783 "is_configured": true, 00:14:42.783 "data_offset": 0, 00:14:42.783 "data_size": 65536 00:14:42.783 }, 00:14:42.783 { 00:14:42.783 "name": "BaseBdev3", 00:14:42.783 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:42.783 "is_configured": false, 00:14:42.783 "data_offset": 0, 00:14:42.783 "data_size": 0 00:14:42.783 } 00:14:42.783 ] 00:14:42.783 }' 00:14:42.783 00:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:42.783 00:09:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:43.350 00:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:43.608 [2024-07-16 00:09:30.356542] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:43.608 [2024-07-16 00:09:30.356576] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21aa400 00:14:43.608 [2024-07-16 00:09:30.356585] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:43.608 [2024-07-16 00:09:30.356831] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21a9ef0 00:14:43.608 [2024-07-16 00:09:30.356958] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21aa400 00:14:43.608 [2024-07-16 00:09:30.356968] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x21aa400 00:14:43.608 [2024-07-16 00:09:30.357127] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:43.608 BaseBdev3 00:14:43.608 00:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:43.608 00:09:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:43.608 00:09:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:43.608 00:09:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:43.608 00:09:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:43.608 00:09:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:43.608 00:09:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:43.866 00:09:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:43.866 [ 00:14:43.866 { 00:14:43.866 "name": "BaseBdev3", 00:14:43.866 "aliases": [ 00:14:43.866 "cbfd2c6c-295b-42e6-8e4c-531fd549b13e" 00:14:43.866 ], 00:14:43.866 "product_name": "Malloc disk", 00:14:43.866 "block_size": 512, 00:14:43.866 "num_blocks": 65536, 00:14:43.866 
"uuid": "cbfd2c6c-295b-42e6-8e4c-531fd549b13e", 00:14:43.866 "assigned_rate_limits": { 00:14:43.866 "rw_ios_per_sec": 0, 00:14:43.866 "rw_mbytes_per_sec": 0, 00:14:43.866 "r_mbytes_per_sec": 0, 00:14:43.866 "w_mbytes_per_sec": 0 00:14:43.867 }, 00:14:43.867 "claimed": true, 00:14:43.867 "claim_type": "exclusive_write", 00:14:43.867 "zoned": false, 00:14:43.867 "supported_io_types": { 00:14:43.867 "read": true, 00:14:43.867 "write": true, 00:14:43.867 "unmap": true, 00:14:43.867 "flush": true, 00:14:43.867 "reset": true, 00:14:43.867 "nvme_admin": false, 00:14:43.867 "nvme_io": false, 00:14:43.867 "nvme_io_md": false, 00:14:43.867 "write_zeroes": true, 00:14:43.867 "zcopy": true, 00:14:43.867 "get_zone_info": false, 00:14:43.867 "zone_management": false, 00:14:43.867 "zone_append": false, 00:14:43.867 "compare": false, 00:14:43.867 "compare_and_write": false, 00:14:43.867 "abort": true, 00:14:43.867 "seek_hole": false, 00:14:43.867 "seek_data": false, 00:14:43.867 "copy": true, 00:14:43.867 "nvme_iov_md": false 00:14:43.867 }, 00:14:43.867 "memory_domains": [ 00:14:43.867 { 00:14:43.867 "dma_device_id": "system", 00:14:43.867 "dma_device_type": 1 00:14:43.867 }, 00:14:43.867 { 00:14:43.867 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.867 "dma_device_type": 2 00:14:43.867 } 00:14:43.867 ], 00:14:43.867 "driver_specific": {} 00:14:43.867 } 00:14:43.867 ] 00:14:43.867 00:09:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:43.867 00:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:43.867 00:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:43.867 00:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:43.867 00:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:43.867 00:09:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:43.867 00:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:43.867 00:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:43.867 00:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:43.867 00:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:43.867 00:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:43.867 00:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:43.867 00:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:43.867 00:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:43.867 00:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.124 00:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:44.124 "name": "Existed_Raid", 00:14:44.124 "uuid": "e0e30402-cd83-4915-a51b-afa369bee9c6", 00:14:44.124 "strip_size_kb": 64, 00:14:44.124 "state": "online", 00:14:44.124 "raid_level": "concat", 00:14:44.124 "superblock": false, 00:14:44.124 "num_base_bdevs": 3, 00:14:44.124 "num_base_bdevs_discovered": 3, 00:14:44.124 "num_base_bdevs_operational": 3, 00:14:44.124 "base_bdevs_list": [ 00:14:44.124 { 00:14:44.124 "name": "BaseBdev1", 00:14:44.124 "uuid": "4321a7d5-bcbc-4c81-9010-c3f3806ce261", 00:14:44.124 "is_configured": true, 00:14:44.124 "data_offset": 0, 00:14:44.124 "data_size": 65536 00:14:44.124 }, 00:14:44.124 { 00:14:44.124 "name": "BaseBdev2", 00:14:44.124 "uuid": 
"02e76c15-68ea-4b67-b86e-93ed9b5cb986", 00:14:44.124 "is_configured": true, 00:14:44.124 "data_offset": 0, 00:14:44.124 "data_size": 65536 00:14:44.124 }, 00:14:44.124 { 00:14:44.124 "name": "BaseBdev3", 00:14:44.124 "uuid": "cbfd2c6c-295b-42e6-8e4c-531fd549b13e", 00:14:44.124 "is_configured": true, 00:14:44.125 "data_offset": 0, 00:14:44.125 "data_size": 65536 00:14:44.125 } 00:14:44.125 ] 00:14:44.125 }' 00:14:44.125 00:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:44.125 00:09:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:44.690 00:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:44.690 00:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:44.690 00:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:44.690 00:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:44.690 00:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:44.690 00:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:44.690 00:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:44.690 00:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:44.948 [2024-07-16 00:09:31.792646] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:44.948 00:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:44.948 "name": "Existed_Raid", 00:14:44.948 "aliases": [ 00:14:44.948 "e0e30402-cd83-4915-a51b-afa369bee9c6" 00:14:44.948 ], 00:14:44.948 "product_name": "Raid Volume", 
00:14:44.948 "block_size": 512, 00:14:44.948 "num_blocks": 196608, 00:14:44.948 "uuid": "e0e30402-cd83-4915-a51b-afa369bee9c6", 00:14:44.948 "assigned_rate_limits": { 00:14:44.948 "rw_ios_per_sec": 0, 00:14:44.948 "rw_mbytes_per_sec": 0, 00:14:44.948 "r_mbytes_per_sec": 0, 00:14:44.948 "w_mbytes_per_sec": 0 00:14:44.948 }, 00:14:44.948 "claimed": false, 00:14:44.948 "zoned": false, 00:14:44.948 "supported_io_types": { 00:14:44.948 "read": true, 00:14:44.948 "write": true, 00:14:44.948 "unmap": true, 00:14:44.948 "flush": true, 00:14:44.948 "reset": true, 00:14:44.948 "nvme_admin": false, 00:14:44.948 "nvme_io": false, 00:14:44.948 "nvme_io_md": false, 00:14:44.948 "write_zeroes": true, 00:14:44.948 "zcopy": false, 00:14:44.948 "get_zone_info": false, 00:14:44.948 "zone_management": false, 00:14:44.948 "zone_append": false, 00:14:44.948 "compare": false, 00:14:44.948 "compare_and_write": false, 00:14:44.948 "abort": false, 00:14:44.948 "seek_hole": false, 00:14:44.948 "seek_data": false, 00:14:44.948 "copy": false, 00:14:44.948 "nvme_iov_md": false 00:14:44.948 }, 00:14:44.948 "memory_domains": [ 00:14:44.948 { 00:14:44.948 "dma_device_id": "system", 00:14:44.948 "dma_device_type": 1 00:14:44.948 }, 00:14:44.948 { 00:14:44.948 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.948 "dma_device_type": 2 00:14:44.948 }, 00:14:44.948 { 00:14:44.948 "dma_device_id": "system", 00:14:44.948 "dma_device_type": 1 00:14:44.948 }, 00:14:44.948 { 00:14:44.948 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.948 "dma_device_type": 2 00:14:44.948 }, 00:14:44.948 { 00:14:44.948 "dma_device_id": "system", 00:14:44.948 "dma_device_type": 1 00:14:44.948 }, 00:14:44.948 { 00:14:44.948 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.948 "dma_device_type": 2 00:14:44.948 } 00:14:44.948 ], 00:14:44.948 "driver_specific": { 00:14:44.948 "raid": { 00:14:44.948 "uuid": "e0e30402-cd83-4915-a51b-afa369bee9c6", 00:14:44.948 "strip_size_kb": 64, 00:14:44.948 "state": "online", 00:14:44.948 
"raid_level": "concat", 00:14:44.948 "superblock": false, 00:14:44.948 "num_base_bdevs": 3, 00:14:44.948 "num_base_bdevs_discovered": 3, 00:14:44.948 "num_base_bdevs_operational": 3, 00:14:44.948 "base_bdevs_list": [ 00:14:44.948 { 00:14:44.948 "name": "BaseBdev1", 00:14:44.948 "uuid": "4321a7d5-bcbc-4c81-9010-c3f3806ce261", 00:14:44.948 "is_configured": true, 00:14:44.948 "data_offset": 0, 00:14:44.948 "data_size": 65536 00:14:44.948 }, 00:14:44.948 { 00:14:44.948 "name": "BaseBdev2", 00:14:44.948 "uuid": "02e76c15-68ea-4b67-b86e-93ed9b5cb986", 00:14:44.948 "is_configured": true, 00:14:44.948 "data_offset": 0, 00:14:44.948 "data_size": 65536 00:14:44.948 }, 00:14:44.948 { 00:14:44.948 "name": "BaseBdev3", 00:14:44.948 "uuid": "cbfd2c6c-295b-42e6-8e4c-531fd549b13e", 00:14:44.948 "is_configured": true, 00:14:44.948 "data_offset": 0, 00:14:44.948 "data_size": 65536 00:14:44.948 } 00:14:44.948 ] 00:14:44.948 } 00:14:44.948 } 00:14:44.948 }' 00:14:44.948 00:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:44.948 00:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:44.948 BaseBdev2 00:14:44.948 BaseBdev3' 00:14:44.948 00:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:44.948 00:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:44.948 00:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:45.206 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:45.206 "name": "BaseBdev1", 00:14:45.206 "aliases": [ 00:14:45.206 "4321a7d5-bcbc-4c81-9010-c3f3806ce261" 00:14:45.206 ], 00:14:45.206 "product_name": "Malloc disk", 00:14:45.206 
"block_size": 512, 00:14:45.206 "num_blocks": 65536, 00:14:45.206 "uuid": "4321a7d5-bcbc-4c81-9010-c3f3806ce261", 00:14:45.206 "assigned_rate_limits": { 00:14:45.206 "rw_ios_per_sec": 0, 00:14:45.206 "rw_mbytes_per_sec": 0, 00:14:45.206 "r_mbytes_per_sec": 0, 00:14:45.206 "w_mbytes_per_sec": 0 00:14:45.206 }, 00:14:45.206 "claimed": true, 00:14:45.206 "claim_type": "exclusive_write", 00:14:45.206 "zoned": false, 00:14:45.206 "supported_io_types": { 00:14:45.206 "read": true, 00:14:45.206 "write": true, 00:14:45.206 "unmap": true, 00:14:45.206 "flush": true, 00:14:45.206 "reset": true, 00:14:45.206 "nvme_admin": false, 00:14:45.206 "nvme_io": false, 00:14:45.206 "nvme_io_md": false, 00:14:45.206 "write_zeroes": true, 00:14:45.206 "zcopy": true, 00:14:45.206 "get_zone_info": false, 00:14:45.206 "zone_management": false, 00:14:45.206 "zone_append": false, 00:14:45.206 "compare": false, 00:14:45.206 "compare_and_write": false, 00:14:45.206 "abort": true, 00:14:45.206 "seek_hole": false, 00:14:45.206 "seek_data": false, 00:14:45.206 "copy": true, 00:14:45.206 "nvme_iov_md": false 00:14:45.206 }, 00:14:45.206 "memory_domains": [ 00:14:45.206 { 00:14:45.206 "dma_device_id": "system", 00:14:45.206 "dma_device_type": 1 00:14:45.206 }, 00:14:45.206 { 00:14:45.206 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:45.206 "dma_device_type": 2 00:14:45.206 } 00:14:45.206 ], 00:14:45.206 "driver_specific": {} 00:14:45.206 }' 00:14:45.206 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:45.464 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:45.464 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:45.464 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:45.464 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:45.464 00:09:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:45.464 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:45.464 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:45.722 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:45.722 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.722 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.722 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:45.722 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:45.722 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:45.722 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:45.979 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:45.979 "name": "BaseBdev2", 00:14:45.979 "aliases": [ 00:14:45.979 "02e76c15-68ea-4b67-b86e-93ed9b5cb986" 00:14:45.979 ], 00:14:45.979 "product_name": "Malloc disk", 00:14:45.979 "block_size": 512, 00:14:45.979 "num_blocks": 65536, 00:14:45.979 "uuid": "02e76c15-68ea-4b67-b86e-93ed9b5cb986", 00:14:45.979 "assigned_rate_limits": { 00:14:45.979 "rw_ios_per_sec": 0, 00:14:45.979 "rw_mbytes_per_sec": 0, 00:14:45.979 "r_mbytes_per_sec": 0, 00:14:45.979 "w_mbytes_per_sec": 0 00:14:45.979 }, 00:14:45.979 "claimed": true, 00:14:45.979 "claim_type": "exclusive_write", 00:14:45.979 "zoned": false, 00:14:45.979 "supported_io_types": { 00:14:45.979 "read": true, 00:14:45.979 "write": true, 00:14:45.979 "unmap": true, 00:14:45.979 "flush": true, 00:14:45.979 "reset": true, 00:14:45.979 "nvme_admin": 
false, 00:14:45.979 "nvme_io": false, 00:14:45.979 "nvme_io_md": false, 00:14:45.979 "write_zeroes": true, 00:14:45.979 "zcopy": true, 00:14:45.979 "get_zone_info": false, 00:14:45.979 "zone_management": false, 00:14:45.979 "zone_append": false, 00:14:45.979 "compare": false, 00:14:45.979 "compare_and_write": false, 00:14:45.979 "abort": true, 00:14:45.979 "seek_hole": false, 00:14:45.979 "seek_data": false, 00:14:45.979 "copy": true, 00:14:45.979 "nvme_iov_md": false 00:14:45.979 }, 00:14:45.979 "memory_domains": [ 00:14:45.979 { 00:14:45.980 "dma_device_id": "system", 00:14:45.980 "dma_device_type": 1 00:14:45.980 }, 00:14:45.980 { 00:14:45.980 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:45.980 "dma_device_type": 2 00:14:45.980 } 00:14:45.980 ], 00:14:45.980 "driver_specific": {} 00:14:45.980 }' 00:14:45.980 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:45.980 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:45.980 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:45.980 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:45.980 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:46.237 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:46.237 00:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:46.237 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:46.237 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:46.237 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:46.237 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:46.495 00:09:33 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:46.495 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:46.495 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:46.495 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:46.495 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:46.495 "name": "BaseBdev3", 00:14:46.495 "aliases": [ 00:14:46.495 "cbfd2c6c-295b-42e6-8e4c-531fd549b13e" 00:14:46.495 ], 00:14:46.495 "product_name": "Malloc disk", 00:14:46.495 "block_size": 512, 00:14:46.495 "num_blocks": 65536, 00:14:46.495 "uuid": "cbfd2c6c-295b-42e6-8e4c-531fd549b13e", 00:14:46.495 "assigned_rate_limits": { 00:14:46.495 "rw_ios_per_sec": 0, 00:14:46.495 "rw_mbytes_per_sec": 0, 00:14:46.495 "r_mbytes_per_sec": 0, 00:14:46.495 "w_mbytes_per_sec": 0 00:14:46.495 }, 00:14:46.495 "claimed": true, 00:14:46.495 "claim_type": "exclusive_write", 00:14:46.495 "zoned": false, 00:14:46.495 "supported_io_types": { 00:14:46.495 "read": true, 00:14:46.495 "write": true, 00:14:46.495 "unmap": true, 00:14:46.495 "flush": true, 00:14:46.495 "reset": true, 00:14:46.495 "nvme_admin": false, 00:14:46.495 "nvme_io": false, 00:14:46.495 "nvme_io_md": false, 00:14:46.495 "write_zeroes": true, 00:14:46.495 "zcopy": true, 00:14:46.495 "get_zone_info": false, 00:14:46.495 "zone_management": false, 00:14:46.495 "zone_append": false, 00:14:46.495 "compare": false, 00:14:46.495 "compare_and_write": false, 00:14:46.495 "abort": true, 00:14:46.495 "seek_hole": false, 00:14:46.495 "seek_data": false, 00:14:46.495 "copy": true, 00:14:46.495 "nvme_iov_md": false 00:14:46.495 }, 00:14:46.495 "memory_domains": [ 00:14:46.495 { 00:14:46.495 "dma_device_id": "system", 00:14:46.495 "dma_device_type": 1 00:14:46.495 
}, 00:14:46.495 { 00:14:46.495 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:46.495 "dma_device_type": 2 00:14:46.495 } 00:14:46.495 ], 00:14:46.495 "driver_specific": {} 00:14:46.495 }' 00:14:46.495 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:46.495 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:46.752 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:46.752 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:46.752 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:46.752 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:46.752 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:46.752 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:46.752 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:46.752 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:46.752 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:47.011 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:47.011 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:47.011 [2024-07-16 00:09:33.950123] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:47.011 [2024-07-16 00:09:33.950149] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:47.011 [2024-07-16 00:09:33.950188] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:47.270 
00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:47.270 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:14:47.270 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:47.270 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:47.270 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:47.270 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:14:47.270 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:47.270 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:47.270 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:47.270 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:47.270 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:47.270 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:47.270 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:47.270 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:47.270 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:47.270 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.270 00:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:14:47.528 00:09:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:47.528 "name": "Existed_Raid", 00:14:47.528 "uuid": "e0e30402-cd83-4915-a51b-afa369bee9c6", 00:14:47.529 "strip_size_kb": 64, 00:14:47.529 "state": "offline", 00:14:47.529 "raid_level": "concat", 00:14:47.529 "superblock": false, 00:14:47.529 "num_base_bdevs": 3, 00:14:47.529 "num_base_bdevs_discovered": 2, 00:14:47.529 "num_base_bdevs_operational": 2, 00:14:47.529 "base_bdevs_list": [ 00:14:47.529 { 00:14:47.529 "name": null, 00:14:47.529 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:47.529 "is_configured": false, 00:14:47.529 "data_offset": 0, 00:14:47.529 "data_size": 65536 00:14:47.529 }, 00:14:47.529 { 00:14:47.529 "name": "BaseBdev2", 00:14:47.529 "uuid": "02e76c15-68ea-4b67-b86e-93ed9b5cb986", 00:14:47.529 "is_configured": true, 00:14:47.529 "data_offset": 0, 00:14:47.529 "data_size": 65536 00:14:47.529 }, 00:14:47.529 { 00:14:47.529 "name": "BaseBdev3", 00:14:47.529 "uuid": "cbfd2c6c-295b-42e6-8e4c-531fd549b13e", 00:14:47.529 "is_configured": true, 00:14:47.529 "data_offset": 0, 00:14:47.529 "data_size": 65536 00:14:47.529 } 00:14:47.529 ] 00:14:47.529 }' 00:14:47.529 00:09:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:47.529 00:09:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:48.096 00:09:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:48.096 00:09:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:48.096 00:09:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.096 00:09:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:48.356 00:09:35 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:48.356 00:09:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:48.356 00:09:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:48.614 [2024-07-16 00:09:35.331696] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:48.614 00:09:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:48.614 00:09:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:48.614 00:09:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.614 00:09:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:48.873 00:09:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:48.873 00:09:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:48.873 00:09:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:49.132 [2024-07-16 00:09:35.832825] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:49.132 [2024-07-16 00:09:35.832863] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21aa400 name Existed_Raid, state offline 00:14:49.132 00:09:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:49.132 00:09:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:49.132 00:09:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.132 00:09:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:49.392 00:09:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:49.392 00:09:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:49.392 00:09:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:49.392 00:09:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:49.392 00:09:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:49.392 00:09:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:49.392 BaseBdev2 00:14:49.651 00:09:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:49.651 00:09:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:49.651 00:09:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:49.651 00:09:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:49.651 00:09:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:49.651 00:09:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:49.651 00:09:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:49.651 00:09:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:49.911 [ 00:14:49.911 { 00:14:49.911 "name": "BaseBdev2", 00:14:49.911 "aliases": [ 00:14:49.911 "c2f940b1-3ec9-4ef3-8660-0f88f3845509" 00:14:49.911 ], 00:14:49.911 "product_name": "Malloc disk", 00:14:49.911 "block_size": 512, 00:14:49.911 "num_blocks": 65536, 00:14:49.911 "uuid": "c2f940b1-3ec9-4ef3-8660-0f88f3845509", 00:14:49.911 "assigned_rate_limits": { 00:14:49.911 "rw_ios_per_sec": 0, 00:14:49.911 "rw_mbytes_per_sec": 0, 00:14:49.911 "r_mbytes_per_sec": 0, 00:14:49.911 "w_mbytes_per_sec": 0 00:14:49.911 }, 00:14:49.911 "claimed": false, 00:14:49.911 "zoned": false, 00:14:49.911 "supported_io_types": { 00:14:49.911 "read": true, 00:14:49.911 "write": true, 00:14:49.911 "unmap": true, 00:14:49.911 "flush": true, 00:14:49.911 "reset": true, 00:14:49.911 "nvme_admin": false, 00:14:49.911 "nvme_io": false, 00:14:49.911 "nvme_io_md": false, 00:14:49.911 "write_zeroes": true, 00:14:49.911 "zcopy": true, 00:14:49.911 "get_zone_info": false, 00:14:49.911 "zone_management": false, 00:14:49.911 "zone_append": false, 00:14:49.911 "compare": false, 00:14:49.911 "compare_and_write": false, 00:14:49.911 "abort": true, 00:14:49.911 "seek_hole": false, 00:14:49.911 "seek_data": false, 00:14:49.911 "copy": true, 00:14:49.911 "nvme_iov_md": false 00:14:49.911 }, 00:14:49.911 "memory_domains": [ 00:14:49.911 { 00:14:49.911 "dma_device_id": "system", 00:14:49.911 "dma_device_type": 1 00:14:49.911 }, 00:14:49.911 { 00:14:49.911 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:49.911 "dma_device_type": 2 00:14:49.911 } 00:14:49.911 ], 00:14:49.911 "driver_specific": {} 00:14:49.911 } 00:14:49.911 ] 00:14:49.911 00:09:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:49.911 00:09:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:49.911 00:09:36 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:49.911 00:09:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:50.173 BaseBdev3 00:14:50.173 00:09:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:50.173 00:09:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:50.173 00:09:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:50.173 00:09:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:50.173 00:09:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:50.173 00:09:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:50.173 00:09:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:50.432 00:09:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:50.699 [ 00:14:50.699 { 00:14:50.699 "name": "BaseBdev3", 00:14:50.699 "aliases": [ 00:14:50.699 "38983913-fe56-457d-8b75-46ca38d46e41" 00:14:50.699 ], 00:14:50.699 "product_name": "Malloc disk", 00:14:50.699 "block_size": 512, 00:14:50.699 "num_blocks": 65536, 00:14:50.700 "uuid": "38983913-fe56-457d-8b75-46ca38d46e41", 00:14:50.700 "assigned_rate_limits": { 00:14:50.700 "rw_ios_per_sec": 0, 00:14:50.700 "rw_mbytes_per_sec": 0, 00:14:50.700 "r_mbytes_per_sec": 0, 00:14:50.700 "w_mbytes_per_sec": 0 00:14:50.700 }, 00:14:50.700 "claimed": false, 00:14:50.700 "zoned": false, 00:14:50.700 
"supported_io_types": { 00:14:50.700 "read": true, 00:14:50.700 "write": true, 00:14:50.700 "unmap": true, 00:14:50.700 "flush": true, 00:14:50.700 "reset": true, 00:14:50.700 "nvme_admin": false, 00:14:50.700 "nvme_io": false, 00:14:50.700 "nvme_io_md": false, 00:14:50.700 "write_zeroes": true, 00:14:50.700 "zcopy": true, 00:14:50.700 "get_zone_info": false, 00:14:50.700 "zone_management": false, 00:14:50.700 "zone_append": false, 00:14:50.700 "compare": false, 00:14:50.700 "compare_and_write": false, 00:14:50.700 "abort": true, 00:14:50.700 "seek_hole": false, 00:14:50.700 "seek_data": false, 00:14:50.700 "copy": true, 00:14:50.700 "nvme_iov_md": false 00:14:50.700 }, 00:14:50.700 "memory_domains": [ 00:14:50.700 { 00:14:50.700 "dma_device_id": "system", 00:14:50.700 "dma_device_type": 1 00:14:50.700 }, 00:14:50.700 { 00:14:50.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.700 "dma_device_type": 2 00:14:50.700 } 00:14:50.700 ], 00:14:50.700 "driver_specific": {} 00:14:50.700 } 00:14:50.700 ] 00:14:50.700 00:09:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:50.700 00:09:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:50.700 00:09:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:50.700 00:09:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:50.959 [2024-07-16 00:09:37.787710] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:50.959 [2024-07-16 00:09:37.787752] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:50.959 [2024-07-16 00:09:37.787770] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:50.959 
[2024-07-16 00:09:37.789139] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:50.959 00:09:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:50.959 00:09:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:50.959 00:09:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:50.959 00:09:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:50.959 00:09:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:50.959 00:09:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:50.959 00:09:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:50.959 00:09:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:50.959 00:09:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:50.959 00:09:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:50.959 00:09:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.959 00:09:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:51.218 00:09:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:51.218 "name": "Existed_Raid", 00:14:51.218 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:51.218 "strip_size_kb": 64, 00:14:51.218 "state": "configuring", 00:14:51.218 "raid_level": "concat", 00:14:51.218 "superblock": false, 00:14:51.218 "num_base_bdevs": 3, 00:14:51.218 
"num_base_bdevs_discovered": 2, 00:14:51.218 "num_base_bdevs_operational": 3, 00:14:51.218 "base_bdevs_list": [ 00:14:51.218 { 00:14:51.218 "name": "BaseBdev1", 00:14:51.218 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:51.218 "is_configured": false, 00:14:51.218 "data_offset": 0, 00:14:51.218 "data_size": 0 00:14:51.218 }, 00:14:51.218 { 00:14:51.218 "name": "BaseBdev2", 00:14:51.218 "uuid": "c2f940b1-3ec9-4ef3-8660-0f88f3845509", 00:14:51.218 "is_configured": true, 00:14:51.218 "data_offset": 0, 00:14:51.218 "data_size": 65536 00:14:51.218 }, 00:14:51.218 { 00:14:51.218 "name": "BaseBdev3", 00:14:51.218 "uuid": "38983913-fe56-457d-8b75-46ca38d46e41", 00:14:51.218 "is_configured": true, 00:14:51.218 "data_offset": 0, 00:14:51.218 "data_size": 65536 00:14:51.218 } 00:14:51.218 ] 00:14:51.218 }' 00:14:51.218 00:09:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:51.218 00:09:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:51.786 00:09:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:52.045 [2024-07-16 00:09:38.886600] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:52.046 00:09:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:52.046 00:09:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:52.046 00:09:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:52.046 00:09:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:52.046 00:09:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:52.046 00:09:38 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:52.046 00:09:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:52.046 00:09:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:52.046 00:09:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:52.046 00:09:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:52.046 00:09:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.046 00:09:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:52.305 00:09:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:52.305 "name": "Existed_Raid", 00:14:52.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:52.305 "strip_size_kb": 64, 00:14:52.305 "state": "configuring", 00:14:52.305 "raid_level": "concat", 00:14:52.305 "superblock": false, 00:14:52.305 "num_base_bdevs": 3, 00:14:52.305 "num_base_bdevs_discovered": 1, 00:14:52.305 "num_base_bdevs_operational": 3, 00:14:52.305 "base_bdevs_list": [ 00:14:52.305 { 00:14:52.305 "name": "BaseBdev1", 00:14:52.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:52.305 "is_configured": false, 00:14:52.305 "data_offset": 0, 00:14:52.305 "data_size": 0 00:14:52.305 }, 00:14:52.305 { 00:14:52.305 "name": null, 00:14:52.305 "uuid": "c2f940b1-3ec9-4ef3-8660-0f88f3845509", 00:14:52.305 "is_configured": false, 00:14:52.305 "data_offset": 0, 00:14:52.305 "data_size": 65536 00:14:52.305 }, 00:14:52.305 { 00:14:52.305 "name": "BaseBdev3", 00:14:52.305 "uuid": "38983913-fe56-457d-8b75-46ca38d46e41", 00:14:52.305 "is_configured": true, 00:14:52.305 "data_offset": 0, 
00:14:52.305 "data_size": 65536 00:14:52.305 } 00:14:52.305 ] 00:14:52.305 }' 00:14:52.305 00:09:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:52.305 00:09:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:52.873 00:09:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.873 00:09:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:53.132 00:09:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:53.132 00:09:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:53.391 [2024-07-16 00:09:40.237634] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:53.391 BaseBdev1 00:14:53.391 00:09:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:53.391 00:09:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:53.391 00:09:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:53.391 00:09:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:53.391 00:09:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:53.391 00:09:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:53.391 00:09:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:53.649 00:09:40 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:53.909 [ 00:14:53.909 { 00:14:53.909 "name": "BaseBdev1", 00:14:53.909 "aliases": [ 00:14:53.909 "f9074a64-943b-4632-8302-6529008dbec3" 00:14:53.909 ], 00:14:53.909 "product_name": "Malloc disk", 00:14:53.909 "block_size": 512, 00:14:53.909 "num_blocks": 65536, 00:14:53.909 "uuid": "f9074a64-943b-4632-8302-6529008dbec3", 00:14:53.909 "assigned_rate_limits": { 00:14:53.909 "rw_ios_per_sec": 0, 00:14:53.909 "rw_mbytes_per_sec": 0, 00:14:53.909 "r_mbytes_per_sec": 0, 00:14:53.909 "w_mbytes_per_sec": 0 00:14:53.909 }, 00:14:53.909 "claimed": true, 00:14:53.909 "claim_type": "exclusive_write", 00:14:53.909 "zoned": false, 00:14:53.909 "supported_io_types": { 00:14:53.909 "read": true, 00:14:53.909 "write": true, 00:14:53.909 "unmap": true, 00:14:53.909 "flush": true, 00:14:53.909 "reset": true, 00:14:53.909 "nvme_admin": false, 00:14:53.909 "nvme_io": false, 00:14:53.909 "nvme_io_md": false, 00:14:53.909 "write_zeroes": true, 00:14:53.909 "zcopy": true, 00:14:53.909 "get_zone_info": false, 00:14:53.909 "zone_management": false, 00:14:53.909 "zone_append": false, 00:14:53.909 "compare": false, 00:14:53.909 "compare_and_write": false, 00:14:53.909 "abort": true, 00:14:53.909 "seek_hole": false, 00:14:53.909 "seek_data": false, 00:14:53.909 "copy": true, 00:14:53.909 "nvme_iov_md": false 00:14:53.909 }, 00:14:53.909 "memory_domains": [ 00:14:53.909 { 00:14:53.909 "dma_device_id": "system", 00:14:53.909 "dma_device_type": 1 00:14:53.909 }, 00:14:53.909 { 00:14:53.909 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:53.909 "dma_device_type": 2 00:14:53.909 } 00:14:53.909 ], 00:14:53.909 "driver_specific": {} 00:14:53.909 } 00:14:53.909 ] 00:14:53.909 00:09:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:53.909 00:09:40 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:53.909 00:09:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:53.909 00:09:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:53.909 00:09:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:53.909 00:09:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:53.909 00:09:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:53.909 00:09:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:53.909 00:09:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:53.909 00:09:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:53.909 00:09:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:53.909 00:09:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.909 00:09:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:54.169 00:09:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:54.169 "name": "Existed_Raid", 00:14:54.169 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:54.169 "strip_size_kb": 64, 00:14:54.169 "state": "configuring", 00:14:54.169 "raid_level": "concat", 00:14:54.169 "superblock": false, 00:14:54.169 "num_base_bdevs": 3, 00:14:54.169 "num_base_bdevs_discovered": 2, 00:14:54.169 "num_base_bdevs_operational": 3, 00:14:54.169 "base_bdevs_list": [ 00:14:54.169 { 
00:14:54.169 "name": "BaseBdev1", 00:14:54.169 "uuid": "f9074a64-943b-4632-8302-6529008dbec3", 00:14:54.169 "is_configured": true, 00:14:54.169 "data_offset": 0, 00:14:54.169 "data_size": 65536 00:14:54.169 }, 00:14:54.169 { 00:14:54.169 "name": null, 00:14:54.169 "uuid": "c2f940b1-3ec9-4ef3-8660-0f88f3845509", 00:14:54.169 "is_configured": false, 00:14:54.169 "data_offset": 0, 00:14:54.169 "data_size": 65536 00:14:54.169 }, 00:14:54.169 { 00:14:54.169 "name": "BaseBdev3", 00:14:54.169 "uuid": "38983913-fe56-457d-8b75-46ca38d46e41", 00:14:54.169 "is_configured": true, 00:14:54.169 "data_offset": 0, 00:14:54.169 "data_size": 65536 00:14:54.169 } 00:14:54.169 ] 00:14:54.169 }' 00:14:54.169 00:09:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:54.169 00:09:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:54.738 00:09:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.738 00:09:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:54.738 00:09:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:54.738 00:09:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:54.997 [2024-07-16 00:09:41.886045] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:54.997 00:09:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:54.997 00:09:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:54.997 00:09:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 
-- # local expected_state=configuring 00:14:54.997 00:09:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:54.997 00:09:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:54.997 00:09:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:54.997 00:09:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:54.997 00:09:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:54.997 00:09:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:54.997 00:09:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:54.997 00:09:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.997 00:09:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:55.256 00:09:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:55.256 "name": "Existed_Raid", 00:14:55.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:55.256 "strip_size_kb": 64, 00:14:55.256 "state": "configuring", 00:14:55.256 "raid_level": "concat", 00:14:55.256 "superblock": false, 00:14:55.256 "num_base_bdevs": 3, 00:14:55.256 "num_base_bdevs_discovered": 1, 00:14:55.256 "num_base_bdevs_operational": 3, 00:14:55.256 "base_bdevs_list": [ 00:14:55.256 { 00:14:55.256 "name": "BaseBdev1", 00:14:55.256 "uuid": "f9074a64-943b-4632-8302-6529008dbec3", 00:14:55.256 "is_configured": true, 00:14:55.256 "data_offset": 0, 00:14:55.256 "data_size": 65536 00:14:55.256 }, 00:14:55.256 { 00:14:55.256 "name": null, 00:14:55.256 "uuid": "c2f940b1-3ec9-4ef3-8660-0f88f3845509", 00:14:55.256 
"is_configured": false, 00:14:55.256 "data_offset": 0, 00:14:55.256 "data_size": 65536 00:14:55.256 }, 00:14:55.256 { 00:14:55.256 "name": null, 00:14:55.257 "uuid": "38983913-fe56-457d-8b75-46ca38d46e41", 00:14:55.257 "is_configured": false, 00:14:55.257 "data_offset": 0, 00:14:55.257 "data_size": 65536 00:14:55.257 } 00:14:55.257 ] 00:14:55.257 }' 00:14:55.257 00:09:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:55.257 00:09:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:55.894 00:09:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.894 00:09:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:56.152 00:09:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:56.152 00:09:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:56.412 [2024-07-16 00:09:43.245695] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:56.412 00:09:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:56.412 00:09:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:56.412 00:09:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:56.412 00:09:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:56.412 00:09:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:56.412 00:09:43 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:56.412 00:09:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:56.412 00:09:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:56.412 00:09:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:56.412 00:09:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:56.412 00:09:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:56.412 00:09:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:56.671 00:09:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:56.671 "name": "Existed_Raid", 00:14:56.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:56.671 "strip_size_kb": 64, 00:14:56.671 "state": "configuring", 00:14:56.671 "raid_level": "concat", 00:14:56.671 "superblock": false, 00:14:56.671 "num_base_bdevs": 3, 00:14:56.671 "num_base_bdevs_discovered": 2, 00:14:56.671 "num_base_bdevs_operational": 3, 00:14:56.671 "base_bdevs_list": [ 00:14:56.671 { 00:14:56.671 "name": "BaseBdev1", 00:14:56.671 "uuid": "f9074a64-943b-4632-8302-6529008dbec3", 00:14:56.671 "is_configured": true, 00:14:56.671 "data_offset": 0, 00:14:56.671 "data_size": 65536 00:14:56.671 }, 00:14:56.671 { 00:14:56.671 "name": null, 00:14:56.671 "uuid": "c2f940b1-3ec9-4ef3-8660-0f88f3845509", 00:14:56.671 "is_configured": false, 00:14:56.671 "data_offset": 0, 00:14:56.671 "data_size": 65536 00:14:56.671 }, 00:14:56.671 { 00:14:56.671 "name": "BaseBdev3", 00:14:56.671 "uuid": "38983913-fe56-457d-8b75-46ca38d46e41", 00:14:56.671 "is_configured": true, 00:14:56.671 "data_offset": 0, 
00:14:56.671 "data_size": 65536 00:14:56.671 } 00:14:56.671 ] 00:14:56.671 }' 00:14:56.671 00:09:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:56.671 00:09:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:57.236 00:09:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.236 00:09:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:57.494 00:09:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:57.494 00:09:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:57.753 [2024-07-16 00:09:44.581249] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:57.753 00:09:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:57.753 00:09:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:57.753 00:09:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:57.753 00:09:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:57.753 00:09:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:57.753 00:09:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:57.753 00:09:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:57.753 00:09:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:57.753 
00:09:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:57.753 00:09:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:57.753 00:09:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.753 00:09:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:58.011 00:09:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:58.011 "name": "Existed_Raid", 00:14:58.011 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:58.011 "strip_size_kb": 64, 00:14:58.011 "state": "configuring", 00:14:58.011 "raid_level": "concat", 00:14:58.011 "superblock": false, 00:14:58.011 "num_base_bdevs": 3, 00:14:58.011 "num_base_bdevs_discovered": 1, 00:14:58.011 "num_base_bdevs_operational": 3, 00:14:58.011 "base_bdevs_list": [ 00:14:58.011 { 00:14:58.011 "name": null, 00:14:58.011 "uuid": "f9074a64-943b-4632-8302-6529008dbec3", 00:14:58.011 "is_configured": false, 00:14:58.011 "data_offset": 0, 00:14:58.011 "data_size": 65536 00:14:58.011 }, 00:14:58.011 { 00:14:58.011 "name": null, 00:14:58.011 "uuid": "c2f940b1-3ec9-4ef3-8660-0f88f3845509", 00:14:58.011 "is_configured": false, 00:14:58.011 "data_offset": 0, 00:14:58.011 "data_size": 65536 00:14:58.011 }, 00:14:58.011 { 00:14:58.011 "name": "BaseBdev3", 00:14:58.011 "uuid": "38983913-fe56-457d-8b75-46ca38d46e41", 00:14:58.011 "is_configured": true, 00:14:58.011 "data_offset": 0, 00:14:58.011 "data_size": 65536 00:14:58.011 } 00:14:58.011 ] 00:14:58.011 }' 00:14:58.011 00:09:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:58.011 00:09:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:58.945 00:09:45 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:58.946 00:09:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:58.946 00:09:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:58.946 00:09:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:59.204 [2024-07-16 00:09:45.997496] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:59.204 00:09:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:59.204 00:09:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:59.204 00:09:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:59.204 00:09:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:59.204 00:09:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:59.204 00:09:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:59.204 00:09:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:59.204 00:09:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:59.204 00:09:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:59.204 00:09:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:59.204 00:09:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:59.204 00:09:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:59.771 00:09:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:59.771 "name": "Existed_Raid", 00:14:59.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:59.771 "strip_size_kb": 64, 00:14:59.771 "state": "configuring", 00:14:59.771 "raid_level": "concat", 00:14:59.771 "superblock": false, 00:14:59.771 "num_base_bdevs": 3, 00:14:59.771 "num_base_bdevs_discovered": 2, 00:14:59.771 "num_base_bdevs_operational": 3, 00:14:59.771 "base_bdevs_list": [ 00:14:59.771 { 00:14:59.771 "name": null, 00:14:59.771 "uuid": "f9074a64-943b-4632-8302-6529008dbec3", 00:14:59.771 "is_configured": false, 00:14:59.771 "data_offset": 0, 00:14:59.771 "data_size": 65536 00:14:59.771 }, 00:14:59.771 { 00:14:59.771 "name": "BaseBdev2", 00:14:59.771 "uuid": "c2f940b1-3ec9-4ef3-8660-0f88f3845509", 00:14:59.771 "is_configured": true, 00:14:59.771 "data_offset": 0, 00:14:59.771 "data_size": 65536 00:14:59.771 }, 00:14:59.771 { 00:14:59.771 "name": "BaseBdev3", 00:14:59.771 "uuid": "38983913-fe56-457d-8b75-46ca38d46e41", 00:14:59.771 "is_configured": true, 00:14:59.771 "data_offset": 0, 00:14:59.771 "data_size": 65536 00:14:59.771 } 00:14:59.771 ] 00:14:59.771 }' 00:14:59.771 00:09:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:59.771 00:09:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:00.337 00:09:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:00.337 00:09:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.595 
00:09:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:00.595 00:09:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.595 00:09:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:00.852 00:09:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f9074a64-943b-4632-8302-6529008dbec3 00:15:01.110 [2024-07-16 00:09:47.846943] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:01.110 [2024-07-16 00:09:47.846978] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21a8450 00:15:01.110 [2024-07-16 00:09:47.846986] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:15:01.110 [2024-07-16 00:09:47.847179] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21a9ed0 00:15:01.110 [2024-07-16 00:09:47.847292] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21a8450 00:15:01.110 [2024-07-16 00:09:47.847301] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x21a8450 00:15:01.110 [2024-07-16 00:09:47.847464] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:01.110 NewBaseBdev 00:15:01.110 00:09:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:01.110 00:09:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:01.110 00:09:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:01.110 00:09:47 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:15:01.110 00:09:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:01.110 00:09:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:01.110 00:09:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:01.368 00:09:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:01.626 [ 00:15:01.626 { 00:15:01.626 "name": "NewBaseBdev", 00:15:01.626 "aliases": [ 00:15:01.626 "f9074a64-943b-4632-8302-6529008dbec3" 00:15:01.626 ], 00:15:01.626 "product_name": "Malloc disk", 00:15:01.626 "block_size": 512, 00:15:01.626 "num_blocks": 65536, 00:15:01.626 "uuid": "f9074a64-943b-4632-8302-6529008dbec3", 00:15:01.626 "assigned_rate_limits": { 00:15:01.626 "rw_ios_per_sec": 0, 00:15:01.626 "rw_mbytes_per_sec": 0, 00:15:01.626 "r_mbytes_per_sec": 0, 00:15:01.626 "w_mbytes_per_sec": 0 00:15:01.626 }, 00:15:01.626 "claimed": true, 00:15:01.626 "claim_type": "exclusive_write", 00:15:01.626 "zoned": false, 00:15:01.626 "supported_io_types": { 00:15:01.626 "read": true, 00:15:01.626 "write": true, 00:15:01.626 "unmap": true, 00:15:01.626 "flush": true, 00:15:01.626 "reset": true, 00:15:01.626 "nvme_admin": false, 00:15:01.626 "nvme_io": false, 00:15:01.626 "nvme_io_md": false, 00:15:01.626 "write_zeroes": true, 00:15:01.626 "zcopy": true, 00:15:01.626 "get_zone_info": false, 00:15:01.626 "zone_management": false, 00:15:01.626 "zone_append": false, 00:15:01.626 "compare": false, 00:15:01.626 "compare_and_write": false, 00:15:01.626 "abort": true, 00:15:01.626 "seek_hole": false, 00:15:01.626 "seek_data": false, 00:15:01.626 "copy": true, 00:15:01.626 "nvme_iov_md": 
false 00:15:01.626 }, 00:15:01.626 "memory_domains": [ 00:15:01.626 { 00:15:01.626 "dma_device_id": "system", 00:15:01.626 "dma_device_type": 1 00:15:01.627 }, 00:15:01.627 { 00:15:01.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.627 "dma_device_type": 2 00:15:01.627 } 00:15:01.627 ], 00:15:01.627 "driver_specific": {} 00:15:01.627 } 00:15:01.627 ] 00:15:01.627 00:09:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:01.627 00:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:01.627 00:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:01.627 00:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:01.627 00:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:01.627 00:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:01.627 00:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:01.627 00:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:01.627 00:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:01.627 00:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:01.627 00:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:01.627 00:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:01.627 00:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:01.885 00:09:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:01.885 "name": "Existed_Raid", 00:15:01.885 "uuid": "dbdf92b9-e1e7-4c5f-b1ac-afec9a2ec3bf", 00:15:01.885 "strip_size_kb": 64, 00:15:01.885 "state": "online", 00:15:01.885 "raid_level": "concat", 00:15:01.885 "superblock": false, 00:15:01.885 "num_base_bdevs": 3, 00:15:01.885 "num_base_bdevs_discovered": 3, 00:15:01.885 "num_base_bdevs_operational": 3, 00:15:01.885 "base_bdevs_list": [ 00:15:01.885 { 00:15:01.885 "name": "NewBaseBdev", 00:15:01.885 "uuid": "f9074a64-943b-4632-8302-6529008dbec3", 00:15:01.885 "is_configured": true, 00:15:01.885 "data_offset": 0, 00:15:01.885 "data_size": 65536 00:15:01.885 }, 00:15:01.885 { 00:15:01.885 "name": "BaseBdev2", 00:15:01.885 "uuid": "c2f940b1-3ec9-4ef3-8660-0f88f3845509", 00:15:01.885 "is_configured": true, 00:15:01.885 "data_offset": 0, 00:15:01.885 "data_size": 65536 00:15:01.885 }, 00:15:01.885 { 00:15:01.885 "name": "BaseBdev3", 00:15:01.885 "uuid": "38983913-fe56-457d-8b75-46ca38d46e41", 00:15:01.885 "is_configured": true, 00:15:01.885 "data_offset": 0, 00:15:01.885 "data_size": 65536 00:15:01.885 } 00:15:01.885 ] 00:15:01.885 }' 00:15:01.885 00:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:01.885 00:09:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:02.452 00:09:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:02.452 00:09:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:02.452 00:09:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:02.452 00:09:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:02.452 00:09:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:02.452 00:09:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:02.452 00:09:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:02.452 00:09:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:02.711 [2024-07-16 00:09:49.447478] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:02.711 00:09:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:02.711 "name": "Existed_Raid", 00:15:02.711 "aliases": [ 00:15:02.711 "dbdf92b9-e1e7-4c5f-b1ac-afec9a2ec3bf" 00:15:02.711 ], 00:15:02.711 "product_name": "Raid Volume", 00:15:02.711 "block_size": 512, 00:15:02.711 "num_blocks": 196608, 00:15:02.711 "uuid": "dbdf92b9-e1e7-4c5f-b1ac-afec9a2ec3bf", 00:15:02.711 "assigned_rate_limits": { 00:15:02.711 "rw_ios_per_sec": 0, 00:15:02.711 "rw_mbytes_per_sec": 0, 00:15:02.711 "r_mbytes_per_sec": 0, 00:15:02.711 "w_mbytes_per_sec": 0 00:15:02.711 }, 00:15:02.711 "claimed": false, 00:15:02.711 "zoned": false, 00:15:02.711 "supported_io_types": { 00:15:02.711 "read": true, 00:15:02.711 "write": true, 00:15:02.711 "unmap": true, 00:15:02.711 "flush": true, 00:15:02.711 "reset": true, 00:15:02.711 "nvme_admin": false, 00:15:02.711 "nvme_io": false, 00:15:02.711 "nvme_io_md": false, 00:15:02.711 "write_zeroes": true, 00:15:02.711 "zcopy": false, 00:15:02.711 "get_zone_info": false, 00:15:02.711 "zone_management": false, 00:15:02.711 "zone_append": false, 00:15:02.711 "compare": false, 00:15:02.711 "compare_and_write": false, 00:15:02.711 "abort": false, 00:15:02.711 "seek_hole": false, 00:15:02.711 "seek_data": false, 00:15:02.711 "copy": false, 00:15:02.711 "nvme_iov_md": false 00:15:02.711 }, 00:15:02.711 "memory_domains": [ 00:15:02.711 { 00:15:02.711 "dma_device_id": "system", 00:15:02.711 "dma_device_type": 1 00:15:02.711 }, 
00:15:02.711 { 00:15:02.711 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:02.711 "dma_device_type": 2 00:15:02.711 }, 00:15:02.711 { 00:15:02.711 "dma_device_id": "system", 00:15:02.711 "dma_device_type": 1 00:15:02.711 }, 00:15:02.711 { 00:15:02.711 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:02.711 "dma_device_type": 2 00:15:02.711 }, 00:15:02.711 { 00:15:02.711 "dma_device_id": "system", 00:15:02.711 "dma_device_type": 1 00:15:02.711 }, 00:15:02.711 { 00:15:02.711 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:02.711 "dma_device_type": 2 00:15:02.712 } 00:15:02.712 ], 00:15:02.712 "driver_specific": { 00:15:02.712 "raid": { 00:15:02.712 "uuid": "dbdf92b9-e1e7-4c5f-b1ac-afec9a2ec3bf", 00:15:02.712 "strip_size_kb": 64, 00:15:02.712 "state": "online", 00:15:02.712 "raid_level": "concat", 00:15:02.712 "superblock": false, 00:15:02.712 "num_base_bdevs": 3, 00:15:02.712 "num_base_bdevs_discovered": 3, 00:15:02.712 "num_base_bdevs_operational": 3, 00:15:02.712 "base_bdevs_list": [ 00:15:02.712 { 00:15:02.712 "name": "NewBaseBdev", 00:15:02.712 "uuid": "f9074a64-943b-4632-8302-6529008dbec3", 00:15:02.712 "is_configured": true, 00:15:02.712 "data_offset": 0, 00:15:02.712 "data_size": 65536 00:15:02.712 }, 00:15:02.712 { 00:15:02.712 "name": "BaseBdev2", 00:15:02.712 "uuid": "c2f940b1-3ec9-4ef3-8660-0f88f3845509", 00:15:02.712 "is_configured": true, 00:15:02.712 "data_offset": 0, 00:15:02.712 "data_size": 65536 00:15:02.712 }, 00:15:02.712 { 00:15:02.712 "name": "BaseBdev3", 00:15:02.712 "uuid": "38983913-fe56-457d-8b75-46ca38d46e41", 00:15:02.712 "is_configured": true, 00:15:02.712 "data_offset": 0, 00:15:02.712 "data_size": 65536 00:15:02.712 } 00:15:02.712 ] 00:15:02.712 } 00:15:02.712 } 00:15:02.712 }' 00:15:02.712 00:09:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:02.712 00:09:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:15:02.712 BaseBdev2 00:15:02.712 BaseBdev3' 00:15:02.712 00:09:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:02.712 00:09:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:02.712 00:09:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:02.971 00:09:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:02.971 "name": "NewBaseBdev", 00:15:02.971 "aliases": [ 00:15:02.971 "f9074a64-943b-4632-8302-6529008dbec3" 00:15:02.971 ], 00:15:02.971 "product_name": "Malloc disk", 00:15:02.971 "block_size": 512, 00:15:02.971 "num_blocks": 65536, 00:15:02.971 "uuid": "f9074a64-943b-4632-8302-6529008dbec3", 00:15:02.971 "assigned_rate_limits": { 00:15:02.971 "rw_ios_per_sec": 0, 00:15:02.971 "rw_mbytes_per_sec": 0, 00:15:02.971 "r_mbytes_per_sec": 0, 00:15:02.971 "w_mbytes_per_sec": 0 00:15:02.971 }, 00:15:02.971 "claimed": true, 00:15:02.971 "claim_type": "exclusive_write", 00:15:02.971 "zoned": false, 00:15:02.971 "supported_io_types": { 00:15:02.971 "read": true, 00:15:02.971 "write": true, 00:15:02.971 "unmap": true, 00:15:02.971 "flush": true, 00:15:02.971 "reset": true, 00:15:02.971 "nvme_admin": false, 00:15:02.971 "nvme_io": false, 00:15:02.971 "nvme_io_md": false, 00:15:02.971 "write_zeroes": true, 00:15:02.971 "zcopy": true, 00:15:02.971 "get_zone_info": false, 00:15:02.971 "zone_management": false, 00:15:02.971 "zone_append": false, 00:15:02.971 "compare": false, 00:15:02.971 "compare_and_write": false, 00:15:02.971 "abort": true, 00:15:02.971 "seek_hole": false, 00:15:02.971 "seek_data": false, 00:15:02.971 "copy": true, 00:15:02.971 "nvme_iov_md": false 00:15:02.971 }, 00:15:02.971 "memory_domains": [ 00:15:02.971 { 00:15:02.971 "dma_device_id": "system", 00:15:02.971 
"dma_device_type": 1 00:15:02.971 }, 00:15:02.971 { 00:15:02.971 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:02.971 "dma_device_type": 2 00:15:02.971 } 00:15:02.971 ], 00:15:02.971 "driver_specific": {} 00:15:02.971 }' 00:15:02.971 00:09:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:02.971 00:09:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:03.229 00:09:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:03.229 00:09:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:03.229 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:03.229 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:03.229 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:03.229 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:03.229 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:03.229 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:03.229 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:03.487 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:03.487 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:03.487 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:03.487 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:03.746 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:03.746 "name": 
"BaseBdev2", 00:15:03.746 "aliases": [ 00:15:03.746 "c2f940b1-3ec9-4ef3-8660-0f88f3845509" 00:15:03.746 ], 00:15:03.746 "product_name": "Malloc disk", 00:15:03.746 "block_size": 512, 00:15:03.746 "num_blocks": 65536, 00:15:03.746 "uuid": "c2f940b1-3ec9-4ef3-8660-0f88f3845509", 00:15:03.746 "assigned_rate_limits": { 00:15:03.746 "rw_ios_per_sec": 0, 00:15:03.746 "rw_mbytes_per_sec": 0, 00:15:03.746 "r_mbytes_per_sec": 0, 00:15:03.746 "w_mbytes_per_sec": 0 00:15:03.746 }, 00:15:03.746 "claimed": true, 00:15:03.746 "claim_type": "exclusive_write", 00:15:03.746 "zoned": false, 00:15:03.746 "supported_io_types": { 00:15:03.746 "read": true, 00:15:03.746 "write": true, 00:15:03.746 "unmap": true, 00:15:03.746 "flush": true, 00:15:03.746 "reset": true, 00:15:03.746 "nvme_admin": false, 00:15:03.746 "nvme_io": false, 00:15:03.746 "nvme_io_md": false, 00:15:03.746 "write_zeroes": true, 00:15:03.746 "zcopy": true, 00:15:03.746 "get_zone_info": false, 00:15:03.746 "zone_management": false, 00:15:03.746 "zone_append": false, 00:15:03.746 "compare": false, 00:15:03.746 "compare_and_write": false, 00:15:03.746 "abort": true, 00:15:03.746 "seek_hole": false, 00:15:03.746 "seek_data": false, 00:15:03.746 "copy": true, 00:15:03.746 "nvme_iov_md": false 00:15:03.746 }, 00:15:03.746 "memory_domains": [ 00:15:03.746 { 00:15:03.746 "dma_device_id": "system", 00:15:03.746 "dma_device_type": 1 00:15:03.746 }, 00:15:03.746 { 00:15:03.746 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.746 "dma_device_type": 2 00:15:03.746 } 00:15:03.746 ], 00:15:03.746 "driver_specific": {} 00:15:03.746 }' 00:15:03.746 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:03.746 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:03.746 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:03.746 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:15:03.746 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:03.746 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:03.746 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:03.746 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:04.004 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:04.004 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:04.004 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:04.004 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:04.004 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:04.004 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:04.004 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:04.261 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:04.261 "name": "BaseBdev3", 00:15:04.261 "aliases": [ 00:15:04.261 "38983913-fe56-457d-8b75-46ca38d46e41" 00:15:04.261 ], 00:15:04.261 "product_name": "Malloc disk", 00:15:04.261 "block_size": 512, 00:15:04.261 "num_blocks": 65536, 00:15:04.261 "uuid": "38983913-fe56-457d-8b75-46ca38d46e41", 00:15:04.261 "assigned_rate_limits": { 00:15:04.261 "rw_ios_per_sec": 0, 00:15:04.261 "rw_mbytes_per_sec": 0, 00:15:04.261 "r_mbytes_per_sec": 0, 00:15:04.261 "w_mbytes_per_sec": 0 00:15:04.261 }, 00:15:04.261 "claimed": true, 00:15:04.261 "claim_type": "exclusive_write", 00:15:04.261 "zoned": false, 00:15:04.261 "supported_io_types": { 
00:15:04.261 "read": true, 00:15:04.261 "write": true, 00:15:04.261 "unmap": true, 00:15:04.261 "flush": true, 00:15:04.261 "reset": true, 00:15:04.261 "nvme_admin": false, 00:15:04.261 "nvme_io": false, 00:15:04.261 "nvme_io_md": false, 00:15:04.261 "write_zeroes": true, 00:15:04.261 "zcopy": true, 00:15:04.261 "get_zone_info": false, 00:15:04.261 "zone_management": false, 00:15:04.261 "zone_append": false, 00:15:04.261 "compare": false, 00:15:04.261 "compare_and_write": false, 00:15:04.261 "abort": true, 00:15:04.261 "seek_hole": false, 00:15:04.261 "seek_data": false, 00:15:04.261 "copy": true, 00:15:04.261 "nvme_iov_md": false 00:15:04.261 }, 00:15:04.261 "memory_domains": [ 00:15:04.261 { 00:15:04.261 "dma_device_id": "system", 00:15:04.261 "dma_device_type": 1 00:15:04.261 }, 00:15:04.261 { 00:15:04.262 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:04.262 "dma_device_type": 2 00:15:04.262 } 00:15:04.262 ], 00:15:04.262 "driver_specific": {} 00:15:04.262 }' 00:15:04.262 00:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:04.262 00:09:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:04.262 00:09:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:04.262 00:09:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:04.262 00:09:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:04.262 00:09:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:04.262 00:09:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:04.262 00:09:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:04.520 00:09:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:04.520 00:09:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:15:04.520 00:09:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:04.520 00:09:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:04.520 00:09:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:05.093 [2024-07-16 00:09:51.821544] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:05.093 [2024-07-16 00:09:51.821573] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:05.093 [2024-07-16 00:09:51.821625] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:05.093 [2024-07-16 00:09:51.821671] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:05.093 [2024-07-16 00:09:51.821682] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21a8450 name Existed_Raid, state offline 00:15:05.093 00:09:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 3522433 00:15:05.093 00:09:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 3522433 ']' 00:15:05.093 00:09:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 3522433 00:15:05.093 00:09:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:15:05.093 00:09:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:05.093 00:09:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3522433 00:15:05.093 00:09:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:05.093 00:09:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:15:05.093 00:09:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3522433' 00:15:05.093 killing process with pid 3522433 00:15:05.093 00:09:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 3522433 00:15:05.093 [2024-07-16 00:09:51.901901] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:05.093 00:09:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 3522433 00:15:05.093 [2024-07-16 00:09:51.933368] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:05.353 00:15:05.353 real 0m28.989s 00:15:05.353 user 0m53.193s 00:15:05.353 sys 0m5.132s 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:05.353 ************************************ 00:15:05.353 END TEST raid_state_function_test 00:15:05.353 ************************************ 00:15:05.353 00:09:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:05.353 00:09:52 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:15:05.353 00:09:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:05.353 00:09:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:05.353 00:09:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:05.353 ************************************ 00:15:05.353 START TEST raid_state_function_test_sb 00:15:05.353 ************************************ 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=3526728 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3526728' 00:15:05.353 Process raid pid: 3526728 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 3526728 /var/tmp/spdk-raid.sock 00:15:05.353 00:09:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3526728 ']' 00:15:05.354 00:09:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:05.354 00:09:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:05.354 00:09:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:15:05.354 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:05.354 00:09:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:05.354 00:09:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:05.612 [2024-07-16 00:09:52.312599] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:15:05.612 [2024-07-16 00:09:52.312667] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:05.612 [2024-07-16 00:09:52.441350] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:05.612 [2024-07-16 00:09:52.544922] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:05.871 [2024-07-16 00:09:52.607909] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:05.871 [2024-07-16 00:09:52.607950] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:06.438 00:09:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:06.438 00:09:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:15:06.438 00:09:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:07.002 [2024-07-16 00:09:53.732350] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:07.002 [2024-07-16 00:09:53.732389] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:07.002 [2024-07-16 00:09:53.732400] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:07.002 [2024-07-16 00:09:53.732413] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:07.002 [2024-07-16 00:09:53.732422] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:07.002 [2024-07-16 00:09:53.732434] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:07.002 00:09:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:07.002 00:09:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:07.002 00:09:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:07.002 00:09:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:07.002 00:09:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:07.002 00:09:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:07.002 00:09:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:07.002 00:09:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:07.002 00:09:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:07.002 00:09:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:07.002 00:09:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.002 00:09:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:15:07.261 00:09:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:07.261 "name": "Existed_Raid", 00:15:07.261 "uuid": "bfdbef5c-73fe-4cd4-9290-94077ad8eda7", 00:15:07.261 "strip_size_kb": 64, 00:15:07.261 "state": "configuring", 00:15:07.261 "raid_level": "concat", 00:15:07.261 "superblock": true, 00:15:07.261 "num_base_bdevs": 3, 00:15:07.261 "num_base_bdevs_discovered": 0, 00:15:07.261 "num_base_bdevs_operational": 3, 00:15:07.261 "base_bdevs_list": [ 00:15:07.261 { 00:15:07.261 "name": "BaseBdev1", 00:15:07.261 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.261 "is_configured": false, 00:15:07.261 "data_offset": 0, 00:15:07.261 "data_size": 0 00:15:07.261 }, 00:15:07.261 { 00:15:07.261 "name": "BaseBdev2", 00:15:07.261 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.261 "is_configured": false, 00:15:07.261 "data_offset": 0, 00:15:07.261 "data_size": 0 00:15:07.261 }, 00:15:07.261 { 00:15:07.261 "name": "BaseBdev3", 00:15:07.261 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.261 "is_configured": false, 00:15:07.261 "data_offset": 0, 00:15:07.261 "data_size": 0 00:15:07.261 } 00:15:07.261 ] 00:15:07.261 }' 00:15:07.261 00:09:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:07.261 00:09:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:07.826 00:09:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:08.082 [2024-07-16 00:09:54.835106] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:08.082 [2024-07-16 00:09:54.835132] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e09a80 name Existed_Raid, state configuring 00:15:08.082 00:09:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:08.339 [2024-07-16 00:09:55.083790] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:08.339 [2024-07-16 00:09:55.083816] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:08.339 [2024-07-16 00:09:55.083826] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:08.339 [2024-07-16 00:09:55.083837] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:08.339 [2024-07-16 00:09:55.083846] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:08.339 [2024-07-16 00:09:55.083857] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:08.339 00:09:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:08.597 [2024-07-16 00:09:55.334317] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:08.597 BaseBdev1 00:15:08.597 00:09:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:08.597 00:09:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:08.597 00:09:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:08.597 00:09:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:08.597 00:09:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:08.597 00:09:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:15:08.597 00:09:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:08.856 00:09:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:09.115 [ 00:15:09.115 { 00:15:09.115 "name": "BaseBdev1", 00:15:09.115 "aliases": [ 00:15:09.115 "90b64f75-41a3-40f2-a51d-5c987896c1d2" 00:15:09.115 ], 00:15:09.115 "product_name": "Malloc disk", 00:15:09.115 "block_size": 512, 00:15:09.115 "num_blocks": 65536, 00:15:09.115 "uuid": "90b64f75-41a3-40f2-a51d-5c987896c1d2", 00:15:09.115 "assigned_rate_limits": { 00:15:09.115 "rw_ios_per_sec": 0, 00:15:09.115 "rw_mbytes_per_sec": 0, 00:15:09.115 "r_mbytes_per_sec": 0, 00:15:09.115 "w_mbytes_per_sec": 0 00:15:09.115 }, 00:15:09.115 "claimed": true, 00:15:09.115 "claim_type": "exclusive_write", 00:15:09.115 "zoned": false, 00:15:09.115 "supported_io_types": { 00:15:09.115 "read": true, 00:15:09.115 "write": true, 00:15:09.115 "unmap": true, 00:15:09.115 "flush": true, 00:15:09.115 "reset": true, 00:15:09.115 "nvme_admin": false, 00:15:09.115 "nvme_io": false, 00:15:09.115 "nvme_io_md": false, 00:15:09.115 "write_zeroes": true, 00:15:09.115 "zcopy": true, 00:15:09.115 "get_zone_info": false, 00:15:09.115 "zone_management": false, 00:15:09.115 "zone_append": false, 00:15:09.115 "compare": false, 00:15:09.115 "compare_and_write": false, 00:15:09.115 "abort": true, 00:15:09.115 "seek_hole": false, 00:15:09.115 "seek_data": false, 00:15:09.115 "copy": true, 00:15:09.115 "nvme_iov_md": false 00:15:09.115 }, 00:15:09.115 "memory_domains": [ 00:15:09.115 { 00:15:09.115 "dma_device_id": "system", 00:15:09.115 "dma_device_type": 1 00:15:09.115 }, 00:15:09.115 { 00:15:09.115 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.115 
"dma_device_type": 2 00:15:09.115 } 00:15:09.115 ], 00:15:09.115 "driver_specific": {} 00:15:09.115 } 00:15:09.115 ] 00:15:09.115 00:09:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:09.115 00:09:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:09.115 00:09:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:09.115 00:09:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:09.115 00:09:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:09.115 00:09:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:09.115 00:09:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:09.115 00:09:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:09.115 00:09:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:09.115 00:09:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:09.115 00:09:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:09.115 00:09:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:09.115 00:09:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:09.115 00:09:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:09.115 "name": "Existed_Raid", 00:15:09.115 "uuid": "d0a7b830-07a5-411f-85cb-7cb14b61e046", 00:15:09.115 "strip_size_kb": 64, 
00:15:09.115 "state": "configuring", 00:15:09.115 "raid_level": "concat", 00:15:09.115 "superblock": true, 00:15:09.115 "num_base_bdevs": 3, 00:15:09.115 "num_base_bdevs_discovered": 1, 00:15:09.115 "num_base_bdevs_operational": 3, 00:15:09.115 "base_bdevs_list": [ 00:15:09.115 { 00:15:09.115 "name": "BaseBdev1", 00:15:09.115 "uuid": "90b64f75-41a3-40f2-a51d-5c987896c1d2", 00:15:09.115 "is_configured": true, 00:15:09.115 "data_offset": 2048, 00:15:09.115 "data_size": 63488 00:15:09.115 }, 00:15:09.115 { 00:15:09.115 "name": "BaseBdev2", 00:15:09.115 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:09.115 "is_configured": false, 00:15:09.115 "data_offset": 0, 00:15:09.115 "data_size": 0 00:15:09.115 }, 00:15:09.115 { 00:15:09.115 "name": "BaseBdev3", 00:15:09.115 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:09.115 "is_configured": false, 00:15:09.115 "data_offset": 0, 00:15:09.115 "data_size": 0 00:15:09.115 } 00:15:09.115 ] 00:15:09.115 }' 00:15:09.115 00:09:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:09.115 00:09:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:10.052 00:09:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:10.052 [2024-07-16 00:09:56.858343] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:10.052 [2024-07-16 00:09:56.858381] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e09310 name Existed_Raid, state configuring 00:15:10.052 00:09:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:10.312 [2024-07-16 00:09:57.099029] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:10.312 [2024-07-16 00:09:57.100466] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:10.312 [2024-07-16 00:09:57.100500] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:10.312 [2024-07-16 00:09:57.100510] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:10.312 [2024-07-16 00:09:57.100522] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:10.312 00:09:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:10.312 00:09:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:10.312 00:09:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:10.312 00:09:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:10.312 00:09:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:10.312 00:09:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:10.312 00:09:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:10.312 00:09:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:10.312 00:09:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:10.312 00:09:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:10.312 00:09:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:10.312 00:09:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 
-- # local tmp 00:15:10.312 00:09:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.312 00:09:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:10.572 00:09:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:10.572 "name": "Existed_Raid", 00:15:10.572 "uuid": "bc5f9353-6f10-493e-9d77-a9b83d44449b", 00:15:10.572 "strip_size_kb": 64, 00:15:10.572 "state": "configuring", 00:15:10.572 "raid_level": "concat", 00:15:10.572 "superblock": true, 00:15:10.572 "num_base_bdevs": 3, 00:15:10.572 "num_base_bdevs_discovered": 1, 00:15:10.572 "num_base_bdevs_operational": 3, 00:15:10.572 "base_bdevs_list": [ 00:15:10.572 { 00:15:10.572 "name": "BaseBdev1", 00:15:10.572 "uuid": "90b64f75-41a3-40f2-a51d-5c987896c1d2", 00:15:10.572 "is_configured": true, 00:15:10.572 "data_offset": 2048, 00:15:10.572 "data_size": 63488 00:15:10.572 }, 00:15:10.572 { 00:15:10.572 "name": "BaseBdev2", 00:15:10.572 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:10.572 "is_configured": false, 00:15:10.572 "data_offset": 0, 00:15:10.572 "data_size": 0 00:15:10.572 }, 00:15:10.572 { 00:15:10.572 "name": "BaseBdev3", 00:15:10.572 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:10.572 "is_configured": false, 00:15:10.572 "data_offset": 0, 00:15:10.572 "data_size": 0 00:15:10.572 } 00:15:10.572 ] 00:15:10.572 }' 00:15:10.572 00:09:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:10.572 00:09:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:11.140 00:09:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 
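The `BaseBdev1`/`BaseBdev2`/`BaseBdev3` names that keep appearing in the trace come from the naming loop recorded earlier at `bdev_raid.sh@224-226`. A minimal standalone sketch of that loop (names taken from the log; this runs by itself and needs no SPDK target or RPC socket):

```shell
#!/usr/bin/env bash
# Sketch of the base-bdev name construction seen at bdev_raid.sh@224-226:
# iterate from 1 to num_base_bdevs and collect "BaseBdevN" names.
num_base_bdevs=3
base_bdevs=()
for ((i = 1; i <= num_base_bdevs; i++)); do
    base_bdevs+=("BaseBdev$i")
done
# The joined list is what gets passed to: bdev_raid_create -b '...'
echo "${base_bdevs[@]}"
```

The resulting space-joined list is exactly the `-b 'BaseBdev1 BaseBdev2 BaseBdev3'` argument seen in the `bdev_raid_create` calls above.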
00:15:11.399 [2024-07-16 00:09:58.177324] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:11.399 BaseBdev2 00:15:11.399 00:09:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:11.399 00:09:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:11.399 00:09:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:11.399 00:09:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:11.399 00:09:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:11.399 00:09:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:11.400 00:09:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:11.658 00:09:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:11.917 [ 00:15:11.917 { 00:15:11.917 "name": "BaseBdev2", 00:15:11.917 "aliases": [ 00:15:11.917 "b16de74b-4314-465b-a099-73ebac0adc37" 00:15:11.917 ], 00:15:11.917 "product_name": "Malloc disk", 00:15:11.917 "block_size": 512, 00:15:11.917 "num_blocks": 65536, 00:15:11.917 "uuid": "b16de74b-4314-465b-a099-73ebac0adc37", 00:15:11.917 "assigned_rate_limits": { 00:15:11.917 "rw_ios_per_sec": 0, 00:15:11.917 "rw_mbytes_per_sec": 0, 00:15:11.917 "r_mbytes_per_sec": 0, 00:15:11.917 "w_mbytes_per_sec": 0 00:15:11.917 }, 00:15:11.917 "claimed": true, 00:15:11.917 "claim_type": "exclusive_write", 00:15:11.917 "zoned": false, 00:15:11.917 "supported_io_types": { 00:15:11.917 "read": true, 00:15:11.917 "write": true, 
00:15:11.917 "unmap": true, 00:15:11.917 "flush": true, 00:15:11.917 "reset": true, 00:15:11.917 "nvme_admin": false, 00:15:11.917 "nvme_io": false, 00:15:11.917 "nvme_io_md": false, 00:15:11.917 "write_zeroes": true, 00:15:11.917 "zcopy": true, 00:15:11.917 "get_zone_info": false, 00:15:11.917 "zone_management": false, 00:15:11.917 "zone_append": false, 00:15:11.917 "compare": false, 00:15:11.917 "compare_and_write": false, 00:15:11.918 "abort": true, 00:15:11.918 "seek_hole": false, 00:15:11.918 "seek_data": false, 00:15:11.918 "copy": true, 00:15:11.918 "nvme_iov_md": false 00:15:11.918 }, 00:15:11.918 "memory_domains": [ 00:15:11.918 { 00:15:11.918 "dma_device_id": "system", 00:15:11.918 "dma_device_type": 1 00:15:11.918 }, 00:15:11.918 { 00:15:11.918 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.918 "dma_device_type": 2 00:15:11.918 } 00:15:11.918 ], 00:15:11.918 "driver_specific": {} 00:15:11.918 } 00:15:11.918 ] 00:15:11.918 00:09:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:11.918 00:09:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:11.918 00:09:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:11.918 00:09:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:11.918 00:09:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:11.918 00:09:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:11.918 00:09:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:11.918 00:09:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:11.918 00:09:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:15:11.918 00:09:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:11.918 00:09:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:11.918 00:09:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:11.918 00:09:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:11.918 00:09:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.918 00:09:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:12.177 00:09:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:12.177 "name": "Existed_Raid", 00:15:12.177 "uuid": "bc5f9353-6f10-493e-9d77-a9b83d44449b", 00:15:12.177 "strip_size_kb": 64, 00:15:12.177 "state": "configuring", 00:15:12.177 "raid_level": "concat", 00:15:12.177 "superblock": true, 00:15:12.177 "num_base_bdevs": 3, 00:15:12.177 "num_base_bdevs_discovered": 2, 00:15:12.177 "num_base_bdevs_operational": 3, 00:15:12.177 "base_bdevs_list": [ 00:15:12.177 { 00:15:12.177 "name": "BaseBdev1", 00:15:12.178 "uuid": "90b64f75-41a3-40f2-a51d-5c987896c1d2", 00:15:12.178 "is_configured": true, 00:15:12.178 "data_offset": 2048, 00:15:12.178 "data_size": 63488 00:15:12.178 }, 00:15:12.178 { 00:15:12.178 "name": "BaseBdev2", 00:15:12.178 "uuid": "b16de74b-4314-465b-a099-73ebac0adc37", 00:15:12.178 "is_configured": true, 00:15:12.178 "data_offset": 2048, 00:15:12.178 "data_size": 63488 00:15:12.178 }, 00:15:12.178 { 00:15:12.178 "name": "BaseBdev3", 00:15:12.178 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:12.178 "is_configured": false, 00:15:12.178 "data_offset": 0, 00:15:12.178 "data_size": 0 00:15:12.178 } 
00:15:12.178 ] 00:15:12.178 }' 00:15:12.178 00:09:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:12.178 00:09:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:12.816 00:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:12.816 [2024-07-16 00:09:59.761034] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:12.816 [2024-07-16 00:09:59.761194] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e0a400 00:15:12.816 [2024-07-16 00:09:59.761207] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:12.816 [2024-07-16 00:09:59.761374] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e09ef0 00:15:12.816 [2024-07-16 00:09:59.761486] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e0a400 00:15:12.816 [2024-07-16 00:09:59.761496] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1e0a400 00:15:12.816 [2024-07-16 00:09:59.761584] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:12.816 BaseBdev3 00:15:13.075 00:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:13.075 00:09:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:13.075 00:09:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:13.075 00:09:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:13.075 00:09:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:13.075 00:09:59 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:13.075 00:09:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:13.334 00:10:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:13.334 [ 00:15:13.334 { 00:15:13.334 "name": "BaseBdev3", 00:15:13.334 "aliases": [ 00:15:13.334 "0043ba5f-c461-4f65-b80f-ed182253a026" 00:15:13.334 ], 00:15:13.334 "product_name": "Malloc disk", 00:15:13.334 "block_size": 512, 00:15:13.334 "num_blocks": 65536, 00:15:13.334 "uuid": "0043ba5f-c461-4f65-b80f-ed182253a026", 00:15:13.334 "assigned_rate_limits": { 00:15:13.334 "rw_ios_per_sec": 0, 00:15:13.334 "rw_mbytes_per_sec": 0, 00:15:13.334 "r_mbytes_per_sec": 0, 00:15:13.334 "w_mbytes_per_sec": 0 00:15:13.334 }, 00:15:13.334 "claimed": true, 00:15:13.334 "claim_type": "exclusive_write", 00:15:13.334 "zoned": false, 00:15:13.334 "supported_io_types": { 00:15:13.334 "read": true, 00:15:13.334 "write": true, 00:15:13.334 "unmap": true, 00:15:13.334 "flush": true, 00:15:13.334 "reset": true, 00:15:13.334 "nvme_admin": false, 00:15:13.335 "nvme_io": false, 00:15:13.335 "nvme_io_md": false, 00:15:13.335 "write_zeroes": true, 00:15:13.335 "zcopy": true, 00:15:13.335 "get_zone_info": false, 00:15:13.335 "zone_management": false, 00:15:13.335 "zone_append": false, 00:15:13.335 "compare": false, 00:15:13.335 "compare_and_write": false, 00:15:13.335 "abort": true, 00:15:13.335 "seek_hole": false, 00:15:13.335 "seek_data": false, 00:15:13.335 "copy": true, 00:15:13.335 "nvme_iov_md": false 00:15:13.335 }, 00:15:13.335 "memory_domains": [ 00:15:13.335 { 00:15:13.335 "dma_device_id": "system", 00:15:13.335 "dma_device_type": 1 00:15:13.335 }, 00:15:13.335 { 00:15:13.335 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:15:13.335 "dma_device_type": 2 00:15:13.335 } 00:15:13.335 ], 00:15:13.335 "driver_specific": {} 00:15:13.335 } 00:15:13.335 ] 00:15:13.593 00:10:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:13.593 00:10:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:13.593 00:10:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:13.593 00:10:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:13.593 00:10:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:13.593 00:10:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:13.593 00:10:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:13.593 00:10:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:13.593 00:10:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:13.593 00:10:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:13.593 00:10:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:13.593 00:10:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:13.593 00:10:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:13.593 00:10:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.593 00:10:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
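The repeated `verify_raid_bdev_state` checks (`bdev_raid.sh@116-126`) boil down to fetching the raid bdev's JSON and comparing a few fields against expected values. A self-contained sketch of that check, using JSON abridged from the log output (a real run would obtain it via `rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all`; the inline string here is only a stand-in):

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the verify_raid_bdev_state field checks.
# raid_bdev_info is abridged from the trace; jq filters mirror the ones
# at bdev_raid.sh@126 (select by name, then read individual fields).
raid_bdev_info='{"name":"Existed_Raid","state":"online","raid_level":"concat","strip_size_kb":64,"num_base_bdevs":3,"num_base_bdevs_discovered":3,"num_base_bdevs_operational":3}'
state=$(echo "$raid_bdev_info" | jq -r '.state')
level=$(echo "$raid_bdev_info" | jq -r '.raid_level')
discovered=$(echo "$raid_bdev_info" | jq -r '.num_base_bdevs_discovered')
operational=$(echo "$raid_bdev_info" | jq -r '.num_base_bdevs_operational')
# Once all three base bdevs are claimed, discovered == operational and
# the array transitions from "configuring" to "online".
[ "$state" = online ] && [ "$level" = concat ] && \
  [ "$discovered" -eq "$operational" ]
```

This mirrors the progression visible in the trace: `num_base_bdevs_discovered` climbs from 0 to 3 as each `BaseBdevN` malloc disk is created and claimed, and only then does `state` read `online`.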
00:15:13.593 00:10:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:13.593 "name": "Existed_Raid", 00:15:13.593 "uuid": "bc5f9353-6f10-493e-9d77-a9b83d44449b", 00:15:13.593 "strip_size_kb": 64, 00:15:13.593 "state": "online", 00:15:13.593 "raid_level": "concat", 00:15:13.593 "superblock": true, 00:15:13.593 "num_base_bdevs": 3, 00:15:13.593 "num_base_bdevs_discovered": 3, 00:15:13.593 "num_base_bdevs_operational": 3, 00:15:13.593 "base_bdevs_list": [ 00:15:13.593 { 00:15:13.593 "name": "BaseBdev1", 00:15:13.593 "uuid": "90b64f75-41a3-40f2-a51d-5c987896c1d2", 00:15:13.593 "is_configured": true, 00:15:13.593 "data_offset": 2048, 00:15:13.593 "data_size": 63488 00:15:13.593 }, 00:15:13.593 { 00:15:13.593 "name": "BaseBdev2", 00:15:13.593 "uuid": "b16de74b-4314-465b-a099-73ebac0adc37", 00:15:13.593 "is_configured": true, 00:15:13.593 "data_offset": 2048, 00:15:13.593 "data_size": 63488 00:15:13.593 }, 00:15:13.593 { 00:15:13.593 "name": "BaseBdev3", 00:15:13.593 "uuid": "0043ba5f-c461-4f65-b80f-ed182253a026", 00:15:13.593 "is_configured": true, 00:15:13.593 "data_offset": 2048, 00:15:13.593 "data_size": 63488 00:15:13.593 } 00:15:13.593 ] 00:15:13.593 }' 00:15:13.593 00:10:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:13.593 00:10:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:14.527 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:14.527 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:14.527 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:14.527 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:14.527 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:15:14.527 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:14.527 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:14.527 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:14.527 [2024-07-16 00:10:01.297401] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:14.527 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:14.527 "name": "Existed_Raid", 00:15:14.527 "aliases": [ 00:15:14.527 "bc5f9353-6f10-493e-9d77-a9b83d44449b" 00:15:14.527 ], 00:15:14.527 "product_name": "Raid Volume", 00:15:14.527 "block_size": 512, 00:15:14.527 "num_blocks": 190464, 00:15:14.527 "uuid": "bc5f9353-6f10-493e-9d77-a9b83d44449b", 00:15:14.527 "assigned_rate_limits": { 00:15:14.527 "rw_ios_per_sec": 0, 00:15:14.527 "rw_mbytes_per_sec": 0, 00:15:14.527 "r_mbytes_per_sec": 0, 00:15:14.527 "w_mbytes_per_sec": 0 00:15:14.527 }, 00:15:14.527 "claimed": false, 00:15:14.527 "zoned": false, 00:15:14.527 "supported_io_types": { 00:15:14.527 "read": true, 00:15:14.527 "write": true, 00:15:14.527 "unmap": true, 00:15:14.527 "flush": true, 00:15:14.527 "reset": true, 00:15:14.527 "nvme_admin": false, 00:15:14.527 "nvme_io": false, 00:15:14.527 "nvme_io_md": false, 00:15:14.527 "write_zeroes": true, 00:15:14.527 "zcopy": false, 00:15:14.527 "get_zone_info": false, 00:15:14.527 "zone_management": false, 00:15:14.527 "zone_append": false, 00:15:14.527 "compare": false, 00:15:14.527 "compare_and_write": false, 00:15:14.527 "abort": false, 00:15:14.527 "seek_hole": false, 00:15:14.527 "seek_data": false, 00:15:14.527 "copy": false, 00:15:14.527 "nvme_iov_md": false 00:15:14.527 }, 00:15:14.527 "memory_domains": [ 00:15:14.527 { 00:15:14.527 "dma_device_id": "system", 
00:15:14.527 "dma_device_type": 1 00:15:14.527 }, 00:15:14.527 { 00:15:14.527 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.527 "dma_device_type": 2 00:15:14.527 }, 00:15:14.527 { 00:15:14.527 "dma_device_id": "system", 00:15:14.527 "dma_device_type": 1 00:15:14.527 }, 00:15:14.527 { 00:15:14.527 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.527 "dma_device_type": 2 00:15:14.527 }, 00:15:14.527 { 00:15:14.527 "dma_device_id": "system", 00:15:14.527 "dma_device_type": 1 00:15:14.527 }, 00:15:14.527 { 00:15:14.527 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.527 "dma_device_type": 2 00:15:14.527 } 00:15:14.527 ], 00:15:14.527 "driver_specific": { 00:15:14.527 "raid": { 00:15:14.527 "uuid": "bc5f9353-6f10-493e-9d77-a9b83d44449b", 00:15:14.527 "strip_size_kb": 64, 00:15:14.527 "state": "online", 00:15:14.527 "raid_level": "concat", 00:15:14.527 "superblock": true, 00:15:14.527 "num_base_bdevs": 3, 00:15:14.527 "num_base_bdevs_discovered": 3, 00:15:14.527 "num_base_bdevs_operational": 3, 00:15:14.527 "base_bdevs_list": [ 00:15:14.527 { 00:15:14.527 "name": "BaseBdev1", 00:15:14.527 "uuid": "90b64f75-41a3-40f2-a51d-5c987896c1d2", 00:15:14.527 "is_configured": true, 00:15:14.527 "data_offset": 2048, 00:15:14.527 "data_size": 63488 00:15:14.527 }, 00:15:14.527 { 00:15:14.527 "name": "BaseBdev2", 00:15:14.527 "uuid": "b16de74b-4314-465b-a099-73ebac0adc37", 00:15:14.527 "is_configured": true, 00:15:14.527 "data_offset": 2048, 00:15:14.527 "data_size": 63488 00:15:14.527 }, 00:15:14.527 { 00:15:14.527 "name": "BaseBdev3", 00:15:14.527 "uuid": "0043ba5f-c461-4f65-b80f-ed182253a026", 00:15:14.527 "is_configured": true, 00:15:14.527 "data_offset": 2048, 00:15:14.527 "data_size": 63488 00:15:14.527 } 00:15:14.527 ] 00:15:14.527 } 00:15:14.527 } 00:15:14.527 }' 00:15:14.527 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:14.527 00:10:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:14.527 BaseBdev2 00:15:14.527 BaseBdev3' 00:15:14.527 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:14.527 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:14.527 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:14.784 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:14.784 "name": "BaseBdev1", 00:15:14.784 "aliases": [ 00:15:14.784 "90b64f75-41a3-40f2-a51d-5c987896c1d2" 00:15:14.784 ], 00:15:14.784 "product_name": "Malloc disk", 00:15:14.784 "block_size": 512, 00:15:14.784 "num_blocks": 65536, 00:15:14.784 "uuid": "90b64f75-41a3-40f2-a51d-5c987896c1d2", 00:15:14.784 "assigned_rate_limits": { 00:15:14.784 "rw_ios_per_sec": 0, 00:15:14.785 "rw_mbytes_per_sec": 0, 00:15:14.785 "r_mbytes_per_sec": 0, 00:15:14.785 "w_mbytes_per_sec": 0 00:15:14.785 }, 00:15:14.785 "claimed": true, 00:15:14.785 "claim_type": "exclusive_write", 00:15:14.785 "zoned": false, 00:15:14.785 "supported_io_types": { 00:15:14.785 "read": true, 00:15:14.785 "write": true, 00:15:14.785 "unmap": true, 00:15:14.785 "flush": true, 00:15:14.785 "reset": true, 00:15:14.785 "nvme_admin": false, 00:15:14.785 "nvme_io": false, 00:15:14.785 "nvme_io_md": false, 00:15:14.785 "write_zeroes": true, 00:15:14.785 "zcopy": true, 00:15:14.785 "get_zone_info": false, 00:15:14.785 "zone_management": false, 00:15:14.785 "zone_append": false, 00:15:14.785 "compare": false, 00:15:14.785 "compare_and_write": false, 00:15:14.785 "abort": true, 00:15:14.785 "seek_hole": false, 00:15:14.785 "seek_data": false, 00:15:14.785 "copy": true, 00:15:14.785 "nvme_iov_md": false 00:15:14.785 }, 00:15:14.785 "memory_domains": 
[ 00:15:14.785 { 00:15:14.785 "dma_device_id": "system", 00:15:14.785 "dma_device_type": 1 00:15:14.785 }, 00:15:14.785 { 00:15:14.785 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.785 "dma_device_type": 2 00:15:14.785 } 00:15:14.785 ], 00:15:14.785 "driver_specific": {} 00:15:14.785 }' 00:15:14.785 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:14.785 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:14.785 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:14.785 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:15.042 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:15.042 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:15.042 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:15.042 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:15.042 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:15.042 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:15.042 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:15.042 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:15.042 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:15.042 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:15.042 00:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:15:15.301 00:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:15.301 "name": "BaseBdev2", 00:15:15.301 "aliases": [ 00:15:15.301 "b16de74b-4314-465b-a099-73ebac0adc37" 00:15:15.301 ], 00:15:15.301 "product_name": "Malloc disk", 00:15:15.301 "block_size": 512, 00:15:15.301 "num_blocks": 65536, 00:15:15.301 "uuid": "b16de74b-4314-465b-a099-73ebac0adc37", 00:15:15.301 "assigned_rate_limits": { 00:15:15.301 "rw_ios_per_sec": 0, 00:15:15.301 "rw_mbytes_per_sec": 0, 00:15:15.301 "r_mbytes_per_sec": 0, 00:15:15.301 "w_mbytes_per_sec": 0 00:15:15.301 }, 00:15:15.301 "claimed": true, 00:15:15.301 "claim_type": "exclusive_write", 00:15:15.301 "zoned": false, 00:15:15.301 "supported_io_types": { 00:15:15.301 "read": true, 00:15:15.301 "write": true, 00:15:15.301 "unmap": true, 00:15:15.301 "flush": true, 00:15:15.301 "reset": true, 00:15:15.301 "nvme_admin": false, 00:15:15.301 "nvme_io": false, 00:15:15.301 "nvme_io_md": false, 00:15:15.301 "write_zeroes": true, 00:15:15.301 "zcopy": true, 00:15:15.301 "get_zone_info": false, 00:15:15.301 "zone_management": false, 00:15:15.301 "zone_append": false, 00:15:15.301 "compare": false, 00:15:15.301 "compare_and_write": false, 00:15:15.301 "abort": true, 00:15:15.301 "seek_hole": false, 00:15:15.301 "seek_data": false, 00:15:15.301 "copy": true, 00:15:15.301 "nvme_iov_md": false 00:15:15.301 }, 00:15:15.301 "memory_domains": [ 00:15:15.301 { 00:15:15.301 "dma_device_id": "system", 00:15:15.301 "dma_device_type": 1 00:15:15.301 }, 00:15:15.301 { 00:15:15.301 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.301 "dma_device_type": 2 00:15:15.301 } 00:15:15.301 ], 00:15:15.301 "driver_specific": {} 00:15:15.301 }' 00:15:15.301 00:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:15.301 00:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:15.558 00:10:02 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:15.558 00:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:15.558 00:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:15.558 00:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:15.558 00:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:15.816 00:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:15.816 00:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:15.816 00:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:15.816 00:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:15.816 00:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:15.816 00:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:15.816 00:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:15.816 00:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:16.382 00:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:16.382 "name": "BaseBdev3", 00:15:16.382 "aliases": [ 00:15:16.382 "0043ba5f-c461-4f65-b80f-ed182253a026" 00:15:16.382 ], 00:15:16.382 "product_name": "Malloc disk", 00:15:16.382 "block_size": 512, 00:15:16.382 "num_blocks": 65536, 00:15:16.382 "uuid": "0043ba5f-c461-4f65-b80f-ed182253a026", 00:15:16.382 "assigned_rate_limits": { 00:15:16.382 "rw_ios_per_sec": 0, 00:15:16.382 "rw_mbytes_per_sec": 0, 00:15:16.382 "r_mbytes_per_sec": 0, 00:15:16.382 
"w_mbytes_per_sec": 0 00:15:16.382 }, 00:15:16.382 "claimed": true, 00:15:16.382 "claim_type": "exclusive_write", 00:15:16.382 "zoned": false, 00:15:16.382 "supported_io_types": { 00:15:16.382 "read": true, 00:15:16.382 "write": true, 00:15:16.382 "unmap": true, 00:15:16.382 "flush": true, 00:15:16.382 "reset": true, 00:15:16.382 "nvme_admin": false, 00:15:16.382 "nvme_io": false, 00:15:16.382 "nvme_io_md": false, 00:15:16.382 "write_zeroes": true, 00:15:16.382 "zcopy": true, 00:15:16.382 "get_zone_info": false, 00:15:16.382 "zone_management": false, 00:15:16.382 "zone_append": false, 00:15:16.382 "compare": false, 00:15:16.382 "compare_and_write": false, 00:15:16.382 "abort": true, 00:15:16.382 "seek_hole": false, 00:15:16.382 "seek_data": false, 00:15:16.382 "copy": true, 00:15:16.382 "nvme_iov_md": false 00:15:16.382 }, 00:15:16.382 "memory_domains": [ 00:15:16.382 { 00:15:16.382 "dma_device_id": "system", 00:15:16.382 "dma_device_type": 1 00:15:16.382 }, 00:15:16.382 { 00:15:16.382 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:16.382 "dma_device_type": 2 00:15:16.382 } 00:15:16.382 ], 00:15:16.382 "driver_specific": {} 00:15:16.382 }' 00:15:16.382 00:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:16.382 00:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:16.639 00:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:16.639 00:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:16.639 00:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:16.639 00:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:16.639 00:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:16.898 00:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:15:16.898 00:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:16.898 00:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:16.898 00:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:16.898 00:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:16.898 00:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:17.157 [2024-07-16 00:10:04.052494] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:17.157 [2024-07-16 00:10:04.052522] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:17.157 [2024-07-16 00:10:04.052562] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:17.157 00:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:17.157 00:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:17.157 00:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:17.157 00:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:15:17.157 00:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:17.157 00:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:15:17.157 00:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:17.157 00:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:17.157 00:10:04 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:17.157 00:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:17.157 00:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:17.157 00:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:17.157 00:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:17.157 00:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:17.157 00:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:17.157 00:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.157 00:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:17.722 00:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:17.722 "name": "Existed_Raid", 00:15:17.722 "uuid": "bc5f9353-6f10-493e-9d77-a9b83d44449b", 00:15:17.722 "strip_size_kb": 64, 00:15:17.722 "state": "offline", 00:15:17.722 "raid_level": "concat", 00:15:17.722 "superblock": true, 00:15:17.722 "num_base_bdevs": 3, 00:15:17.722 "num_base_bdevs_discovered": 2, 00:15:17.722 "num_base_bdevs_operational": 2, 00:15:17.722 "base_bdevs_list": [ 00:15:17.722 { 00:15:17.722 "name": null, 00:15:17.722 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:17.722 "is_configured": false, 00:15:17.722 "data_offset": 2048, 00:15:17.722 "data_size": 63488 00:15:17.722 }, 00:15:17.722 { 00:15:17.722 "name": "BaseBdev2", 00:15:17.722 "uuid": "b16de74b-4314-465b-a099-73ebac0adc37", 00:15:17.722 "is_configured": true, 00:15:17.722 "data_offset": 2048, 00:15:17.722 "data_size": 
63488 00:15:17.722 }, 00:15:17.722 { 00:15:17.722 "name": "BaseBdev3", 00:15:17.722 "uuid": "0043ba5f-c461-4f65-b80f-ed182253a026", 00:15:17.722 "is_configured": true, 00:15:17.722 "data_offset": 2048, 00:15:17.722 "data_size": 63488 00:15:17.722 } 00:15:17.722 ] 00:15:17.722 }' 00:15:17.722 00:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:17.722 00:10:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:18.288 00:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:18.288 00:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:18.288 00:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.288 00:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:18.546 00:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:18.546 00:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:18.546 00:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:18.804 [2024-07-16 00:10:05.614549] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:18.804 00:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:18.804 00:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:18.804 00:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:15:18.804 00:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:19.062 00:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:19.062 00:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:19.062 00:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:19.320 [2024-07-16 00:10:06.116259] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:19.320 [2024-07-16 00:10:06.116297] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e0a400 name Existed_Raid, state offline 00:15:19.320 00:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:19.320 00:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:19.320 00:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:19.320 00:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:19.578 00:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:19.578 00:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:19.578 00:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:19.578 00:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:19.578 00:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:19.578 00:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:19.836 BaseBdev2 00:15:19.836 00:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:19.837 00:10:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:19.837 00:10:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:19.837 00:10:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:19.837 00:10:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:19.837 00:10:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:19.837 00:10:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:20.095 00:10:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:20.353 [ 00:15:20.353 { 00:15:20.353 "name": "BaseBdev2", 00:15:20.353 "aliases": [ 00:15:20.353 "60b43d09-21e8-4c79-af62-8743a57241ba" 00:15:20.353 ], 00:15:20.353 "product_name": "Malloc disk", 00:15:20.353 "block_size": 512, 00:15:20.353 "num_blocks": 65536, 00:15:20.353 "uuid": "60b43d09-21e8-4c79-af62-8743a57241ba", 00:15:20.353 "assigned_rate_limits": { 00:15:20.353 "rw_ios_per_sec": 0, 00:15:20.353 "rw_mbytes_per_sec": 0, 00:15:20.353 "r_mbytes_per_sec": 0, 00:15:20.353 "w_mbytes_per_sec": 0 00:15:20.353 }, 00:15:20.353 "claimed": false, 00:15:20.353 "zoned": false, 00:15:20.353 "supported_io_types": { 00:15:20.353 "read": true, 00:15:20.353 "write": true, 00:15:20.353 "unmap": true, 00:15:20.353 "flush": 
true, 00:15:20.353 "reset": true, 00:15:20.353 "nvme_admin": false, 00:15:20.353 "nvme_io": false, 00:15:20.353 "nvme_io_md": false, 00:15:20.353 "write_zeroes": true, 00:15:20.353 "zcopy": true, 00:15:20.353 "get_zone_info": false, 00:15:20.353 "zone_management": false, 00:15:20.353 "zone_append": false, 00:15:20.353 "compare": false, 00:15:20.353 "compare_and_write": false, 00:15:20.353 "abort": true, 00:15:20.353 "seek_hole": false, 00:15:20.353 "seek_data": false, 00:15:20.353 "copy": true, 00:15:20.353 "nvme_iov_md": false 00:15:20.353 }, 00:15:20.353 "memory_domains": [ 00:15:20.353 { 00:15:20.353 "dma_device_id": "system", 00:15:20.353 "dma_device_type": 1 00:15:20.353 }, 00:15:20.353 { 00:15:20.353 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.353 "dma_device_type": 2 00:15:20.353 } 00:15:20.353 ], 00:15:20.353 "driver_specific": {} 00:15:20.353 } 00:15:20.353 ] 00:15:20.353 00:10:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:20.353 00:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:20.353 00:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:20.353 00:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:20.611 BaseBdev3 00:15:20.611 00:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:20.611 00:10:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:20.611 00:10:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:20.611 00:10:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:20.611 00:10:07 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:20.611 00:10:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:20.611 00:10:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:20.870 00:10:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:21.129 [ 00:15:21.129 { 00:15:21.129 "name": "BaseBdev3", 00:15:21.129 "aliases": [ 00:15:21.129 "0bad9651-f36c-4099-874f-3be2968dc3a2" 00:15:21.129 ], 00:15:21.129 "product_name": "Malloc disk", 00:15:21.129 "block_size": 512, 00:15:21.129 "num_blocks": 65536, 00:15:21.129 "uuid": "0bad9651-f36c-4099-874f-3be2968dc3a2", 00:15:21.129 "assigned_rate_limits": { 00:15:21.129 "rw_ios_per_sec": 0, 00:15:21.129 "rw_mbytes_per_sec": 0, 00:15:21.129 "r_mbytes_per_sec": 0, 00:15:21.129 "w_mbytes_per_sec": 0 00:15:21.129 }, 00:15:21.129 "claimed": false, 00:15:21.129 "zoned": false, 00:15:21.129 "supported_io_types": { 00:15:21.129 "read": true, 00:15:21.129 "write": true, 00:15:21.129 "unmap": true, 00:15:21.129 "flush": true, 00:15:21.129 "reset": true, 00:15:21.129 "nvme_admin": false, 00:15:21.129 "nvme_io": false, 00:15:21.129 "nvme_io_md": false, 00:15:21.129 "write_zeroes": true, 00:15:21.129 "zcopy": true, 00:15:21.129 "get_zone_info": false, 00:15:21.129 "zone_management": false, 00:15:21.129 "zone_append": false, 00:15:21.129 "compare": false, 00:15:21.129 "compare_and_write": false, 00:15:21.129 "abort": true, 00:15:21.129 "seek_hole": false, 00:15:21.129 "seek_data": false, 00:15:21.129 "copy": true, 00:15:21.129 "nvme_iov_md": false 00:15:21.129 }, 00:15:21.129 "memory_domains": [ 00:15:21.129 { 00:15:21.129 "dma_device_id": "system", 00:15:21.129 "dma_device_type": 1 
00:15:21.129 }, 00:15:21.129 { 00:15:21.129 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:21.129 "dma_device_type": 2 00:15:21.129 } 00:15:21.129 ], 00:15:21.129 "driver_specific": {} 00:15:21.129 } 00:15:21.129 ] 00:15:21.129 00:10:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:21.129 00:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:21.129 00:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:21.129 00:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:21.388 [2024-07-16 00:10:08.100868] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:21.388 [2024-07-16 00:10:08.100911] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:21.388 [2024-07-16 00:10:08.100940] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:21.388 [2024-07-16 00:10:08.102315] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:21.388 00:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:21.388 00:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:21.388 00:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:21.388 00:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:21.388 00:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:21.388 00:10:08 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:21.388 00:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:21.388 00:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:21.388 00:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:21.388 00:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:21.388 00:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.388 00:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:21.646 00:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:21.646 "name": "Existed_Raid", 00:15:21.646 "uuid": "41a90cc4-1ee7-4ab6-92a0-448d1db78231", 00:15:21.646 "strip_size_kb": 64, 00:15:21.646 "state": "configuring", 00:15:21.646 "raid_level": "concat", 00:15:21.646 "superblock": true, 00:15:21.646 "num_base_bdevs": 3, 00:15:21.646 "num_base_bdevs_discovered": 2, 00:15:21.646 "num_base_bdevs_operational": 3, 00:15:21.646 "base_bdevs_list": [ 00:15:21.646 { 00:15:21.646 "name": "BaseBdev1", 00:15:21.646 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:21.646 "is_configured": false, 00:15:21.646 "data_offset": 0, 00:15:21.646 "data_size": 0 00:15:21.646 }, 00:15:21.646 { 00:15:21.646 "name": "BaseBdev2", 00:15:21.646 "uuid": "60b43d09-21e8-4c79-af62-8743a57241ba", 00:15:21.646 "is_configured": true, 00:15:21.646 "data_offset": 2048, 00:15:21.646 "data_size": 63488 00:15:21.646 }, 00:15:21.646 { 00:15:21.646 "name": "BaseBdev3", 00:15:21.646 "uuid": "0bad9651-f36c-4099-874f-3be2968dc3a2", 00:15:21.646 "is_configured": true, 00:15:21.646 "data_offset": 2048, 00:15:21.646 
"data_size": 63488 00:15:21.646 } 00:15:21.646 ] 00:15:21.646 }' 00:15:21.646 00:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:21.646 00:10:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:22.213 00:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:22.471 [2024-07-16 00:10:09.187734] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:22.471 00:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:22.471 00:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:22.471 00:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:22.471 00:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:22.471 00:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:22.471 00:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:22.471 00:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:22.471 00:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:22.471 00:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:22.471 00:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:22.471 00:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:15:22.471 00:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:22.730 00:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:22.730 "name": "Existed_Raid", 00:15:22.730 "uuid": "41a90cc4-1ee7-4ab6-92a0-448d1db78231", 00:15:22.730 "strip_size_kb": 64, 00:15:22.730 "state": "configuring", 00:15:22.730 "raid_level": "concat", 00:15:22.730 "superblock": true, 00:15:22.730 "num_base_bdevs": 3, 00:15:22.730 "num_base_bdevs_discovered": 1, 00:15:22.730 "num_base_bdevs_operational": 3, 00:15:22.730 "base_bdevs_list": [ 00:15:22.730 { 00:15:22.730 "name": "BaseBdev1", 00:15:22.730 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:22.730 "is_configured": false, 00:15:22.730 "data_offset": 0, 00:15:22.730 "data_size": 0 00:15:22.730 }, 00:15:22.730 { 00:15:22.730 "name": null, 00:15:22.730 "uuid": "60b43d09-21e8-4c79-af62-8743a57241ba", 00:15:22.730 "is_configured": false, 00:15:22.730 "data_offset": 2048, 00:15:22.730 "data_size": 63488 00:15:22.730 }, 00:15:22.730 { 00:15:22.730 "name": "BaseBdev3", 00:15:22.730 "uuid": "0bad9651-f36c-4099-874f-3be2968dc3a2", 00:15:22.730 "is_configured": true, 00:15:22.730 "data_offset": 2048, 00:15:22.730 "data_size": 63488 00:15:22.730 } 00:15:22.730 ] 00:15:22.730 }' 00:15:22.730 00:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:22.730 00:10:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:23.296 00:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.296 00:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:23.555 00:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:15:23.555 00:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:23.555 [2024-07-16 00:10:10.434396] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:23.555 BaseBdev1 00:15:23.555 00:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:23.555 00:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:23.555 00:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:23.555 00:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:23.555 00:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:23.555 00:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:23.555 00:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:23.813 00:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:24.072 [ 00:15:24.072 { 00:15:24.072 "name": "BaseBdev1", 00:15:24.072 "aliases": [ 00:15:24.072 "7b6476c4-aa68-46c1-bffe-0821eb43d345" 00:15:24.072 ], 00:15:24.072 "product_name": "Malloc disk", 00:15:24.072 "block_size": 512, 00:15:24.072 "num_blocks": 65536, 00:15:24.072 "uuid": "7b6476c4-aa68-46c1-bffe-0821eb43d345", 00:15:24.072 "assigned_rate_limits": { 00:15:24.072 "rw_ios_per_sec": 0, 00:15:24.072 "rw_mbytes_per_sec": 0, 00:15:24.072 "r_mbytes_per_sec": 0, 00:15:24.072 
"w_mbytes_per_sec": 0 00:15:24.072 }, 00:15:24.072 "claimed": true, 00:15:24.072 "claim_type": "exclusive_write", 00:15:24.072 "zoned": false, 00:15:24.072 "supported_io_types": { 00:15:24.072 "read": true, 00:15:24.072 "write": true, 00:15:24.072 "unmap": true, 00:15:24.072 "flush": true, 00:15:24.072 "reset": true, 00:15:24.072 "nvme_admin": false, 00:15:24.072 "nvme_io": false, 00:15:24.072 "nvme_io_md": false, 00:15:24.072 "write_zeroes": true, 00:15:24.072 "zcopy": true, 00:15:24.072 "get_zone_info": false, 00:15:24.072 "zone_management": false, 00:15:24.072 "zone_append": false, 00:15:24.072 "compare": false, 00:15:24.072 "compare_and_write": false, 00:15:24.072 "abort": true, 00:15:24.072 "seek_hole": false, 00:15:24.072 "seek_data": false, 00:15:24.072 "copy": true, 00:15:24.072 "nvme_iov_md": false 00:15:24.072 }, 00:15:24.072 "memory_domains": [ 00:15:24.072 { 00:15:24.072 "dma_device_id": "system", 00:15:24.072 "dma_device_type": 1 00:15:24.072 }, 00:15:24.072 { 00:15:24.072 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.072 "dma_device_type": 2 00:15:24.072 } 00:15:24.072 ], 00:15:24.072 "driver_specific": {} 00:15:24.072 } 00:15:24.072 ] 00:15:24.072 00:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:24.072 00:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:24.072 00:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:24.072 00:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:24.073 00:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:24.073 00:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:24.073 00:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:15:24.073 00:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:24.073 00:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:24.073 00:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:24.073 00:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:24.073 00:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.073 00:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:24.330 00:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:24.330 "name": "Existed_Raid", 00:15:24.330 "uuid": "41a90cc4-1ee7-4ab6-92a0-448d1db78231", 00:15:24.330 "strip_size_kb": 64, 00:15:24.330 "state": "configuring", 00:15:24.330 "raid_level": "concat", 00:15:24.330 "superblock": true, 00:15:24.330 "num_base_bdevs": 3, 00:15:24.330 "num_base_bdevs_discovered": 2, 00:15:24.330 "num_base_bdevs_operational": 3, 00:15:24.330 "base_bdevs_list": [ 00:15:24.330 { 00:15:24.330 "name": "BaseBdev1", 00:15:24.330 "uuid": "7b6476c4-aa68-46c1-bffe-0821eb43d345", 00:15:24.330 "is_configured": true, 00:15:24.330 "data_offset": 2048, 00:15:24.330 "data_size": 63488 00:15:24.330 }, 00:15:24.330 { 00:15:24.330 "name": null, 00:15:24.330 "uuid": "60b43d09-21e8-4c79-af62-8743a57241ba", 00:15:24.330 "is_configured": false, 00:15:24.330 "data_offset": 2048, 00:15:24.330 "data_size": 63488 00:15:24.330 }, 00:15:24.330 { 00:15:24.330 "name": "BaseBdev3", 00:15:24.330 "uuid": "0bad9651-f36c-4099-874f-3be2968dc3a2", 00:15:24.330 "is_configured": true, 00:15:24.330 "data_offset": 2048, 00:15:24.330 "data_size": 63488 00:15:24.330 } 
00:15:24.330 ] 00:15:24.330 }' 00:15:24.330 00:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:24.330 00:10:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:24.895 00:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:24.895 00:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.153 00:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:25.153 00:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:25.719 [2024-07-16 00:10:12.532061] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:25.720 00:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:25.720 00:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:25.720 00:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:25.720 00:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:25.720 00:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:25.720 00:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:25.720 00:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:25.720 00:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:25.720 
00:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:25.720 00:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:25.720 00:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.720 00:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:25.978 00:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:25.978 "name": "Existed_Raid", 00:15:25.978 "uuid": "41a90cc4-1ee7-4ab6-92a0-448d1db78231", 00:15:25.978 "strip_size_kb": 64, 00:15:25.978 "state": "configuring", 00:15:25.978 "raid_level": "concat", 00:15:25.978 "superblock": true, 00:15:25.978 "num_base_bdevs": 3, 00:15:25.978 "num_base_bdevs_discovered": 1, 00:15:25.978 "num_base_bdevs_operational": 3, 00:15:25.978 "base_bdevs_list": [ 00:15:25.978 { 00:15:25.978 "name": "BaseBdev1", 00:15:25.978 "uuid": "7b6476c4-aa68-46c1-bffe-0821eb43d345", 00:15:25.978 "is_configured": true, 00:15:25.978 "data_offset": 2048, 00:15:25.978 "data_size": 63488 00:15:25.978 }, 00:15:25.978 { 00:15:25.978 "name": null, 00:15:25.978 "uuid": "60b43d09-21e8-4c79-af62-8743a57241ba", 00:15:25.978 "is_configured": false, 00:15:25.978 "data_offset": 2048, 00:15:25.978 "data_size": 63488 00:15:25.978 }, 00:15:25.978 { 00:15:25.978 "name": null, 00:15:25.978 "uuid": "0bad9651-f36c-4099-874f-3be2968dc3a2", 00:15:25.978 "is_configured": false, 00:15:25.978 "data_offset": 2048, 00:15:25.978 "data_size": 63488 00:15:25.978 } 00:15:25.978 ] 00:15:25.978 }' 00:15:25.978 00:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:25.978 00:10:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:26.545 00:10:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.545 00:10:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:26.804 00:10:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:26.804 00:10:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:27.372 [2024-07-16 00:10:14.148370] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:27.372 00:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:27.372 00:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:27.372 00:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:27.372 00:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:27.372 00:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:27.372 00:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:27.372 00:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:27.372 00:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:27.372 00:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:27.372 00:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:27.372 00:10:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.372 00:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:27.631 00:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:27.631 "name": "Existed_Raid", 00:15:27.631 "uuid": "41a90cc4-1ee7-4ab6-92a0-448d1db78231", 00:15:27.631 "strip_size_kb": 64, 00:15:27.631 "state": "configuring", 00:15:27.631 "raid_level": "concat", 00:15:27.631 "superblock": true, 00:15:27.631 "num_base_bdevs": 3, 00:15:27.631 "num_base_bdevs_discovered": 2, 00:15:27.631 "num_base_bdevs_operational": 3, 00:15:27.631 "base_bdevs_list": [ 00:15:27.631 { 00:15:27.631 "name": "BaseBdev1", 00:15:27.631 "uuid": "7b6476c4-aa68-46c1-bffe-0821eb43d345", 00:15:27.631 "is_configured": true, 00:15:27.631 "data_offset": 2048, 00:15:27.631 "data_size": 63488 00:15:27.631 }, 00:15:27.631 { 00:15:27.631 "name": null, 00:15:27.631 "uuid": "60b43d09-21e8-4c79-af62-8743a57241ba", 00:15:27.631 "is_configured": false, 00:15:27.631 "data_offset": 2048, 00:15:27.631 "data_size": 63488 00:15:27.631 }, 00:15:27.631 { 00:15:27.631 "name": "BaseBdev3", 00:15:27.631 "uuid": "0bad9651-f36c-4099-874f-3be2968dc3a2", 00:15:27.631 "is_configured": true, 00:15:27.631 "data_offset": 2048, 00:15:27.631 "data_size": 63488 00:15:27.631 } 00:15:27.631 ] 00:15:27.631 }' 00:15:27.631 00:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:27.631 00:10:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:28.199 00:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.199 00:10:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:28.199 00:10:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:28.199 00:10:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:28.458 [2024-07-16 00:10:15.319500] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:28.458 00:10:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:28.458 00:10:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:28.458 00:10:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:28.458 00:10:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:28.458 00:10:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:28.458 00:10:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:28.458 00:10:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:28.458 00:10:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:28.458 00:10:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:28.458 00:10:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:28.458 00:10:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.458 00:10:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:28.717 00:10:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:28.717 "name": "Existed_Raid", 00:15:28.717 "uuid": "41a90cc4-1ee7-4ab6-92a0-448d1db78231", 00:15:28.717 "strip_size_kb": 64, 00:15:28.717 "state": "configuring", 00:15:28.717 "raid_level": "concat", 00:15:28.717 "superblock": true, 00:15:28.717 "num_base_bdevs": 3, 00:15:28.717 "num_base_bdevs_discovered": 1, 00:15:28.718 "num_base_bdevs_operational": 3, 00:15:28.718 "base_bdevs_list": [ 00:15:28.718 { 00:15:28.718 "name": null, 00:15:28.718 "uuid": "7b6476c4-aa68-46c1-bffe-0821eb43d345", 00:15:28.718 "is_configured": false, 00:15:28.718 "data_offset": 2048, 00:15:28.718 "data_size": 63488 00:15:28.718 }, 00:15:28.718 { 00:15:28.718 "name": null, 00:15:28.718 "uuid": "60b43d09-21e8-4c79-af62-8743a57241ba", 00:15:28.718 "is_configured": false, 00:15:28.718 "data_offset": 2048, 00:15:28.718 "data_size": 63488 00:15:28.718 }, 00:15:28.718 { 00:15:28.718 "name": "BaseBdev3", 00:15:28.718 "uuid": "0bad9651-f36c-4099-874f-3be2968dc3a2", 00:15:28.718 "is_configured": true, 00:15:28.718 "data_offset": 2048, 00:15:28.718 "data_size": 63488 00:15:28.718 } 00:15:28.718 ] 00:15:28.718 }' 00:15:28.718 00:10:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:28.718 00:10:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:29.287 00:10:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.287 00:10:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:29.613 00:10:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:29.613 00:10:16 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:29.872 [2024-07-16 00:10:16.667525] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:29.872 00:10:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:29.872 00:10:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:29.872 00:10:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:29.872 00:10:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:29.872 00:10:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:29.872 00:10:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:29.872 00:10:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:29.872 00:10:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:29.872 00:10:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:29.872 00:10:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:29.873 00:10:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.873 00:10:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:30.132 00:10:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:30.132 "name": 
"Existed_Raid", 00:15:30.132 "uuid": "41a90cc4-1ee7-4ab6-92a0-448d1db78231", 00:15:30.132 "strip_size_kb": 64, 00:15:30.132 "state": "configuring", 00:15:30.132 "raid_level": "concat", 00:15:30.132 "superblock": true, 00:15:30.132 "num_base_bdevs": 3, 00:15:30.132 "num_base_bdevs_discovered": 2, 00:15:30.132 "num_base_bdevs_operational": 3, 00:15:30.132 "base_bdevs_list": [ 00:15:30.132 { 00:15:30.132 "name": null, 00:15:30.132 "uuid": "7b6476c4-aa68-46c1-bffe-0821eb43d345", 00:15:30.132 "is_configured": false, 00:15:30.133 "data_offset": 2048, 00:15:30.133 "data_size": 63488 00:15:30.133 }, 00:15:30.133 { 00:15:30.133 "name": "BaseBdev2", 00:15:30.133 "uuid": "60b43d09-21e8-4c79-af62-8743a57241ba", 00:15:30.133 "is_configured": true, 00:15:30.133 "data_offset": 2048, 00:15:30.133 "data_size": 63488 00:15:30.133 }, 00:15:30.133 { 00:15:30.133 "name": "BaseBdev3", 00:15:30.133 "uuid": "0bad9651-f36c-4099-874f-3be2968dc3a2", 00:15:30.133 "is_configured": true, 00:15:30.133 "data_offset": 2048, 00:15:30.133 "data_size": 63488 00:15:30.133 } 00:15:30.133 ] 00:15:30.133 }' 00:15:30.133 00:10:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:30.133 00:10:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:30.702 00:10:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.702 00:10:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:30.962 00:10:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:30.962 00:10:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.962 00:10:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:31.221 00:10:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 7b6476c4-aa68-46c1-bffe-0821eb43d345 00:15:31.480 [2024-07-16 00:10:18.226971] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:31.480 [2024-07-16 00:10:18.227114] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e08f50 00:15:31.480 [2024-07-16 00:10:18.227127] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:31.480 [2024-07-16 00:10:18.227299] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b0f940 00:15:31.480 [2024-07-16 00:10:18.227411] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e08f50 00:15:31.480 [2024-07-16 00:10:18.227421] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1e08f50 00:15:31.480 [2024-07-16 00:10:18.227510] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:31.480 NewBaseBdev 00:15:31.480 00:10:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:31.480 00:10:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:31.480 00:10:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:31.480 00:10:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:31.480 00:10:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:31.480 00:10:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:31.480 00:10:18 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:31.739 00:10:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:31.998 [ 00:15:31.998 { 00:15:31.998 "name": "NewBaseBdev", 00:15:31.998 "aliases": [ 00:15:31.998 "7b6476c4-aa68-46c1-bffe-0821eb43d345" 00:15:31.998 ], 00:15:31.998 "product_name": "Malloc disk", 00:15:31.998 "block_size": 512, 00:15:31.998 "num_blocks": 65536, 00:15:31.998 "uuid": "7b6476c4-aa68-46c1-bffe-0821eb43d345", 00:15:31.998 "assigned_rate_limits": { 00:15:31.998 "rw_ios_per_sec": 0, 00:15:31.998 "rw_mbytes_per_sec": 0, 00:15:31.998 "r_mbytes_per_sec": 0, 00:15:31.998 "w_mbytes_per_sec": 0 00:15:31.998 }, 00:15:31.998 "claimed": true, 00:15:31.998 "claim_type": "exclusive_write", 00:15:31.998 "zoned": false, 00:15:31.998 "supported_io_types": { 00:15:31.998 "read": true, 00:15:31.998 "write": true, 00:15:31.998 "unmap": true, 00:15:31.998 "flush": true, 00:15:31.998 "reset": true, 00:15:31.998 "nvme_admin": false, 00:15:31.998 "nvme_io": false, 00:15:31.998 "nvme_io_md": false, 00:15:31.998 "write_zeroes": true, 00:15:31.998 "zcopy": true, 00:15:31.998 "get_zone_info": false, 00:15:31.998 "zone_management": false, 00:15:31.998 "zone_append": false, 00:15:31.998 "compare": false, 00:15:31.998 "compare_and_write": false, 00:15:31.998 "abort": true, 00:15:31.998 "seek_hole": false, 00:15:31.998 "seek_data": false, 00:15:31.998 "copy": true, 00:15:31.998 "nvme_iov_md": false 00:15:31.998 }, 00:15:31.998 "memory_domains": [ 00:15:31.998 { 00:15:31.998 "dma_device_id": "system", 00:15:31.998 "dma_device_type": 1 00:15:31.998 }, 00:15:31.998 { 00:15:31.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.999 "dma_device_type": 2 00:15:31.999 } 
00:15:31.999 ], 00:15:31.999 "driver_specific": {} 00:15:31.999 } 00:15:31.999 ] 00:15:31.999 00:10:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:31.999 00:10:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:31.999 00:10:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:31.999 00:10:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:31.999 00:10:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:31.999 00:10:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:31.999 00:10:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:31.999 00:10:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:31.999 00:10:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:31.999 00:10:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:31.999 00:10:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:31.999 00:10:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.999 00:10:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:32.257 00:10:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:32.257 "name": "Existed_Raid", 00:15:32.257 "uuid": "41a90cc4-1ee7-4ab6-92a0-448d1db78231", 00:15:32.257 "strip_size_kb": 64, 00:15:32.257 "state": "online", 00:15:32.257 
"raid_level": "concat", 00:15:32.257 "superblock": true, 00:15:32.257 "num_base_bdevs": 3, 00:15:32.257 "num_base_bdevs_discovered": 3, 00:15:32.257 "num_base_bdevs_operational": 3, 00:15:32.257 "base_bdevs_list": [ 00:15:32.257 { 00:15:32.257 "name": "NewBaseBdev", 00:15:32.257 "uuid": "7b6476c4-aa68-46c1-bffe-0821eb43d345", 00:15:32.257 "is_configured": true, 00:15:32.257 "data_offset": 2048, 00:15:32.257 "data_size": 63488 00:15:32.257 }, 00:15:32.257 { 00:15:32.257 "name": "BaseBdev2", 00:15:32.257 "uuid": "60b43d09-21e8-4c79-af62-8743a57241ba", 00:15:32.257 "is_configured": true, 00:15:32.257 "data_offset": 2048, 00:15:32.257 "data_size": 63488 00:15:32.257 }, 00:15:32.257 { 00:15:32.257 "name": "BaseBdev3", 00:15:32.257 "uuid": "0bad9651-f36c-4099-874f-3be2968dc3a2", 00:15:32.257 "is_configured": true, 00:15:32.257 "data_offset": 2048, 00:15:32.257 "data_size": 63488 00:15:32.258 } 00:15:32.258 ] 00:15:32.258 }' 00:15:32.258 00:10:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:32.258 00:10:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:32.826 00:10:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:32.826 00:10:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:32.826 00:10:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:32.826 00:10:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:32.826 00:10:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:32.826 00:10:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:32.826 00:10:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:32.826 00:10:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:33.085 [2024-07-16 00:10:19.787405] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:33.085 00:10:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:33.085 "name": "Existed_Raid", 00:15:33.085 "aliases": [ 00:15:33.085 "41a90cc4-1ee7-4ab6-92a0-448d1db78231" 00:15:33.085 ], 00:15:33.085 "product_name": "Raid Volume", 00:15:33.085 "block_size": 512, 00:15:33.085 "num_blocks": 190464, 00:15:33.085 "uuid": "41a90cc4-1ee7-4ab6-92a0-448d1db78231", 00:15:33.085 "assigned_rate_limits": { 00:15:33.085 "rw_ios_per_sec": 0, 00:15:33.085 "rw_mbytes_per_sec": 0, 00:15:33.085 "r_mbytes_per_sec": 0, 00:15:33.085 "w_mbytes_per_sec": 0 00:15:33.085 }, 00:15:33.085 "claimed": false, 00:15:33.085 "zoned": false, 00:15:33.085 "supported_io_types": { 00:15:33.085 "read": true, 00:15:33.085 "write": true, 00:15:33.085 "unmap": true, 00:15:33.085 "flush": true, 00:15:33.085 "reset": true, 00:15:33.085 "nvme_admin": false, 00:15:33.085 "nvme_io": false, 00:15:33.085 "nvme_io_md": false, 00:15:33.085 "write_zeroes": true, 00:15:33.085 "zcopy": false, 00:15:33.085 "get_zone_info": false, 00:15:33.085 "zone_management": false, 00:15:33.085 "zone_append": false, 00:15:33.085 "compare": false, 00:15:33.085 "compare_and_write": false, 00:15:33.085 "abort": false, 00:15:33.085 "seek_hole": false, 00:15:33.085 "seek_data": false, 00:15:33.085 "copy": false, 00:15:33.085 "nvme_iov_md": false 00:15:33.085 }, 00:15:33.085 "memory_domains": [ 00:15:33.085 { 00:15:33.085 "dma_device_id": "system", 00:15:33.085 "dma_device_type": 1 00:15:33.085 }, 00:15:33.085 { 00:15:33.085 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.085 "dma_device_type": 2 00:15:33.085 }, 00:15:33.085 { 00:15:33.085 "dma_device_id": "system", 00:15:33.085 "dma_device_type": 1 00:15:33.085 
}, 00:15:33.085 { 00:15:33.085 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.085 "dma_device_type": 2 00:15:33.085 }, 00:15:33.085 { 00:15:33.085 "dma_device_id": "system", 00:15:33.085 "dma_device_type": 1 00:15:33.085 }, 00:15:33.085 { 00:15:33.085 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.085 "dma_device_type": 2 00:15:33.085 } 00:15:33.085 ], 00:15:33.085 "driver_specific": { 00:15:33.085 "raid": { 00:15:33.085 "uuid": "41a90cc4-1ee7-4ab6-92a0-448d1db78231", 00:15:33.085 "strip_size_kb": 64, 00:15:33.085 "state": "online", 00:15:33.085 "raid_level": "concat", 00:15:33.085 "superblock": true, 00:15:33.085 "num_base_bdevs": 3, 00:15:33.085 "num_base_bdevs_discovered": 3, 00:15:33.085 "num_base_bdevs_operational": 3, 00:15:33.085 "base_bdevs_list": [ 00:15:33.085 { 00:15:33.085 "name": "NewBaseBdev", 00:15:33.086 "uuid": "7b6476c4-aa68-46c1-bffe-0821eb43d345", 00:15:33.086 "is_configured": true, 00:15:33.086 "data_offset": 2048, 00:15:33.086 "data_size": 63488 00:15:33.086 }, 00:15:33.086 { 00:15:33.086 "name": "BaseBdev2", 00:15:33.086 "uuid": "60b43d09-21e8-4c79-af62-8743a57241ba", 00:15:33.086 "is_configured": true, 00:15:33.086 "data_offset": 2048, 00:15:33.086 "data_size": 63488 00:15:33.086 }, 00:15:33.086 { 00:15:33.086 "name": "BaseBdev3", 00:15:33.086 "uuid": "0bad9651-f36c-4099-874f-3be2968dc3a2", 00:15:33.086 "is_configured": true, 00:15:33.086 "data_offset": 2048, 00:15:33.086 "data_size": 63488 00:15:33.086 } 00:15:33.086 ] 00:15:33.086 } 00:15:33.086 } 00:15:33.086 }' 00:15:33.086 00:10:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:33.086 00:10:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:33.086 BaseBdev2 00:15:33.086 BaseBdev3' 00:15:33.086 00:10:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:33.086 
00:10:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:33.086 00:10:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:33.345 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:33.345 "name": "NewBaseBdev", 00:15:33.345 "aliases": [ 00:15:33.345 "7b6476c4-aa68-46c1-bffe-0821eb43d345" 00:15:33.345 ], 00:15:33.345 "product_name": "Malloc disk", 00:15:33.345 "block_size": 512, 00:15:33.345 "num_blocks": 65536, 00:15:33.345 "uuid": "7b6476c4-aa68-46c1-bffe-0821eb43d345", 00:15:33.345 "assigned_rate_limits": { 00:15:33.345 "rw_ios_per_sec": 0, 00:15:33.345 "rw_mbytes_per_sec": 0, 00:15:33.345 "r_mbytes_per_sec": 0, 00:15:33.345 "w_mbytes_per_sec": 0 00:15:33.345 }, 00:15:33.345 "claimed": true, 00:15:33.345 "claim_type": "exclusive_write", 00:15:33.345 "zoned": false, 00:15:33.345 "supported_io_types": { 00:15:33.345 "read": true, 00:15:33.345 "write": true, 00:15:33.345 "unmap": true, 00:15:33.345 "flush": true, 00:15:33.345 "reset": true, 00:15:33.345 "nvme_admin": false, 00:15:33.345 "nvme_io": false, 00:15:33.345 "nvme_io_md": false, 00:15:33.345 "write_zeroes": true, 00:15:33.345 "zcopy": true, 00:15:33.345 "get_zone_info": false, 00:15:33.345 "zone_management": false, 00:15:33.345 "zone_append": false, 00:15:33.345 "compare": false, 00:15:33.345 "compare_and_write": false, 00:15:33.345 "abort": true, 00:15:33.345 "seek_hole": false, 00:15:33.345 "seek_data": false, 00:15:33.345 "copy": true, 00:15:33.345 "nvme_iov_md": false 00:15:33.345 }, 00:15:33.345 "memory_domains": [ 00:15:33.345 { 00:15:33.345 "dma_device_id": "system", 00:15:33.345 "dma_device_type": 1 00:15:33.345 }, 00:15:33.345 { 00:15:33.345 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.345 "dma_device_type": 2 00:15:33.345 } 00:15:33.345 ], 00:15:33.345 
"driver_specific": {} 00:15:33.345 }' 00:15:33.345 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:33.345 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:33.345 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:33.345 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:33.345 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:33.345 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:33.345 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.604 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.604 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:33.604 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.604 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.604 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:33.604 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:33.604 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:33.604 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:33.863 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:33.863 "name": "BaseBdev2", 00:15:33.863 "aliases": [ 00:15:33.863 "60b43d09-21e8-4c79-af62-8743a57241ba" 00:15:33.863 ], 00:15:33.863 "product_name": 
"Malloc disk", 00:15:33.863 "block_size": 512, 00:15:33.863 "num_blocks": 65536, 00:15:33.863 "uuid": "60b43d09-21e8-4c79-af62-8743a57241ba", 00:15:33.864 "assigned_rate_limits": { 00:15:33.864 "rw_ios_per_sec": 0, 00:15:33.864 "rw_mbytes_per_sec": 0, 00:15:33.864 "r_mbytes_per_sec": 0, 00:15:33.864 "w_mbytes_per_sec": 0 00:15:33.864 }, 00:15:33.864 "claimed": true, 00:15:33.864 "claim_type": "exclusive_write", 00:15:33.864 "zoned": false, 00:15:33.864 "supported_io_types": { 00:15:33.864 "read": true, 00:15:33.864 "write": true, 00:15:33.864 "unmap": true, 00:15:33.864 "flush": true, 00:15:33.864 "reset": true, 00:15:33.864 "nvme_admin": false, 00:15:33.864 "nvme_io": false, 00:15:33.864 "nvme_io_md": false, 00:15:33.864 "write_zeroes": true, 00:15:33.864 "zcopy": true, 00:15:33.864 "get_zone_info": false, 00:15:33.864 "zone_management": false, 00:15:33.864 "zone_append": false, 00:15:33.864 "compare": false, 00:15:33.864 "compare_and_write": false, 00:15:33.864 "abort": true, 00:15:33.864 "seek_hole": false, 00:15:33.864 "seek_data": false, 00:15:33.864 "copy": true, 00:15:33.864 "nvme_iov_md": false 00:15:33.864 }, 00:15:33.864 "memory_domains": [ 00:15:33.864 { 00:15:33.864 "dma_device_id": "system", 00:15:33.864 "dma_device_type": 1 00:15:33.864 }, 00:15:33.864 { 00:15:33.864 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.864 "dma_device_type": 2 00:15:33.864 } 00:15:33.864 ], 00:15:33.864 "driver_specific": {} 00:15:33.864 }' 00:15:33.864 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:33.864 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:33.864 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:33.864 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.122 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.122 
00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:34.122 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:34.122 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:34.122 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:34.122 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:34.122 00:10:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:34.122 00:10:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:34.122 00:10:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:34.122 00:10:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:34.122 00:10:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:34.381 00:10:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:34.381 "name": "BaseBdev3", 00:15:34.381 "aliases": [ 00:15:34.381 "0bad9651-f36c-4099-874f-3be2968dc3a2" 00:15:34.381 ], 00:15:34.381 "product_name": "Malloc disk", 00:15:34.381 "block_size": 512, 00:15:34.381 "num_blocks": 65536, 00:15:34.381 "uuid": "0bad9651-f36c-4099-874f-3be2968dc3a2", 00:15:34.381 "assigned_rate_limits": { 00:15:34.381 "rw_ios_per_sec": 0, 00:15:34.381 "rw_mbytes_per_sec": 0, 00:15:34.381 "r_mbytes_per_sec": 0, 00:15:34.381 "w_mbytes_per_sec": 0 00:15:34.381 }, 00:15:34.381 "claimed": true, 00:15:34.381 "claim_type": "exclusive_write", 00:15:34.381 "zoned": false, 00:15:34.381 "supported_io_types": { 00:15:34.381 "read": true, 00:15:34.381 "write": true, 00:15:34.381 "unmap": true, 
00:15:34.381 "flush": true, 00:15:34.381 "reset": true, 00:15:34.381 "nvme_admin": false, 00:15:34.381 "nvme_io": false, 00:15:34.381 "nvme_io_md": false, 00:15:34.381 "write_zeroes": true, 00:15:34.381 "zcopy": true, 00:15:34.381 "get_zone_info": false, 00:15:34.381 "zone_management": false, 00:15:34.381 "zone_append": false, 00:15:34.381 "compare": false, 00:15:34.381 "compare_and_write": false, 00:15:34.381 "abort": true, 00:15:34.381 "seek_hole": false, 00:15:34.381 "seek_data": false, 00:15:34.381 "copy": true, 00:15:34.381 "nvme_iov_md": false 00:15:34.381 }, 00:15:34.381 "memory_domains": [ 00:15:34.381 { 00:15:34.381 "dma_device_id": "system", 00:15:34.381 "dma_device_type": 1 00:15:34.381 }, 00:15:34.381 { 00:15:34.381 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:34.381 "dma_device_type": 2 00:15:34.381 } 00:15:34.381 ], 00:15:34.381 "driver_specific": {} 00:15:34.381 }' 00:15:34.381 00:10:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:34.381 00:10:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:34.640 00:10:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:34.640 00:10:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.640 00:10:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.640 00:10:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:34.640 00:10:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:34.640 00:10:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:34.640 00:10:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:34.640 00:10:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:34.899 00:10:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:34.899 00:10:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:34.899 00:10:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:34.899 [2024-07-16 00:10:21.784473] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:34.899 [2024-07-16 00:10:21.784499] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:34.899 [2024-07-16 00:10:21.784547] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:34.899 [2024-07-16 00:10:21.784594] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:34.899 [2024-07-16 00:10:21.784605] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e08f50 name Existed_Raid, state offline 00:15:34.899 00:10:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 3526728 00:15:34.899 00:10:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3526728 ']' 00:15:34.899 00:10:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 3526728 00:15:34.899 00:10:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:15:34.899 00:10:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:34.899 00:10:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3526728 00:15:35.158 00:10:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:35.158 00:10:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:15:35.158 00:10:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3526728' 00:15:35.158 killing process with pid 3526728 00:15:35.158 00:10:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 3526728 00:15:35.158 [2024-07-16 00:10:21.858987] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:35.158 00:10:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 3526728 00:15:35.158 [2024-07-16 00:10:21.885546] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:35.158 00:10:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:15:35.158 00:15:35.158 real 0m29.861s 00:15:35.158 user 0m54.849s 00:15:35.158 sys 0m5.304s 00:15:35.158 00:10:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:35.158 00:10:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:35.158 ************************************ 00:15:35.158 END TEST raid_state_function_test_sb 00:15:35.158 ************************************ 00:15:35.417 00:10:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:35.417 00:10:22 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:15:35.417 00:10:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:15:35.417 00:10:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:35.417 00:10:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:35.417 ************************************ 00:15:35.417 START TEST raid_superblock_test 00:15:35.417 ************************************ 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # 
local raid_level=concat 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=3531188 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 3531188 /var/tmp/spdk-raid.sock 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 3531188 ']' 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:35.417 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:35.417 00:10:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:35.417 [2024-07-16 00:10:22.255615] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:15:35.417 [2024-07-16 00:10:22.255687] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3531188 ] 00:15:35.676 [2024-07-16 00:10:22.388077] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:35.676 [2024-07-16 00:10:22.493165] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:35.676 [2024-07-16 00:10:22.556909] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:35.676 [2024-07-16 00:10:22.556957] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:36.244 00:10:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:36.244 00:10:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:15:36.244 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:15:36.244 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:36.244 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:15:36.244 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:15:36.244 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:36.244 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:36.244 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:36.244 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:36.244 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:15:36.502 malloc1 00:15:36.502 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:36.761 [2024-07-16 00:10:23.458780] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:36.761 [2024-07-16 00:10:23.458827] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:36.761 [2024-07-16 00:10:23.458846] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xab0570 00:15:36.761 [2024-07-16 00:10:23.458860] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:36.761 [2024-07-16 00:10:23.460494] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:36.761 [2024-07-16 00:10:23.460526] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:36.761 pt1 00:15:36.761 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:36.761 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:36.761 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:15:36.761 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:15:36.761 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:36.761 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:36.761 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:36.761 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:36.761 00:10:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:36.761 malloc2 00:15:36.761 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:37.024 [2024-07-16 00:10:23.808525] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:37.024 [2024-07-16 00:10:23.808574] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:37.024 [2024-07-16 00:10:23.808591] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xab1970 00:15:37.024 [2024-07-16 00:10:23.808603] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:37.024 [2024-07-16 00:10:23.810114] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:37.024 [2024-07-16 00:10:23.810144] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:37.024 pt2 00:15:37.024 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:37.024 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:37.024 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:15:37.024 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:15:37.024 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:15:37.024 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:37.024 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:37.024 00:10:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:37.024 00:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:15:37.284 malloc3 00:15:37.284 00:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:37.284 [2024-07-16 00:10:24.154053] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:37.284 [2024-07-16 00:10:24.154102] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:37.284 [2024-07-16 00:10:24.154119] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc48340 00:15:37.284 [2024-07-16 00:10:24.154132] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:37.284 [2024-07-16 00:10:24.155616] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:37.284 [2024-07-16 00:10:24.155645] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:37.284 pt3 00:15:37.284 00:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:37.284 00:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:37.284 00:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:15:37.543 [2024-07-16 00:10:24.394712] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:37.543 [2024-07-16 00:10:24.395941] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:15:37.543 [2024-07-16 00:10:24.395997] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:37.543 [2024-07-16 00:10:24.396149] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xaa8ea0 00:15:37.543 [2024-07-16 00:10:24.396160] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:37.543 [2024-07-16 00:10:24.396361] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xab0240 00:15:37.543 [2024-07-16 00:10:24.396503] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xaa8ea0 00:15:37.543 [2024-07-16 00:10:24.396514] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xaa8ea0 00:15:37.543 [2024-07-16 00:10:24.396610] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:37.543 00:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:37.543 00:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:37.543 00:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:37.543 00:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:37.543 00:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:37.543 00:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:37.543 00:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:37.543 00:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:37.543 00:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:37.543 00:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:37.543 00:10:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.543 00:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:37.803 00:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:37.803 "name": "raid_bdev1", 00:15:37.803 "uuid": "40bb11c6-9f84-444a-a0f3-6d6f20a31c1a", 00:15:37.803 "strip_size_kb": 64, 00:15:37.803 "state": "online", 00:15:37.803 "raid_level": "concat", 00:15:37.803 "superblock": true, 00:15:37.803 "num_base_bdevs": 3, 00:15:37.803 "num_base_bdevs_discovered": 3, 00:15:37.803 "num_base_bdevs_operational": 3, 00:15:37.803 "base_bdevs_list": [ 00:15:37.803 { 00:15:37.803 "name": "pt1", 00:15:37.803 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:37.803 "is_configured": true, 00:15:37.803 "data_offset": 2048, 00:15:37.803 "data_size": 63488 00:15:37.803 }, 00:15:37.803 { 00:15:37.803 "name": "pt2", 00:15:37.803 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:37.803 "is_configured": true, 00:15:37.803 "data_offset": 2048, 00:15:37.803 "data_size": 63488 00:15:37.803 }, 00:15:37.803 { 00:15:37.803 "name": "pt3", 00:15:37.803 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:37.803 "is_configured": true, 00:15:37.803 "data_offset": 2048, 00:15:37.803 "data_size": 63488 00:15:37.803 } 00:15:37.803 ] 00:15:37.803 }' 00:15:37.803 00:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:37.803 00:10:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:38.372 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:15:38.372 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:38.372 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:15:38.372 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:38.372 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:38.372 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:38.372 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:38.372 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:38.631 [2024-07-16 00:10:25.349494] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:38.631 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:38.631 "name": "raid_bdev1", 00:15:38.631 "aliases": [ 00:15:38.631 "40bb11c6-9f84-444a-a0f3-6d6f20a31c1a" 00:15:38.631 ], 00:15:38.631 "product_name": "Raid Volume", 00:15:38.631 "block_size": 512, 00:15:38.631 "num_blocks": 190464, 00:15:38.631 "uuid": "40bb11c6-9f84-444a-a0f3-6d6f20a31c1a", 00:15:38.631 "assigned_rate_limits": { 00:15:38.631 "rw_ios_per_sec": 0, 00:15:38.631 "rw_mbytes_per_sec": 0, 00:15:38.631 "r_mbytes_per_sec": 0, 00:15:38.631 "w_mbytes_per_sec": 0 00:15:38.631 }, 00:15:38.631 "claimed": false, 00:15:38.631 "zoned": false, 00:15:38.631 "supported_io_types": { 00:15:38.631 "read": true, 00:15:38.631 "write": true, 00:15:38.631 "unmap": true, 00:15:38.631 "flush": true, 00:15:38.631 "reset": true, 00:15:38.631 "nvme_admin": false, 00:15:38.631 "nvme_io": false, 00:15:38.631 "nvme_io_md": false, 00:15:38.631 "write_zeroes": true, 00:15:38.631 "zcopy": false, 00:15:38.631 "get_zone_info": false, 00:15:38.631 "zone_management": false, 00:15:38.631 "zone_append": false, 00:15:38.631 "compare": false, 00:15:38.631 "compare_and_write": false, 00:15:38.631 "abort": false, 00:15:38.631 "seek_hole": false, 00:15:38.631 
"seek_data": false, 00:15:38.631 "copy": false, 00:15:38.631 "nvme_iov_md": false 00:15:38.631 }, 00:15:38.631 "memory_domains": [ 00:15:38.631 { 00:15:38.631 "dma_device_id": "system", 00:15:38.631 "dma_device_type": 1 00:15:38.631 }, 00:15:38.631 { 00:15:38.631 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.631 "dma_device_type": 2 00:15:38.631 }, 00:15:38.632 { 00:15:38.632 "dma_device_id": "system", 00:15:38.632 "dma_device_type": 1 00:15:38.632 }, 00:15:38.632 { 00:15:38.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.632 "dma_device_type": 2 00:15:38.632 }, 00:15:38.632 { 00:15:38.632 "dma_device_id": "system", 00:15:38.632 "dma_device_type": 1 00:15:38.632 }, 00:15:38.632 { 00:15:38.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.632 "dma_device_type": 2 00:15:38.632 } 00:15:38.632 ], 00:15:38.632 "driver_specific": { 00:15:38.632 "raid": { 00:15:38.632 "uuid": "40bb11c6-9f84-444a-a0f3-6d6f20a31c1a", 00:15:38.632 "strip_size_kb": 64, 00:15:38.632 "state": "online", 00:15:38.632 "raid_level": "concat", 00:15:38.632 "superblock": true, 00:15:38.632 "num_base_bdevs": 3, 00:15:38.632 "num_base_bdevs_discovered": 3, 00:15:38.632 "num_base_bdevs_operational": 3, 00:15:38.632 "base_bdevs_list": [ 00:15:38.632 { 00:15:38.632 "name": "pt1", 00:15:38.632 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:38.632 "is_configured": true, 00:15:38.632 "data_offset": 2048, 00:15:38.632 "data_size": 63488 00:15:38.632 }, 00:15:38.632 { 00:15:38.632 "name": "pt2", 00:15:38.632 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:38.632 "is_configured": true, 00:15:38.632 "data_offset": 2048, 00:15:38.632 "data_size": 63488 00:15:38.632 }, 00:15:38.632 { 00:15:38.632 "name": "pt3", 00:15:38.632 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:38.632 "is_configured": true, 00:15:38.632 "data_offset": 2048, 00:15:38.632 "data_size": 63488 00:15:38.632 } 00:15:38.632 ] 00:15:38.632 } 00:15:38.632 } 00:15:38.632 }' 00:15:38.632 00:10:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:38.632 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:38.632 pt2 00:15:38.632 pt3' 00:15:38.632 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:38.632 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:38.632 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:38.891 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:38.891 "name": "pt1", 00:15:38.891 "aliases": [ 00:15:38.891 "00000000-0000-0000-0000-000000000001" 00:15:38.891 ], 00:15:38.891 "product_name": "passthru", 00:15:38.891 "block_size": 512, 00:15:38.891 "num_blocks": 65536, 00:15:38.891 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:38.891 "assigned_rate_limits": { 00:15:38.891 "rw_ios_per_sec": 0, 00:15:38.891 "rw_mbytes_per_sec": 0, 00:15:38.891 "r_mbytes_per_sec": 0, 00:15:38.891 "w_mbytes_per_sec": 0 00:15:38.891 }, 00:15:38.891 "claimed": true, 00:15:38.891 "claim_type": "exclusive_write", 00:15:38.891 "zoned": false, 00:15:38.891 "supported_io_types": { 00:15:38.891 "read": true, 00:15:38.891 "write": true, 00:15:38.891 "unmap": true, 00:15:38.891 "flush": true, 00:15:38.891 "reset": true, 00:15:38.891 "nvme_admin": false, 00:15:38.891 "nvme_io": false, 00:15:38.891 "nvme_io_md": false, 00:15:38.891 "write_zeroes": true, 00:15:38.891 "zcopy": true, 00:15:38.891 "get_zone_info": false, 00:15:38.891 "zone_management": false, 00:15:38.891 "zone_append": false, 00:15:38.891 "compare": false, 00:15:38.891 "compare_and_write": false, 00:15:38.891 "abort": true, 00:15:38.891 "seek_hole": false, 00:15:38.891 "seek_data": false, 
00:15:38.891 "copy": true, 00:15:38.891 "nvme_iov_md": false 00:15:38.891 }, 00:15:38.891 "memory_domains": [ 00:15:38.891 { 00:15:38.891 "dma_device_id": "system", 00:15:38.891 "dma_device_type": 1 00:15:38.891 }, 00:15:38.891 { 00:15:38.891 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.891 "dma_device_type": 2 00:15:38.891 } 00:15:38.891 ], 00:15:38.891 "driver_specific": { 00:15:38.891 "passthru": { 00:15:38.891 "name": "pt1", 00:15:38.891 "base_bdev_name": "malloc1" 00:15:38.891 } 00:15:38.891 } 00:15:38.891 }' 00:15:38.891 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.891 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.891 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:38.891 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:38.891 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:38.891 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:38.891 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.150 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.150 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.150 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.150 00:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.150 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.150 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:39.150 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt2 00:15:39.150 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:39.409 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:39.409 "name": "pt2", 00:15:39.409 "aliases": [ 00:15:39.409 "00000000-0000-0000-0000-000000000002" 00:15:39.409 ], 00:15:39.409 "product_name": "passthru", 00:15:39.409 "block_size": 512, 00:15:39.409 "num_blocks": 65536, 00:15:39.409 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:39.409 "assigned_rate_limits": { 00:15:39.409 "rw_ios_per_sec": 0, 00:15:39.409 "rw_mbytes_per_sec": 0, 00:15:39.409 "r_mbytes_per_sec": 0, 00:15:39.409 "w_mbytes_per_sec": 0 00:15:39.409 }, 00:15:39.409 "claimed": true, 00:15:39.409 "claim_type": "exclusive_write", 00:15:39.409 "zoned": false, 00:15:39.409 "supported_io_types": { 00:15:39.409 "read": true, 00:15:39.409 "write": true, 00:15:39.409 "unmap": true, 00:15:39.409 "flush": true, 00:15:39.409 "reset": true, 00:15:39.409 "nvme_admin": false, 00:15:39.409 "nvme_io": false, 00:15:39.409 "nvme_io_md": false, 00:15:39.409 "write_zeroes": true, 00:15:39.409 "zcopy": true, 00:15:39.409 "get_zone_info": false, 00:15:39.409 "zone_management": false, 00:15:39.409 "zone_append": false, 00:15:39.409 "compare": false, 00:15:39.409 "compare_and_write": false, 00:15:39.409 "abort": true, 00:15:39.409 "seek_hole": false, 00:15:39.409 "seek_data": false, 00:15:39.409 "copy": true, 00:15:39.409 "nvme_iov_md": false 00:15:39.409 }, 00:15:39.409 "memory_domains": [ 00:15:39.409 { 00:15:39.409 "dma_device_id": "system", 00:15:39.409 "dma_device_type": 1 00:15:39.409 }, 00:15:39.409 { 00:15:39.409 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.409 "dma_device_type": 2 00:15:39.409 } 00:15:39.409 ], 00:15:39.409 "driver_specific": { 00:15:39.409 "passthru": { 00:15:39.409 "name": "pt2", 00:15:39.409 "base_bdev_name": "malloc2" 00:15:39.410 } 00:15:39.410 } 00:15:39.410 }' 00:15:39.410 00:10:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.410 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.410 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:39.410 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.668 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.668 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:39.668 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.668 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.668 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.668 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.668 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.927 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.927 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:39.927 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:39.927 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:40.185 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:40.185 "name": "pt3", 00:15:40.185 "aliases": [ 00:15:40.185 "00000000-0000-0000-0000-000000000003" 00:15:40.185 ], 00:15:40.185 "product_name": "passthru", 00:15:40.185 "block_size": 512, 00:15:40.185 "num_blocks": 65536, 00:15:40.185 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:40.185 "assigned_rate_limits": { 
00:15:40.185 "rw_ios_per_sec": 0, 00:15:40.185 "rw_mbytes_per_sec": 0, 00:15:40.185 "r_mbytes_per_sec": 0, 00:15:40.185 "w_mbytes_per_sec": 0 00:15:40.185 }, 00:15:40.185 "claimed": true, 00:15:40.185 "claim_type": "exclusive_write", 00:15:40.185 "zoned": false, 00:15:40.185 "supported_io_types": { 00:15:40.185 "read": true, 00:15:40.185 "write": true, 00:15:40.185 "unmap": true, 00:15:40.185 "flush": true, 00:15:40.185 "reset": true, 00:15:40.185 "nvme_admin": false, 00:15:40.185 "nvme_io": false, 00:15:40.185 "nvme_io_md": false, 00:15:40.185 "write_zeroes": true, 00:15:40.185 "zcopy": true, 00:15:40.185 "get_zone_info": false, 00:15:40.185 "zone_management": false, 00:15:40.185 "zone_append": false, 00:15:40.185 "compare": false, 00:15:40.185 "compare_and_write": false, 00:15:40.185 "abort": true, 00:15:40.185 "seek_hole": false, 00:15:40.185 "seek_data": false, 00:15:40.185 "copy": true, 00:15:40.185 "nvme_iov_md": false 00:15:40.185 }, 00:15:40.185 "memory_domains": [ 00:15:40.185 { 00:15:40.185 "dma_device_id": "system", 00:15:40.185 "dma_device_type": 1 00:15:40.185 }, 00:15:40.185 { 00:15:40.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:40.185 "dma_device_type": 2 00:15:40.185 } 00:15:40.185 ], 00:15:40.185 "driver_specific": { 00:15:40.186 "passthru": { 00:15:40.186 "name": "pt3", 00:15:40.186 "base_bdev_name": "malloc3" 00:15:40.186 } 00:15:40.186 } 00:15:40.186 }' 00:15:40.186 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.186 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.186 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:40.186 00:10:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.186 00:10:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.186 00:10:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:15:40.186 00:10:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.186 00:10:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.444 00:10:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:40.444 00:10:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.444 00:10:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.444 00:10:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:40.444 00:10:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:40.444 00:10:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:15:40.703 [2024-07-16 00:10:27.467131] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:40.703 00:10:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=40bb11c6-9f84-444a-a0f3-6d6f20a31c1a 00:15:40.703 00:10:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 40bb11c6-9f84-444a-a0f3-6d6f20a31c1a ']' 00:15:40.703 00:10:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:40.962 [2024-07-16 00:10:27.715500] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:40.962 [2024-07-16 00:10:27.715519] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:40.962 [2024-07-16 00:10:27.715569] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:40.962 [2024-07-16 00:10:27.715621] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:15:40.962 [2024-07-16 00:10:27.715632] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaa8ea0 name raid_bdev1, state offline 00:15:40.962 00:10:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:40.962 00:10:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:15:41.221 00:10:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:15:41.221 00:10:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:15:41.221 00:10:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:41.221 00:10:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:41.480 00:10:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:41.480 00:10:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:41.739 00:10:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:41.740 00:10:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:41.998 00:10:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:41.998 00:10:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:41.999 00:10:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:15:41.999 00:10:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:41.999 00:10:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:15:41.999 00:10:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:41.999 00:10:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:42.258 00:10:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:42.258 00:10:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:42.258 00:10:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:42.258 00:10:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:42.258 00:10:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:42.258 00:10:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:42.258 00:10:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:42.258 00:10:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:42.258 [2024-07-16 00:10:29.179318] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:42.258 [2024-07-16 00:10:29.180679] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:42.258 [2024-07-16 00:10:29.180724] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:15:42.258 [2024-07-16 00:10:29.180771] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:42.258 [2024-07-16 00:10:29.180810] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:42.258 [2024-07-16 00:10:29.180833] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:42.258 [2024-07-16 00:10:29.180851] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:42.258 [2024-07-16 00:10:29.180861] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc53ff0 name raid_bdev1, state configuring 00:15:42.258 request: 00:15:42.258 { 00:15:42.258 "name": "raid_bdev1", 00:15:42.258 "raid_level": "concat", 00:15:42.258 "base_bdevs": [ 00:15:42.258 "malloc1", 00:15:42.258 "malloc2", 00:15:42.258 "malloc3" 00:15:42.258 ], 00:15:42.258 "strip_size_kb": 64, 00:15:42.258 "superblock": false, 00:15:42.258 "method": "bdev_raid_create", 00:15:42.258 "req_id": 1 00:15:42.258 } 00:15:42.258 Got JSON-RPC error response 00:15:42.258 response: 00:15:42.258 { 00:15:42.258 "code": -17, 00:15:42.258 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:42.258 } 00:15:42.258 00:10:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:15:42.258 00:10:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:15:42.258 00:10:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:15:42.258 00:10:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:15:42.258 00:10:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.258 00:10:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:15:42.517 00:10:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:15:42.517 00:10:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:15:42.517 00:10:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:42.775 [2024-07-16 00:10:29.668550] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:42.775 [2024-07-16 00:10:29.668610] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:42.775 [2024-07-16 00:10:29.668632] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xab07a0 00:15:42.775 [2024-07-16 00:10:29.668645] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:42.775 [2024-07-16 00:10:29.670376] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:42.775 [2024-07-16 00:10:29.670409] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:42.775 [2024-07-16 00:10:29.670487] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:42.775 [2024-07-16 00:10:29.670514] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:42.775 pt1 00:15:42.775 00:10:29 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:15:42.775 00:10:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:42.776 00:10:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:42.776 00:10:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:42.776 00:10:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:42.776 00:10:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:42.776 00:10:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:42.776 00:10:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:42.776 00:10:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:42.776 00:10:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:42.776 00:10:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.776 00:10:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:43.034 00:10:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:43.034 "name": "raid_bdev1", 00:15:43.034 "uuid": "40bb11c6-9f84-444a-a0f3-6d6f20a31c1a", 00:15:43.034 "strip_size_kb": 64, 00:15:43.034 "state": "configuring", 00:15:43.034 "raid_level": "concat", 00:15:43.034 "superblock": true, 00:15:43.034 "num_base_bdevs": 3, 00:15:43.034 "num_base_bdevs_discovered": 1, 00:15:43.034 "num_base_bdevs_operational": 3, 00:15:43.034 "base_bdevs_list": [ 00:15:43.034 { 00:15:43.034 "name": "pt1", 00:15:43.034 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:43.034 
"is_configured": true, 00:15:43.034 "data_offset": 2048, 00:15:43.034 "data_size": 63488 00:15:43.034 }, 00:15:43.034 { 00:15:43.034 "name": null, 00:15:43.034 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:43.034 "is_configured": false, 00:15:43.034 "data_offset": 2048, 00:15:43.034 "data_size": 63488 00:15:43.034 }, 00:15:43.034 { 00:15:43.035 "name": null, 00:15:43.035 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:43.035 "is_configured": false, 00:15:43.035 "data_offset": 2048, 00:15:43.035 "data_size": 63488 00:15:43.035 } 00:15:43.035 ] 00:15:43.035 }' 00:15:43.035 00:10:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:43.035 00:10:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:43.971 00:10:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:15:43.971 00:10:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:43.971 [2024-07-16 00:10:30.715334] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:43.971 [2024-07-16 00:10:30.715387] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:43.971 [2024-07-16 00:10:30.715406] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xaa7c70 00:15:43.971 [2024-07-16 00:10:30.715419] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:43.971 [2024-07-16 00:10:30.715786] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:43.971 [2024-07-16 00:10:30.715806] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:43.971 [2024-07-16 00:10:30.715872] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:43.971 [2024-07-16 
00:10:30.715892] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:43.971 pt2 00:15:43.971 00:10:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:44.229 [2024-07-16 00:10:30.968014] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:44.229 00:10:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:15:44.229 00:10:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:44.229 00:10:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:44.229 00:10:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:44.229 00:10:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:44.229 00:10:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:44.229 00:10:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:44.229 00:10:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:44.229 00:10:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:44.229 00:10:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:44.229 00:10:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:44.229 00:10:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:44.487 00:10:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:44.487 "name": "raid_bdev1", 00:15:44.487 
"uuid": "40bb11c6-9f84-444a-a0f3-6d6f20a31c1a", 00:15:44.487 "strip_size_kb": 64, 00:15:44.487 "state": "configuring", 00:15:44.487 "raid_level": "concat", 00:15:44.487 "superblock": true, 00:15:44.487 "num_base_bdevs": 3, 00:15:44.487 "num_base_bdevs_discovered": 1, 00:15:44.487 "num_base_bdevs_operational": 3, 00:15:44.487 "base_bdevs_list": [ 00:15:44.487 { 00:15:44.487 "name": "pt1", 00:15:44.487 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:44.487 "is_configured": true, 00:15:44.487 "data_offset": 2048, 00:15:44.487 "data_size": 63488 00:15:44.487 }, 00:15:44.487 { 00:15:44.487 "name": null, 00:15:44.487 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:44.487 "is_configured": false, 00:15:44.487 "data_offset": 2048, 00:15:44.487 "data_size": 63488 00:15:44.487 }, 00:15:44.487 { 00:15:44.487 "name": null, 00:15:44.487 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:44.487 "is_configured": false, 00:15:44.487 "data_offset": 2048, 00:15:44.487 "data_size": 63488 00:15:44.487 } 00:15:44.487 ] 00:15:44.487 }' 00:15:44.487 00:10:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:44.487 00:10:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.053 00:10:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:15:45.053 00:10:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:45.053 00:10:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:45.311 [2024-07-16 00:10:32.066942] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:45.311 [2024-07-16 00:10:32.066992] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:45.311 [2024-07-16 00:10:32.067013] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xab0a10 00:15:45.311 [2024-07-16 00:10:32.067026] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:45.311 [2024-07-16 00:10:32.067375] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:45.311 [2024-07-16 00:10:32.067394] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:45.311 [2024-07-16 00:10:32.067458] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:45.311 [2024-07-16 00:10:32.067477] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:45.311 pt2 00:15:45.311 00:10:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:45.311 00:10:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:45.311 00:10:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:45.569 [2024-07-16 00:10:32.315591] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:45.569 [2024-07-16 00:10:32.315628] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:45.569 [2024-07-16 00:10:32.315643] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc4a740 00:15:45.569 [2024-07-16 00:10:32.315655] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:45.569 [2024-07-16 00:10:32.315951] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:45.569 [2024-07-16 00:10:32.315969] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:45.569 [2024-07-16 00:10:32.316022] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:45.569 
[2024-07-16 00:10:32.316039] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:45.569 [2024-07-16 00:10:32.316142] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc4ac00 00:15:45.569 [2024-07-16 00:10:32.316153] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:45.569 [2024-07-16 00:10:32.316316] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xaafa40 00:15:45.569 [2024-07-16 00:10:32.316436] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc4ac00 00:15:45.569 [2024-07-16 00:10:32.316445] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc4ac00 00:15:45.569 [2024-07-16 00:10:32.316538] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:45.569 pt3 00:15:45.569 00:10:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:45.569 00:10:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:45.569 00:10:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:45.569 00:10:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:45.569 00:10:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:45.569 00:10:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:45.569 00:10:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:45.569 00:10:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:45.569 00:10:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:45.569 00:10:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:45.569 00:10:32 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:45.569 00:10:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:45.569 00:10:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.569 00:10:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:45.888 00:10:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:45.888 "name": "raid_bdev1", 00:15:45.888 "uuid": "40bb11c6-9f84-444a-a0f3-6d6f20a31c1a", 00:15:45.888 "strip_size_kb": 64, 00:15:45.888 "state": "online", 00:15:45.888 "raid_level": "concat", 00:15:45.888 "superblock": true, 00:15:45.888 "num_base_bdevs": 3, 00:15:45.888 "num_base_bdevs_discovered": 3, 00:15:45.888 "num_base_bdevs_operational": 3, 00:15:45.888 "base_bdevs_list": [ 00:15:45.888 { 00:15:45.888 "name": "pt1", 00:15:45.888 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:45.888 "is_configured": true, 00:15:45.888 "data_offset": 2048, 00:15:45.888 "data_size": 63488 00:15:45.888 }, 00:15:45.888 { 00:15:45.888 "name": "pt2", 00:15:45.888 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:45.888 "is_configured": true, 00:15:45.888 "data_offset": 2048, 00:15:45.888 "data_size": 63488 00:15:45.888 }, 00:15:45.888 { 00:15:45.888 "name": "pt3", 00:15:45.888 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:45.888 "is_configured": true, 00:15:45.888 "data_offset": 2048, 00:15:45.888 "data_size": 63488 00:15:45.888 } 00:15:45.888 ] 00:15:45.888 }' 00:15:45.888 00:10:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:45.888 00:10:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:46.463 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:15:46.464 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:46.464 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:46.464 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:46.464 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:46.464 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:46.464 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:46.464 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:46.464 [2024-07-16 00:10:33.402884] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:46.721 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:46.721 "name": "raid_bdev1", 00:15:46.721 "aliases": [ 00:15:46.721 "40bb11c6-9f84-444a-a0f3-6d6f20a31c1a" 00:15:46.721 ], 00:15:46.721 "product_name": "Raid Volume", 00:15:46.721 "block_size": 512, 00:15:46.721 "num_blocks": 190464, 00:15:46.721 "uuid": "40bb11c6-9f84-444a-a0f3-6d6f20a31c1a", 00:15:46.721 "assigned_rate_limits": { 00:15:46.721 "rw_ios_per_sec": 0, 00:15:46.721 "rw_mbytes_per_sec": 0, 00:15:46.721 "r_mbytes_per_sec": 0, 00:15:46.721 "w_mbytes_per_sec": 0 00:15:46.721 }, 00:15:46.721 "claimed": false, 00:15:46.721 "zoned": false, 00:15:46.721 "supported_io_types": { 00:15:46.721 "read": true, 00:15:46.721 "write": true, 00:15:46.721 "unmap": true, 00:15:46.721 "flush": true, 00:15:46.721 "reset": true, 00:15:46.721 "nvme_admin": false, 00:15:46.721 "nvme_io": false, 00:15:46.721 "nvme_io_md": false, 00:15:46.721 "write_zeroes": true, 00:15:46.721 "zcopy": false, 00:15:46.721 
"get_zone_info": false, 00:15:46.721 "zone_management": false, 00:15:46.721 "zone_append": false, 00:15:46.721 "compare": false, 00:15:46.721 "compare_and_write": false, 00:15:46.721 "abort": false, 00:15:46.721 "seek_hole": false, 00:15:46.721 "seek_data": false, 00:15:46.721 "copy": false, 00:15:46.721 "nvme_iov_md": false 00:15:46.721 }, 00:15:46.721 "memory_domains": [ 00:15:46.721 { 00:15:46.721 "dma_device_id": "system", 00:15:46.721 "dma_device_type": 1 00:15:46.721 }, 00:15:46.721 { 00:15:46.722 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.722 "dma_device_type": 2 00:15:46.722 }, 00:15:46.722 { 00:15:46.722 "dma_device_id": "system", 00:15:46.722 "dma_device_type": 1 00:15:46.722 }, 00:15:46.722 { 00:15:46.722 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.722 "dma_device_type": 2 00:15:46.722 }, 00:15:46.722 { 00:15:46.722 "dma_device_id": "system", 00:15:46.722 "dma_device_type": 1 00:15:46.722 }, 00:15:46.722 { 00:15:46.722 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.722 "dma_device_type": 2 00:15:46.722 } 00:15:46.722 ], 00:15:46.722 "driver_specific": { 00:15:46.722 "raid": { 00:15:46.722 "uuid": "40bb11c6-9f84-444a-a0f3-6d6f20a31c1a", 00:15:46.722 "strip_size_kb": 64, 00:15:46.722 "state": "online", 00:15:46.722 "raid_level": "concat", 00:15:46.722 "superblock": true, 00:15:46.722 "num_base_bdevs": 3, 00:15:46.722 "num_base_bdevs_discovered": 3, 00:15:46.722 "num_base_bdevs_operational": 3, 00:15:46.722 "base_bdevs_list": [ 00:15:46.722 { 00:15:46.722 "name": "pt1", 00:15:46.722 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:46.722 "is_configured": true, 00:15:46.722 "data_offset": 2048, 00:15:46.722 "data_size": 63488 00:15:46.722 }, 00:15:46.722 { 00:15:46.722 "name": "pt2", 00:15:46.722 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:46.722 "is_configured": true, 00:15:46.722 "data_offset": 2048, 00:15:46.722 "data_size": 63488 00:15:46.722 }, 00:15:46.722 { 00:15:46.722 "name": "pt3", 00:15:46.722 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:15:46.722 "is_configured": true, 00:15:46.722 "data_offset": 2048, 00:15:46.722 "data_size": 63488 00:15:46.722 } 00:15:46.722 ] 00:15:46.722 } 00:15:46.722 } 00:15:46.722 }' 00:15:46.722 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:46.722 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:46.722 pt2 00:15:46.722 pt3' 00:15:46.722 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:46.722 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:46.722 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:46.979 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:46.979 "name": "pt1", 00:15:46.979 "aliases": [ 00:15:46.979 "00000000-0000-0000-0000-000000000001" 00:15:46.979 ], 00:15:46.979 "product_name": "passthru", 00:15:46.979 "block_size": 512, 00:15:46.979 "num_blocks": 65536, 00:15:46.979 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:46.979 "assigned_rate_limits": { 00:15:46.979 "rw_ios_per_sec": 0, 00:15:46.979 "rw_mbytes_per_sec": 0, 00:15:46.979 "r_mbytes_per_sec": 0, 00:15:46.979 "w_mbytes_per_sec": 0 00:15:46.979 }, 00:15:46.979 "claimed": true, 00:15:46.979 "claim_type": "exclusive_write", 00:15:46.979 "zoned": false, 00:15:46.979 "supported_io_types": { 00:15:46.979 "read": true, 00:15:46.979 "write": true, 00:15:46.979 "unmap": true, 00:15:46.979 "flush": true, 00:15:46.979 "reset": true, 00:15:46.979 "nvme_admin": false, 00:15:46.979 "nvme_io": false, 00:15:46.979 "nvme_io_md": false, 00:15:46.979 "write_zeroes": true, 00:15:46.979 "zcopy": true, 00:15:46.979 "get_zone_info": false, 
00:15:46.979 "zone_management": false, 00:15:46.979 "zone_append": false, 00:15:46.979 "compare": false, 00:15:46.979 "compare_and_write": false, 00:15:46.979 "abort": true, 00:15:46.979 "seek_hole": false, 00:15:46.979 "seek_data": false, 00:15:46.979 "copy": true, 00:15:46.979 "nvme_iov_md": false 00:15:46.979 }, 00:15:46.979 "memory_domains": [ 00:15:46.979 { 00:15:46.979 "dma_device_id": "system", 00:15:46.979 "dma_device_type": 1 00:15:46.979 }, 00:15:46.979 { 00:15:46.980 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.980 "dma_device_type": 2 00:15:46.980 } 00:15:46.980 ], 00:15:46.980 "driver_specific": { 00:15:46.980 "passthru": { 00:15:46.980 "name": "pt1", 00:15:46.980 "base_bdev_name": "malloc1" 00:15:46.980 } 00:15:46.980 } 00:15:46.980 }' 00:15:46.980 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:46.980 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:46.980 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:46.980 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:46.980 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:46.980 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:46.980 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:47.238 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:47.238 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:47.238 00:10:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:47.238 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:47.238 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:47.238 00:10:34 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:47.238 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:47.238 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:47.496 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:47.496 "name": "pt2", 00:15:47.496 "aliases": [ 00:15:47.496 "00000000-0000-0000-0000-000000000002" 00:15:47.496 ], 00:15:47.496 "product_name": "passthru", 00:15:47.496 "block_size": 512, 00:15:47.496 "num_blocks": 65536, 00:15:47.496 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:47.496 "assigned_rate_limits": { 00:15:47.496 "rw_ios_per_sec": 0, 00:15:47.496 "rw_mbytes_per_sec": 0, 00:15:47.496 "r_mbytes_per_sec": 0, 00:15:47.496 "w_mbytes_per_sec": 0 00:15:47.496 }, 00:15:47.496 "claimed": true, 00:15:47.496 "claim_type": "exclusive_write", 00:15:47.496 "zoned": false, 00:15:47.496 "supported_io_types": { 00:15:47.496 "read": true, 00:15:47.496 "write": true, 00:15:47.496 "unmap": true, 00:15:47.496 "flush": true, 00:15:47.496 "reset": true, 00:15:47.496 "nvme_admin": false, 00:15:47.496 "nvme_io": false, 00:15:47.496 "nvme_io_md": false, 00:15:47.496 "write_zeroes": true, 00:15:47.496 "zcopy": true, 00:15:47.496 "get_zone_info": false, 00:15:47.496 "zone_management": false, 00:15:47.496 "zone_append": false, 00:15:47.496 "compare": false, 00:15:47.496 "compare_and_write": false, 00:15:47.496 "abort": true, 00:15:47.496 "seek_hole": false, 00:15:47.496 "seek_data": false, 00:15:47.496 "copy": true, 00:15:47.496 "nvme_iov_md": false 00:15:47.496 }, 00:15:47.496 "memory_domains": [ 00:15:47.496 { 00:15:47.496 "dma_device_id": "system", 00:15:47.496 "dma_device_type": 1 00:15:47.496 }, 00:15:47.496 { 00:15:47.496 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:47.496 
"dma_device_type": 2 00:15:47.496 } 00:15:47.496 ], 00:15:47.496 "driver_specific": { 00:15:47.496 "passthru": { 00:15:47.496 "name": "pt2", 00:15:47.496 "base_bdev_name": "malloc2" 00:15:47.496 } 00:15:47.496 } 00:15:47.496 }' 00:15:47.496 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:47.496 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:47.496 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:47.496 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:47.754 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:47.754 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:47.754 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:47.754 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:47.754 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:47.754 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:47.754 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:47.754 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:47.754 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:47.754 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:47.754 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:48.012 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:48.012 "name": "pt3", 00:15:48.012 "aliases": [ 00:15:48.012 
"00000000-0000-0000-0000-000000000003" 00:15:48.012 ], 00:15:48.012 "product_name": "passthru", 00:15:48.012 "block_size": 512, 00:15:48.012 "num_blocks": 65536, 00:15:48.012 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:48.012 "assigned_rate_limits": { 00:15:48.012 "rw_ios_per_sec": 0, 00:15:48.012 "rw_mbytes_per_sec": 0, 00:15:48.012 "r_mbytes_per_sec": 0, 00:15:48.012 "w_mbytes_per_sec": 0 00:15:48.012 }, 00:15:48.012 "claimed": true, 00:15:48.012 "claim_type": "exclusive_write", 00:15:48.012 "zoned": false, 00:15:48.012 "supported_io_types": { 00:15:48.012 "read": true, 00:15:48.012 "write": true, 00:15:48.012 "unmap": true, 00:15:48.012 "flush": true, 00:15:48.012 "reset": true, 00:15:48.012 "nvme_admin": false, 00:15:48.012 "nvme_io": false, 00:15:48.012 "nvme_io_md": false, 00:15:48.012 "write_zeroes": true, 00:15:48.012 "zcopy": true, 00:15:48.012 "get_zone_info": false, 00:15:48.012 "zone_management": false, 00:15:48.012 "zone_append": false, 00:15:48.012 "compare": false, 00:15:48.012 "compare_and_write": false, 00:15:48.012 "abort": true, 00:15:48.012 "seek_hole": false, 00:15:48.012 "seek_data": false, 00:15:48.012 "copy": true, 00:15:48.012 "nvme_iov_md": false 00:15:48.012 }, 00:15:48.012 "memory_domains": [ 00:15:48.012 { 00:15:48.012 "dma_device_id": "system", 00:15:48.012 "dma_device_type": 1 00:15:48.012 }, 00:15:48.012 { 00:15:48.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:48.012 "dma_device_type": 2 00:15:48.012 } 00:15:48.012 ], 00:15:48.012 "driver_specific": { 00:15:48.012 "passthru": { 00:15:48.012 "name": "pt3", 00:15:48.012 "base_bdev_name": "malloc3" 00:15:48.012 } 00:15:48.012 } 00:15:48.012 }' 00:15:48.012 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:48.270 00:10:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:48.270 00:10:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:48.270 00:10:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:48.270 00:10:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:48.270 00:10:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:48.270 00:10:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:48.270 00:10:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:48.270 00:10:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:48.270 00:10:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:48.528 00:10:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:48.528 00:10:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:48.528 00:10:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:48.528 00:10:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:15:48.528 [2024-07-16 00:10:35.440314] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:48.528 00:10:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 40bb11c6-9f84-444a-a0f3-6d6f20a31c1a '!=' 40bb11c6-9f84-444a-a0f3-6d6f20a31c1a ']' 00:15:48.528 00:10:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:15:48.528 00:10:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:48.528 00:10:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:48.528 00:10:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 3531188 00:15:48.528 00:10:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 3531188 ']' 00:15:48.528 00:10:35 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 3531188 00:15:48.528 00:10:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:15:48.528 00:10:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:48.528 00:10:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3531188 00:15:48.787 00:10:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:48.787 00:10:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:48.787 00:10:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3531188' 00:15:48.787 killing process with pid 3531188 00:15:48.787 00:10:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 3531188 00:15:48.787 [2024-07-16 00:10:35.513261] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:48.787 [2024-07-16 00:10:35.513322] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:48.787 [2024-07-16 00:10:35.513379] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:48.787 [2024-07-16 00:10:35.513392] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc4ac00 name raid_bdev1, state offline 00:15:48.787 00:10:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 3531188 00:15:48.787 [2024-07-16 00:10:35.540890] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:49.046 00:10:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:15:49.046 00:15:49.046 real 0m13.576s 00:15:49.046 user 0m24.290s 00:15:49.046 sys 0m2.596s 00:15:49.046 00:10:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:49.046 00:10:35 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.046 ************************************ 00:15:49.046 END TEST raid_superblock_test 00:15:49.046 ************************************ 00:15:49.046 00:10:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:49.046 00:10:35 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:15:49.046 00:10:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:49.046 00:10:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:49.046 00:10:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:49.046 ************************************ 00:15:49.046 START TEST raid_read_error_test 00:15:49.046 ************************************ 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:49.046 00:10:35 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.joDw2LwJF0 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3533234 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3533234 /var/tmp/spdk-raid.sock 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 3533234 ']' 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:49.046 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:49.046 00:10:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.046 [2024-07-16 00:10:35.932728] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
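The `verify_raid_bdev_state` helper exercised throughout this trace (bdev_raid.sh@116-128) fetches the raid bdev via `rpc.py bdev_raid_get_bdevs all`, filters it with jq, and compares the expected fields. A minimal offline sketch of that check, using only the fields visible in the `raid_bdev_info` dump captured earlier in the trace — the Python helper name and the trimmed JSON are illustrative, not part of the test suite:

```python
import json

# Subset of the bdev_raid_get_bdevs output captured in the trace above,
# trimmed to the fields verify_raid_bdev_state actually compares.
raid_bdev_info = json.loads("""
{
  "name": "raid_bdev1",
  "strip_size_kb": 64,
  "state": "online",
  "raid_level": "concat",
  "superblock": true,
  "num_base_bdevs": 3,
  "num_base_bdevs_discovered": 3,
  "num_base_bdevs_operational": 3
}
""")

def verify_raid_bdev_state(info, expected_state, raid_level, strip_size, operational):
    # Hypothetical Python rendering of the bash helper: each assert mirrors
    # one of the comparisons the script performs on the jq-filtered JSON.
    assert info["state"] == expected_state
    assert info["raid_level"] == raid_level
    assert info["strip_size_kb"] == strip_size
    assert info["num_base_bdevs_operational"] == operational

# The trace calls the helper as: verify_raid_bdev_state raid_bdev1 online concat 64 3
verify_raid_bdev_state(raid_bdev_info, "online", "concat", 64, 3)
print("raid_bdev1 state verified")
```

Any field mismatch would raise an `AssertionError`, which is what makes the shell helper fail the test when the RPC output diverges from the expected state.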
00:15:49.046 [2024-07-16 00:10:35.932797] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3533234 ] 00:15:49.305 [2024-07-16 00:10:36.060773] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:49.305 [2024-07-16 00:10:36.164820] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:49.305 [2024-07-16 00:10:36.224892] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:49.305 [2024-07-16 00:10:36.224937] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:50.240 00:10:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:50.240 00:10:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:50.240 00:10:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:50.240 00:10:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:50.240 BaseBdev1_malloc 00:15:50.240 00:10:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:50.499 true 00:15:50.499 00:10:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:50.757 [2024-07-16 00:10:37.582535] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:50.757 [2024-07-16 00:10:37.582582] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:15:50.757 [2024-07-16 00:10:37.582603] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf360d0 00:15:50.758 [2024-07-16 00:10:37.582617] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:50.758 [2024-07-16 00:10:37.584513] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:50.758 [2024-07-16 00:10:37.584545] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:50.758 BaseBdev1 00:15:50.758 00:10:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:50.758 00:10:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:51.015 BaseBdev2_malloc 00:15:51.015 00:10:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:51.273 true 00:15:51.273 00:10:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:51.531 [2024-07-16 00:10:38.317079] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:51.531 [2024-07-16 00:10:38.317126] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:51.532 [2024-07-16 00:10:38.317147] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf3a910 00:15:51.532 [2024-07-16 00:10:38.317159] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:51.532 [2024-07-16 00:10:38.318771] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:51.532 [2024-07-16 00:10:38.318802] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:51.532 BaseBdev2 00:15:51.532 00:10:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:51.532 00:10:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:51.789 BaseBdev3_malloc 00:15:51.789 00:10:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:52.047 true 00:15:52.047 00:10:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:52.306 [2024-07-16 00:10:39.052842] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:52.306 [2024-07-16 00:10:39.052889] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:52.306 [2024-07-16 00:10:39.052910] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf3cbd0 00:15:52.306 [2024-07-16 00:10:39.052934] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:52.306 [2024-07-16 00:10:39.054542] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:52.306 [2024-07-16 00:10:39.054573] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:52.306 BaseBdev3 00:15:52.306 00:10:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:52.565 [2024-07-16 00:10:39.297516] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:52.565 [2024-07-16 00:10:39.298859] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:52.565 [2024-07-16 00:10:39.298938] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:52.565 [2024-07-16 00:10:39.299153] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf3e280 00:15:52.565 [2024-07-16 00:10:39.299164] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:52.565 [2024-07-16 00:10:39.299360] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf3de20 00:15:52.565 [2024-07-16 00:10:39.299505] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf3e280 00:15:52.565 [2024-07-16 00:10:39.299515] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf3e280 00:15:52.565 [2024-07-16 00:10:39.299617] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:52.565 00:10:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:52.565 00:10:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:52.565 00:10:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:52.565 00:10:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:52.565 00:10:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:52.565 00:10:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:52.565 00:10:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:52.565 00:10:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:52.565 
00:10:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:52.565 00:10:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:52.565 00:10:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:52.565 00:10:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:52.823 00:10:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:52.824 "name": "raid_bdev1", 00:15:52.824 "uuid": "1b1d7771-388a-4003-8c07-3e81f14ac78d", 00:15:52.824 "strip_size_kb": 64, 00:15:52.824 "state": "online", 00:15:52.824 "raid_level": "concat", 00:15:52.824 "superblock": true, 00:15:52.824 "num_base_bdevs": 3, 00:15:52.824 "num_base_bdevs_discovered": 3, 00:15:52.824 "num_base_bdevs_operational": 3, 00:15:52.824 "base_bdevs_list": [ 00:15:52.824 { 00:15:52.824 "name": "BaseBdev1", 00:15:52.824 "uuid": "9a01ab50-f2d7-59da-9065-197586d6143a", 00:15:52.824 "is_configured": true, 00:15:52.824 "data_offset": 2048, 00:15:52.824 "data_size": 63488 00:15:52.824 }, 00:15:52.824 { 00:15:52.824 "name": "BaseBdev2", 00:15:52.824 "uuid": "7738076c-a247-5cac-95ad-ccf42a79417a", 00:15:52.824 "is_configured": true, 00:15:52.824 "data_offset": 2048, 00:15:52.824 "data_size": 63488 00:15:52.824 }, 00:15:52.824 { 00:15:52.824 "name": "BaseBdev3", 00:15:52.824 "uuid": "cf5268c3-3206-5530-a6c6-2720745a8312", 00:15:52.824 "is_configured": true, 00:15:52.824 "data_offset": 2048, 00:15:52.824 "data_size": 63488 00:15:52.824 } 00:15:52.824 ] 00:15:52.824 }' 00:15:52.824 00:10:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:52.824 00:10:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:53.392 00:10:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:53.392 00:10:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:53.392 [2024-07-16 00:10:40.220260] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd8c4d0 00:15:54.329 00:10:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:54.590 00:10:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:54.590 00:10:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:15:54.590 00:10:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:54.590 00:10:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:54.590 00:10:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:54.590 00:10:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:54.590 00:10:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:54.590 00:10:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:54.590 00:10:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:54.590 00:10:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:54.590 00:10:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:54.590 00:10:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:54.590 00:10:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:15:54.590 00:10:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:54.590 00:10:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:54.849 00:10:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:54.849 "name": "raid_bdev1", 00:15:54.849 "uuid": "1b1d7771-388a-4003-8c07-3e81f14ac78d", 00:15:54.849 "strip_size_kb": 64, 00:15:54.849 "state": "online", 00:15:54.849 "raid_level": "concat", 00:15:54.849 "superblock": true, 00:15:54.849 "num_base_bdevs": 3, 00:15:54.849 "num_base_bdevs_discovered": 3, 00:15:54.849 "num_base_bdevs_operational": 3, 00:15:54.849 "base_bdevs_list": [ 00:15:54.849 { 00:15:54.849 "name": "BaseBdev1", 00:15:54.849 "uuid": "9a01ab50-f2d7-59da-9065-197586d6143a", 00:15:54.849 "is_configured": true, 00:15:54.849 "data_offset": 2048, 00:15:54.849 "data_size": 63488 00:15:54.849 }, 00:15:54.849 { 00:15:54.849 "name": "BaseBdev2", 00:15:54.849 "uuid": "7738076c-a247-5cac-95ad-ccf42a79417a", 00:15:54.849 "is_configured": true, 00:15:54.849 "data_offset": 2048, 00:15:54.849 "data_size": 63488 00:15:54.849 }, 00:15:54.849 { 00:15:54.849 "name": "BaseBdev3", 00:15:54.849 "uuid": "cf5268c3-3206-5530-a6c6-2720745a8312", 00:15:54.849 "is_configured": true, 00:15:54.849 "data_offset": 2048, 00:15:54.849 "data_size": 63488 00:15:54.849 } 00:15:54.849 ] 00:15:54.849 }' 00:15:54.849 00:10:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:54.849 00:10:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:55.416 00:10:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:55.674 [2024-07-16 00:10:42.493882] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:55.674 [2024-07-16 00:10:42.493937] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:55.675 [2024-07-16 00:10:42.497099] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:55.675 [2024-07-16 00:10:42.497134] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:55.675 [2024-07-16 00:10:42.497168] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:55.675 [2024-07-16 00:10:42.497179] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf3e280 name raid_bdev1, state offline 00:15:55.675 0 00:15:55.675 00:10:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3533234 00:15:55.675 00:10:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 3533234 ']' 00:15:55.675 00:10:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 3533234 00:15:55.675 00:10:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:15:55.675 00:10:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:55.675 00:10:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3533234 00:15:55.675 00:10:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:55.675 00:10:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:55.675 00:10:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3533234' 00:15:55.675 killing process with pid 3533234 00:15:55.675 00:10:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 3533234 00:15:55.675 [2024-07-16 00:10:42.579880] bdev_raid.c:1358:raid_bdev_fini_start: 
*DEBUG*: raid_bdev_fini_start 00:15:55.675 00:10:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 3533234 00:15:55.675 [2024-07-16 00:10:42.601300] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:55.934 00:10:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.joDw2LwJF0 00:15:55.934 00:10:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:55.934 00:10:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:55.934 00:10:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.44 00:15:55.934 00:10:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:15:55.934 00:10:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:55.934 00:10:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:55.934 00:10:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.44 != \0\.\0\0 ]] 00:15:55.934 00:15:55.934 real 0m6.985s 00:15:55.934 user 0m11.096s 00:15:55.934 sys 0m1.186s 00:15:55.934 00:10:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:55.934 00:10:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:55.934 ************************************ 00:15:55.934 END TEST raid_read_error_test 00:15:55.934 ************************************ 00:15:55.934 00:10:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:55.934 00:10:42 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:15:55.934 00:10:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:55.934 00:10:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:55.934 00:10:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:56.192 ************************************ 
00:15:56.192 START TEST raid_write_error_test 00:15:56.192 ************************************ 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local 
raid_bdev_name=raid_bdev1 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ypPPCOevPg 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3534217 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3534217 /var/tmp/spdk-raid.sock 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 3534217 ']' 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:15:56.193 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:56.193 00:10:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:56.193 [2024-07-16 00:10:43.004539] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:15:56.193 [2024-07-16 00:10:43.004622] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3534217 ] 00:15:56.453 [2024-07-16 00:10:43.150528] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:56.453 [2024-07-16 00:10:43.254468] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:56.453 [2024-07-16 00:10:43.317791] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:56.453 [2024-07-16 00:10:43.317827] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:57.020 00:10:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:57.020 00:10:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:57.020 00:10:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:57.020 00:10:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:57.279 BaseBdev1_malloc 00:15:57.279 00:10:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:57.538 true 00:15:57.538 00:10:44 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:57.797 [2024-07-16 00:10:44.626984] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:57.797 [2024-07-16 00:10:44.627031] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:57.797 [2024-07-16 00:10:44.627053] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14bb0d0 00:15:57.797 [2024-07-16 00:10:44.627066] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:57.797 [2024-07-16 00:10:44.628963] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:57.797 [2024-07-16 00:10:44.628996] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:57.797 BaseBdev1 00:15:57.797 00:10:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:57.797 00:10:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:58.056 BaseBdev2_malloc 00:15:58.056 00:10:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:58.314 true 00:15:58.314 00:10:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:58.573 [2024-07-16 00:10:45.365500] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:58.573 [2024-07-16 00:10:45.365544] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:58.573 [2024-07-16 00:10:45.365567] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14bf910 00:15:58.573 [2024-07-16 00:10:45.365580] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:58.573 [2024-07-16 00:10:45.367164] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:58.573 [2024-07-16 00:10:45.367194] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:58.573 BaseBdev2 00:15:58.573 00:10:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:58.573 00:10:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:58.832 BaseBdev3_malloc 00:15:58.832 00:10:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:59.091 true 00:15:59.091 00:10:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:59.350 [2024-07-16 00:10:46.149094] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:59.350 [2024-07-16 00:10:46.149140] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:59.350 [2024-07-16 00:10:46.149164] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14c1bd0 00:15:59.350 [2024-07-16 00:10:46.149177] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:59.350 [2024-07-16 00:10:46.150765] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:15:59.350 [2024-07-16 00:10:46.150795] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:59.350 BaseBdev3 00:15:59.350 00:10:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:59.609 [2024-07-16 00:10:46.445905] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:59.609 [2024-07-16 00:10:46.447229] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:59.609 [2024-07-16 00:10:46.447298] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:59.609 [2024-07-16 00:10:46.447500] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14c3280 00:15:59.609 [2024-07-16 00:10:46.447512] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:59.609 [2024-07-16 00:10:46.447707] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14c2e20 00:15:59.609 [2024-07-16 00:10:46.447852] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14c3280 00:15:59.609 [2024-07-16 00:10:46.447862] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14c3280 00:15:59.609 [2024-07-16 00:10:46.448020] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:59.609 00:10:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:59.609 00:10:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:59.609 00:10:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:59.609 00:10:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 
-- # local raid_level=concat 00:15:59.609 00:10:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:59.609 00:10:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:59.609 00:10:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:59.609 00:10:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:59.609 00:10:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:59.609 00:10:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:59.609 00:10:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:59.609 00:10:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:59.868 00:10:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:59.868 "name": "raid_bdev1", 00:15:59.868 "uuid": "5a7b0aa3-1273-4dc0-83a9-8869120d237c", 00:15:59.868 "strip_size_kb": 64, 00:15:59.868 "state": "online", 00:15:59.868 "raid_level": "concat", 00:15:59.868 "superblock": true, 00:15:59.868 "num_base_bdevs": 3, 00:15:59.868 "num_base_bdevs_discovered": 3, 00:15:59.868 "num_base_bdevs_operational": 3, 00:15:59.868 "base_bdevs_list": [ 00:15:59.868 { 00:15:59.868 "name": "BaseBdev1", 00:15:59.868 "uuid": "53486696-34ca-5c64-ab6e-0dd2a70ef922", 00:15:59.868 "is_configured": true, 00:15:59.868 "data_offset": 2048, 00:15:59.868 "data_size": 63488 00:15:59.868 }, 00:15:59.868 { 00:15:59.868 "name": "BaseBdev2", 00:15:59.868 "uuid": "7e0ed3b1-d19e-5c1e-bda4-0a254d65eebe", 00:15:59.868 "is_configured": true, 00:15:59.868 "data_offset": 2048, 00:15:59.868 "data_size": 63488 00:15:59.868 }, 00:15:59.868 { 00:15:59.868 "name": "BaseBdev3", 
00:15:59.868 "uuid": "50ab94fc-6b24-52a5-9496-d60174748352", 00:15:59.868 "is_configured": true, 00:15:59.868 "data_offset": 2048, 00:15:59.868 "data_size": 63488 00:15:59.868 } 00:15:59.868 ] 00:15:59.868 }' 00:15:59.868 00:10:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:59.868 00:10:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:00.804 00:10:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:00.804 00:10:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:01.063 [2024-07-16 00:10:47.757637] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13114d0 00:16:02.000 00:10:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:16:02.000 00:10:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:02.000 00:10:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:02.000 00:10:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:16:02.000 00:10:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:02.000 00:10:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:02.000 00:10:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:02.000 00:10:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:02.000 00:10:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:02.000 00:10:48 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:02.000 00:10:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:02.000 00:10:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:02.000 00:10:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:02.000 00:10:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:02.000 00:10:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.000 00:10:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:02.259 00:10:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:02.259 "name": "raid_bdev1", 00:16:02.259 "uuid": "5a7b0aa3-1273-4dc0-83a9-8869120d237c", 00:16:02.259 "strip_size_kb": 64, 00:16:02.259 "state": "online", 00:16:02.259 "raid_level": "concat", 00:16:02.259 "superblock": true, 00:16:02.259 "num_base_bdevs": 3, 00:16:02.259 "num_base_bdevs_discovered": 3, 00:16:02.259 "num_base_bdevs_operational": 3, 00:16:02.259 "base_bdevs_list": [ 00:16:02.259 { 00:16:02.259 "name": "BaseBdev1", 00:16:02.259 "uuid": "53486696-34ca-5c64-ab6e-0dd2a70ef922", 00:16:02.259 "is_configured": true, 00:16:02.259 "data_offset": 2048, 00:16:02.259 "data_size": 63488 00:16:02.259 }, 00:16:02.259 { 00:16:02.259 "name": "BaseBdev2", 00:16:02.259 "uuid": "7e0ed3b1-d19e-5c1e-bda4-0a254d65eebe", 00:16:02.259 "is_configured": true, 00:16:02.259 "data_offset": 2048, 00:16:02.259 "data_size": 63488 00:16:02.259 }, 00:16:02.259 { 00:16:02.259 "name": "BaseBdev3", 00:16:02.259 "uuid": "50ab94fc-6b24-52a5-9496-d60174748352", 00:16:02.259 "is_configured": true, 00:16:02.259 "data_offset": 2048, 00:16:02.259 "data_size": 
63488 00:16:02.259 } 00:16:02.259 ] 00:16:02.259 }' 00:16:02.259 00:10:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:02.259 00:10:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:02.890 00:10:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:03.164 [2024-07-16 00:10:49.933818] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:03.164 [2024-07-16 00:10:49.933863] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:03.164 [2024-07-16 00:10:49.937033] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:03.164 [2024-07-16 00:10:49.937070] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:03.164 [2024-07-16 00:10:49.937104] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:03.164 [2024-07-16 00:10:49.937115] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14c3280 name raid_bdev1, state offline 00:16:03.164 0 00:16:03.164 00:10:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3534217 00:16:03.164 00:10:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 3534217 ']' 00:16:03.164 00:10:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 3534217 00:16:03.164 00:10:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:16:03.164 00:10:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:03.164 00:10:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3534217 00:16:03.164 00:10:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:16:03.164 00:10:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:03.164 00:10:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3534217' 00:16:03.164 killing process with pid 3534217 00:16:03.164 00:10:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 3534217 00:16:03.164 [2024-07-16 00:10:50.018721] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:03.164 00:10:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 3534217 00:16:03.164 [2024-07-16 00:10:50.039365] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:03.424 00:10:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ypPPCOevPg 00:16:03.424 00:10:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:03.424 00:10:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:03.424 00:10:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:16:03.424 00:10:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:03.424 00:10:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:03.424 00:10:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:03.424 00:10:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:16:03.424 00:16:03.424 real 0m7.346s 00:16:03.424 user 0m11.766s 00:16:03.424 sys 0m1.280s 00:16:03.424 00:10:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:03.424 00:10:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:03.424 ************************************ 00:16:03.424 END TEST raid_write_error_test 00:16:03.424 
************************************ 00:16:03.424 00:10:50 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:03.424 00:10:50 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:16:03.424 00:10:50 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:16:03.424 00:10:50 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:03.424 00:10:50 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:03.424 00:10:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:03.424 ************************************ 00:16:03.424 START TEST raid_state_function_test 00:16:03.424 ************************************ 00:16:03.424 00:10:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:16:03.424 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:03.424 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:03.424 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:03.424 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:03.424 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=3535359 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3535359' 00:16:03.425 Process raid pid: 3535359 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 3535359 /var/tmp/spdk-raid.sock 00:16:03.425 
00:10:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 3535359 ']' 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:03.425 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:03.425 00:10:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:03.684 [2024-07-16 00:10:50.421713] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:16:03.684 [2024-07-16 00:10:50.421781] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:03.684 [2024-07-16 00:10:50.553028] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:03.943 [2024-07-16 00:10:50.655860] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:03.943 [2024-07-16 00:10:50.715216] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:03.943 [2024-07-16 00:10:50.715253] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:04.882 00:10:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:04.882 00:10:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:16:04.882 00:10:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:05.455 [2024-07-16 00:10:52.105540] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:05.455 [2024-07-16 00:10:52.105582] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:05.455 [2024-07-16 00:10:52.105593] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:05.455 [2024-07-16 00:10:52.105610] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:05.455 [2024-07-16 00:10:52.105619] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:05.455 [2024-07-16 00:10:52.105632] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:05.455 00:10:52 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:05.455 00:10:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:05.455 00:10:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:05.455 00:10:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:05.455 00:10:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:05.455 00:10:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:05.455 00:10:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:05.455 00:10:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:05.455 00:10:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:05.455 00:10:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:05.455 00:10:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.455 00:10:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:05.455 00:10:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:05.455 "name": "Existed_Raid", 00:16:05.455 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:05.455 "strip_size_kb": 0, 00:16:05.455 "state": "configuring", 00:16:05.455 "raid_level": "raid1", 00:16:05.455 "superblock": false, 00:16:05.455 "num_base_bdevs": 3, 00:16:05.455 "num_base_bdevs_discovered": 0, 00:16:05.455 "num_base_bdevs_operational": 3, 00:16:05.455 "base_bdevs_list": [ 00:16:05.455 { 00:16:05.455 
"name": "BaseBdev1", 00:16:05.455 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:05.455 "is_configured": false, 00:16:05.455 "data_offset": 0, 00:16:05.455 "data_size": 0 00:16:05.455 }, 00:16:05.455 { 00:16:05.455 "name": "BaseBdev2", 00:16:05.455 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:05.455 "is_configured": false, 00:16:05.455 "data_offset": 0, 00:16:05.455 "data_size": 0 00:16:05.455 }, 00:16:05.455 { 00:16:05.455 "name": "BaseBdev3", 00:16:05.455 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:05.455 "is_configured": false, 00:16:05.455 "data_offset": 0, 00:16:05.455 "data_size": 0 00:16:05.455 } 00:16:05.455 ] 00:16:05.455 }' 00:16:05.455 00:10:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:05.455 00:10:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:06.387 00:10:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:06.645 [2024-07-16 00:10:53.489070] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:06.645 [2024-07-16 00:10:53.489100] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24c8a80 name Existed_Raid, state configuring 00:16:06.645 00:10:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:06.904 [2024-07-16 00:10:53.737726] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:06.904 [2024-07-16 00:10:53.737754] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:06.904 [2024-07-16 00:10:53.737764] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 
00:16:06.904 [2024-07-16 00:10:53.737775] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:06.904 [2024-07-16 00:10:53.737784] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:06.904 [2024-07-16 00:10:53.737802] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:06.904 00:10:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:07.163 [2024-07-16 00:10:53.988269] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:07.163 BaseBdev1 00:16:07.163 00:10:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:07.163 00:10:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:07.163 00:10:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:07.163 00:10:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:07.163 00:10:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:07.163 00:10:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:07.163 00:10:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:07.730 00:10:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:07.987 [ 00:16:07.987 { 00:16:07.987 "name": "BaseBdev1", 00:16:07.987 "aliases": [ 00:16:07.987 "7ba5a6fd-0b0e-466f-9a7e-7a50856a68f4" 
00:16:07.988 ], 00:16:07.988 "product_name": "Malloc disk", 00:16:07.988 "block_size": 512, 00:16:07.988 "num_blocks": 65536, 00:16:07.988 "uuid": "7ba5a6fd-0b0e-466f-9a7e-7a50856a68f4", 00:16:07.988 "assigned_rate_limits": { 00:16:07.988 "rw_ios_per_sec": 0, 00:16:07.988 "rw_mbytes_per_sec": 0, 00:16:07.988 "r_mbytes_per_sec": 0, 00:16:07.988 "w_mbytes_per_sec": 0 00:16:07.988 }, 00:16:07.988 "claimed": true, 00:16:07.988 "claim_type": "exclusive_write", 00:16:07.988 "zoned": false, 00:16:07.988 "supported_io_types": { 00:16:07.988 "read": true, 00:16:07.988 "write": true, 00:16:07.988 "unmap": true, 00:16:07.988 "flush": true, 00:16:07.988 "reset": true, 00:16:07.988 "nvme_admin": false, 00:16:07.988 "nvme_io": false, 00:16:07.988 "nvme_io_md": false, 00:16:07.988 "write_zeroes": true, 00:16:07.988 "zcopy": true, 00:16:07.988 "get_zone_info": false, 00:16:07.988 "zone_management": false, 00:16:07.988 "zone_append": false, 00:16:07.988 "compare": false, 00:16:07.988 "compare_and_write": false, 00:16:07.988 "abort": true, 00:16:07.988 "seek_hole": false, 00:16:07.988 "seek_data": false, 00:16:07.988 "copy": true, 00:16:07.988 "nvme_iov_md": false 00:16:07.988 }, 00:16:07.988 "memory_domains": [ 00:16:07.988 { 00:16:07.988 "dma_device_id": "system", 00:16:07.988 "dma_device_type": 1 00:16:07.988 }, 00:16:07.988 { 00:16:07.988 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:07.988 "dma_device_type": 2 00:16:07.988 } 00:16:07.988 ], 00:16:07.988 "driver_specific": {} 00:16:07.988 } 00:16:07.988 ] 00:16:07.988 00:10:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:07.988 00:10:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:07.988 00:10:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:07.988 00:10:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:16:07.988 00:10:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:07.988 00:10:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:07.988 00:10:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:07.988 00:10:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:07.988 00:10:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:07.988 00:10:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:07.988 00:10:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:07.988 00:10:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:07.988 00:10:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:08.246 00:10:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:08.246 "name": "Existed_Raid", 00:16:08.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:08.246 "strip_size_kb": 0, 00:16:08.246 "state": "configuring", 00:16:08.246 "raid_level": "raid1", 00:16:08.246 "superblock": false, 00:16:08.246 "num_base_bdevs": 3, 00:16:08.246 "num_base_bdevs_discovered": 1, 00:16:08.246 "num_base_bdevs_operational": 3, 00:16:08.246 "base_bdevs_list": [ 00:16:08.246 { 00:16:08.246 "name": "BaseBdev1", 00:16:08.246 "uuid": "7ba5a6fd-0b0e-466f-9a7e-7a50856a68f4", 00:16:08.246 "is_configured": true, 00:16:08.246 "data_offset": 0, 00:16:08.246 "data_size": 65536 00:16:08.246 }, 00:16:08.246 { 00:16:08.246 "name": "BaseBdev2", 00:16:08.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:08.246 "is_configured": 
false, 00:16:08.246 "data_offset": 0, 00:16:08.246 "data_size": 0 00:16:08.246 }, 00:16:08.246 { 00:16:08.246 "name": "BaseBdev3", 00:16:08.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:08.246 "is_configured": false, 00:16:08.246 "data_offset": 0, 00:16:08.246 "data_size": 0 00:16:08.246 } 00:16:08.246 ] 00:16:08.246 }' 00:16:08.246 00:10:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:08.246 00:10:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:08.811 00:10:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:09.069 [2024-07-16 00:10:55.845157] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:09.069 [2024-07-16 00:10:55.845193] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24c8310 name Existed_Raid, state configuring 00:16:09.069 00:10:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:09.327 [2024-07-16 00:10:56.093836] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:09.327 [2024-07-16 00:10:56.095275] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:09.327 [2024-07-16 00:10:56.095308] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:09.327 [2024-07-16 00:10:56.095318] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:09.327 [2024-07-16 00:10:56.095330] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:09.327 00:10:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 
-- # (( i = 1 )) 00:16:09.327 00:10:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:09.328 00:10:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:09.328 00:10:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:09.328 00:10:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:09.328 00:10:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:09.328 00:10:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:09.328 00:10:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:09.328 00:10:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:09.328 00:10:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:09.328 00:10:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:09.328 00:10:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:09.328 00:10:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.328 00:10:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:09.591 00:10:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:09.591 "name": "Existed_Raid", 00:16:09.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:09.591 "strip_size_kb": 0, 00:16:09.591 "state": "configuring", 00:16:09.591 "raid_level": "raid1", 00:16:09.591 "superblock": false, 00:16:09.591 "num_base_bdevs": 3, 
00:16:09.591 "num_base_bdevs_discovered": 1, 00:16:09.591 "num_base_bdevs_operational": 3, 00:16:09.591 "base_bdevs_list": [ 00:16:09.591 { 00:16:09.591 "name": "BaseBdev1", 00:16:09.591 "uuid": "7ba5a6fd-0b0e-466f-9a7e-7a50856a68f4", 00:16:09.591 "is_configured": true, 00:16:09.591 "data_offset": 0, 00:16:09.591 "data_size": 65536 00:16:09.591 }, 00:16:09.591 { 00:16:09.591 "name": "BaseBdev2", 00:16:09.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:09.591 "is_configured": false, 00:16:09.591 "data_offset": 0, 00:16:09.591 "data_size": 0 00:16:09.591 }, 00:16:09.591 { 00:16:09.591 "name": "BaseBdev3", 00:16:09.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:09.591 "is_configured": false, 00:16:09.591 "data_offset": 0, 00:16:09.591 "data_size": 0 00:16:09.591 } 00:16:09.591 ] 00:16:09.591 }' 00:16:09.591 00:10:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:09.591 00:10:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:10.158 00:10:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:10.442 [2024-07-16 00:10:57.225413] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:10.442 BaseBdev2 00:16:10.442 00:10:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:10.442 00:10:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:10.442 00:10:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:10.442 00:10:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:10.442 00:10:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:10.442 00:10:57 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:10.442 00:10:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:10.701 00:10:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:10.960 [ 00:16:10.960 { 00:16:10.960 "name": "BaseBdev2", 00:16:10.960 "aliases": [ 00:16:10.960 "04df4eff-83ca-4bd4-b3ca-cf4d0943858f" 00:16:10.960 ], 00:16:10.960 "product_name": "Malloc disk", 00:16:10.960 "block_size": 512, 00:16:10.960 "num_blocks": 65536, 00:16:10.960 "uuid": "04df4eff-83ca-4bd4-b3ca-cf4d0943858f", 00:16:10.960 "assigned_rate_limits": { 00:16:10.960 "rw_ios_per_sec": 0, 00:16:10.960 "rw_mbytes_per_sec": 0, 00:16:10.960 "r_mbytes_per_sec": 0, 00:16:10.960 "w_mbytes_per_sec": 0 00:16:10.960 }, 00:16:10.960 "claimed": true, 00:16:10.960 "claim_type": "exclusive_write", 00:16:10.960 "zoned": false, 00:16:10.960 "supported_io_types": { 00:16:10.960 "read": true, 00:16:10.960 "write": true, 00:16:10.960 "unmap": true, 00:16:10.960 "flush": true, 00:16:10.960 "reset": true, 00:16:10.960 "nvme_admin": false, 00:16:10.960 "nvme_io": false, 00:16:10.960 "nvme_io_md": false, 00:16:10.960 "write_zeroes": true, 00:16:10.960 "zcopy": true, 00:16:10.960 "get_zone_info": false, 00:16:10.960 "zone_management": false, 00:16:10.960 "zone_append": false, 00:16:10.960 "compare": false, 00:16:10.960 "compare_and_write": false, 00:16:10.960 "abort": true, 00:16:10.960 "seek_hole": false, 00:16:10.960 "seek_data": false, 00:16:10.960 "copy": true, 00:16:10.960 "nvme_iov_md": false 00:16:10.960 }, 00:16:10.960 "memory_domains": [ 00:16:10.960 { 00:16:10.960 "dma_device_id": "system", 00:16:10.960 "dma_device_type": 1 00:16:10.960 }, 00:16:10.960 { 
00:16:10.960 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.960 "dma_device_type": 2 00:16:10.960 } 00:16:10.960 ], 00:16:10.960 "driver_specific": {} 00:16:10.960 } 00:16:10.960 ] 00:16:10.960 00:10:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:10.960 00:10:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:10.960 00:10:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:10.960 00:10:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:10.960 00:10:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:10.960 00:10:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:10.960 00:10:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:10.960 00:10:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:10.960 00:10:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:10.960 00:10:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:10.960 00:10:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:10.960 00:10:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:10.960 00:10:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:10.960 00:10:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.960 00:10:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:16:11.219 00:10:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:11.219 "name": "Existed_Raid", 00:16:11.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:11.219 "strip_size_kb": 0, 00:16:11.219 "state": "configuring", 00:16:11.219 "raid_level": "raid1", 00:16:11.219 "superblock": false, 00:16:11.219 "num_base_bdevs": 3, 00:16:11.219 "num_base_bdevs_discovered": 2, 00:16:11.219 "num_base_bdevs_operational": 3, 00:16:11.219 "base_bdevs_list": [ 00:16:11.219 { 00:16:11.219 "name": "BaseBdev1", 00:16:11.219 "uuid": "7ba5a6fd-0b0e-466f-9a7e-7a50856a68f4", 00:16:11.219 "is_configured": true, 00:16:11.219 "data_offset": 0, 00:16:11.219 "data_size": 65536 00:16:11.219 }, 00:16:11.219 { 00:16:11.219 "name": "BaseBdev2", 00:16:11.219 "uuid": "04df4eff-83ca-4bd4-b3ca-cf4d0943858f", 00:16:11.219 "is_configured": true, 00:16:11.219 "data_offset": 0, 00:16:11.219 "data_size": 65536 00:16:11.219 }, 00:16:11.219 { 00:16:11.219 "name": "BaseBdev3", 00:16:11.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:11.219 "is_configured": false, 00:16:11.219 "data_offset": 0, 00:16:11.219 "data_size": 0 00:16:11.219 } 00:16:11.219 ] 00:16:11.219 }' 00:16:11.219 00:10:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:11.219 00:10:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:11.789 00:10:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:11.789 [2024-07-16 00:10:58.724740] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:11.789 [2024-07-16 00:10:58.724778] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24c9400 00:16:11.789 [2024-07-16 00:10:58.724786] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 
00:16:11.789 [2024-07-16 00:10:58.725041] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24c8ef0 00:16:11.789 [2024-07-16 00:10:58.725165] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24c9400 00:16:11.789 [2024-07-16 00:10:58.725175] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x24c9400 00:16:11.789 [2024-07-16 00:10:58.725333] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:11.789 BaseBdev3 00:16:12.048 00:10:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:12.048 00:10:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:12.048 00:10:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:12.048 00:10:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:12.048 00:10:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:12.048 00:10:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:12.048 00:10:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:12.048 00:10:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:12.307 [ 00:16:12.307 { 00:16:12.307 "name": "BaseBdev3", 00:16:12.307 "aliases": [ 00:16:12.307 "9366640b-074b-43c1-9bb7-2080706521fd" 00:16:12.307 ], 00:16:12.307 "product_name": "Malloc disk", 00:16:12.307 "block_size": 512, 00:16:12.307 "num_blocks": 65536, 00:16:12.307 "uuid": "9366640b-074b-43c1-9bb7-2080706521fd", 00:16:12.307 "assigned_rate_limits": { 
00:16:12.307 "rw_ios_per_sec": 0, 00:16:12.307 "rw_mbytes_per_sec": 0, 00:16:12.307 "r_mbytes_per_sec": 0, 00:16:12.307 "w_mbytes_per_sec": 0 00:16:12.307 }, 00:16:12.307 "claimed": true, 00:16:12.307 "claim_type": "exclusive_write", 00:16:12.307 "zoned": false, 00:16:12.307 "supported_io_types": { 00:16:12.307 "read": true, 00:16:12.307 "write": true, 00:16:12.307 "unmap": true, 00:16:12.307 "flush": true, 00:16:12.307 "reset": true, 00:16:12.307 "nvme_admin": false, 00:16:12.307 "nvme_io": false, 00:16:12.307 "nvme_io_md": false, 00:16:12.307 "write_zeroes": true, 00:16:12.307 "zcopy": true, 00:16:12.307 "get_zone_info": false, 00:16:12.307 "zone_management": false, 00:16:12.307 "zone_append": false, 00:16:12.307 "compare": false, 00:16:12.307 "compare_and_write": false, 00:16:12.307 "abort": true, 00:16:12.307 "seek_hole": false, 00:16:12.307 "seek_data": false, 00:16:12.307 "copy": true, 00:16:12.307 "nvme_iov_md": false 00:16:12.307 }, 00:16:12.307 "memory_domains": [ 00:16:12.307 { 00:16:12.307 "dma_device_id": "system", 00:16:12.307 "dma_device_type": 1 00:16:12.307 }, 00:16:12.307 { 00:16:12.307 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.307 "dma_device_type": 2 00:16:12.307 } 00:16:12.307 ], 00:16:12.307 "driver_specific": {} 00:16:12.307 } 00:16:12.307 ] 00:16:12.307 00:10:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:12.307 00:10:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:12.307 00:10:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:12.307 00:10:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:12.307 00:10:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:12.307 00:10:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:12.307 
00:10:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:12.307 00:10:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:12.307 00:10:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:12.307 00:10:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:12.307 00:10:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:12.307 00:10:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:12.307 00:10:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:12.307 00:10:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.307 00:10:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:12.567 00:10:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:12.567 "name": "Existed_Raid", 00:16:12.567 "uuid": "1d4a4185-4dc2-469c-80fe-f6f6f6155520", 00:16:12.567 "strip_size_kb": 0, 00:16:12.567 "state": "online", 00:16:12.567 "raid_level": "raid1", 00:16:12.567 "superblock": false, 00:16:12.567 "num_base_bdevs": 3, 00:16:12.567 "num_base_bdevs_discovered": 3, 00:16:12.567 "num_base_bdevs_operational": 3, 00:16:12.567 "base_bdevs_list": [ 00:16:12.567 { 00:16:12.567 "name": "BaseBdev1", 00:16:12.567 "uuid": "7ba5a6fd-0b0e-466f-9a7e-7a50856a68f4", 00:16:12.567 "is_configured": true, 00:16:12.567 "data_offset": 0, 00:16:12.567 "data_size": 65536 00:16:12.567 }, 00:16:12.567 { 00:16:12.567 "name": "BaseBdev2", 00:16:12.567 "uuid": "04df4eff-83ca-4bd4-b3ca-cf4d0943858f", 00:16:12.567 "is_configured": true, 00:16:12.567 "data_offset": 0, 
00:16:12.567 "data_size": 65536 00:16:12.567 }, 00:16:12.567 { 00:16:12.567 "name": "BaseBdev3", 00:16:12.567 "uuid": "9366640b-074b-43c1-9bb7-2080706521fd", 00:16:12.567 "is_configured": true, 00:16:12.567 "data_offset": 0, 00:16:12.567 "data_size": 65536 00:16:12.567 } 00:16:12.567 ] 00:16:12.567 }' 00:16:12.567 00:10:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:12.567 00:10:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:13.504 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:13.504 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:13.504 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:13.504 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:13.504 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:13.504 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:13.504 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:13.504 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:13.504 [2024-07-16 00:11:00.337363] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:13.504 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:13.504 "name": "Existed_Raid", 00:16:13.504 "aliases": [ 00:16:13.504 "1d4a4185-4dc2-469c-80fe-f6f6f6155520" 00:16:13.504 ], 00:16:13.504 "product_name": "Raid Volume", 00:16:13.504 "block_size": 512, 00:16:13.504 "num_blocks": 65536, 00:16:13.504 "uuid": 
"1d4a4185-4dc2-469c-80fe-f6f6f6155520", 00:16:13.504 "assigned_rate_limits": { 00:16:13.504 "rw_ios_per_sec": 0, 00:16:13.504 "rw_mbytes_per_sec": 0, 00:16:13.504 "r_mbytes_per_sec": 0, 00:16:13.504 "w_mbytes_per_sec": 0 00:16:13.504 }, 00:16:13.504 "claimed": false, 00:16:13.504 "zoned": false, 00:16:13.504 "supported_io_types": { 00:16:13.504 "read": true, 00:16:13.504 "write": true, 00:16:13.504 "unmap": false, 00:16:13.504 "flush": false, 00:16:13.504 "reset": true, 00:16:13.504 "nvme_admin": false, 00:16:13.504 "nvme_io": false, 00:16:13.504 "nvme_io_md": false, 00:16:13.504 "write_zeroes": true, 00:16:13.504 "zcopy": false, 00:16:13.504 "get_zone_info": false, 00:16:13.504 "zone_management": false, 00:16:13.504 "zone_append": false, 00:16:13.504 "compare": false, 00:16:13.504 "compare_and_write": false, 00:16:13.504 "abort": false, 00:16:13.504 "seek_hole": false, 00:16:13.504 "seek_data": false, 00:16:13.504 "copy": false, 00:16:13.504 "nvme_iov_md": false 00:16:13.504 }, 00:16:13.504 "memory_domains": [ 00:16:13.504 { 00:16:13.504 "dma_device_id": "system", 00:16:13.504 "dma_device_type": 1 00:16:13.504 }, 00:16:13.504 { 00:16:13.504 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:13.504 "dma_device_type": 2 00:16:13.504 }, 00:16:13.504 { 00:16:13.504 "dma_device_id": "system", 00:16:13.504 "dma_device_type": 1 00:16:13.504 }, 00:16:13.504 { 00:16:13.504 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:13.504 "dma_device_type": 2 00:16:13.504 }, 00:16:13.504 { 00:16:13.504 "dma_device_id": "system", 00:16:13.504 "dma_device_type": 1 00:16:13.504 }, 00:16:13.504 { 00:16:13.504 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:13.504 "dma_device_type": 2 00:16:13.504 } 00:16:13.504 ], 00:16:13.504 "driver_specific": { 00:16:13.504 "raid": { 00:16:13.504 "uuid": "1d4a4185-4dc2-469c-80fe-f6f6f6155520", 00:16:13.504 "strip_size_kb": 0, 00:16:13.504 "state": "online", 00:16:13.504 "raid_level": "raid1", 00:16:13.504 "superblock": false, 00:16:13.504 
"num_base_bdevs": 3, 00:16:13.504 "num_base_bdevs_discovered": 3, 00:16:13.504 "num_base_bdevs_operational": 3, 00:16:13.504 "base_bdevs_list": [ 00:16:13.504 { 00:16:13.504 "name": "BaseBdev1", 00:16:13.504 "uuid": "7ba5a6fd-0b0e-466f-9a7e-7a50856a68f4", 00:16:13.504 "is_configured": true, 00:16:13.504 "data_offset": 0, 00:16:13.504 "data_size": 65536 00:16:13.504 }, 00:16:13.504 { 00:16:13.504 "name": "BaseBdev2", 00:16:13.504 "uuid": "04df4eff-83ca-4bd4-b3ca-cf4d0943858f", 00:16:13.504 "is_configured": true, 00:16:13.504 "data_offset": 0, 00:16:13.504 "data_size": 65536 00:16:13.504 }, 00:16:13.504 { 00:16:13.504 "name": "BaseBdev3", 00:16:13.504 "uuid": "9366640b-074b-43c1-9bb7-2080706521fd", 00:16:13.504 "is_configured": true, 00:16:13.504 "data_offset": 0, 00:16:13.504 "data_size": 65536 00:16:13.504 } 00:16:13.504 ] 00:16:13.504 } 00:16:13.504 } 00:16:13.504 }' 00:16:13.504 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:13.504 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:13.504 BaseBdev2 00:16:13.504 BaseBdev3' 00:16:13.504 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:13.504 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:13.504 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:13.764 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:13.764 "name": "BaseBdev1", 00:16:13.764 "aliases": [ 00:16:13.764 "7ba5a6fd-0b0e-466f-9a7e-7a50856a68f4" 00:16:13.764 ], 00:16:13.764 "product_name": "Malloc disk", 00:16:13.764 "block_size": 512, 00:16:13.764 "num_blocks": 65536, 00:16:13.764 "uuid": 
"7ba5a6fd-0b0e-466f-9a7e-7a50856a68f4", 00:16:13.764 "assigned_rate_limits": { 00:16:13.764 "rw_ios_per_sec": 0, 00:16:13.764 "rw_mbytes_per_sec": 0, 00:16:13.764 "r_mbytes_per_sec": 0, 00:16:13.764 "w_mbytes_per_sec": 0 00:16:13.764 }, 00:16:13.764 "claimed": true, 00:16:13.764 "claim_type": "exclusive_write", 00:16:13.764 "zoned": false, 00:16:13.764 "supported_io_types": { 00:16:13.764 "read": true, 00:16:13.764 "write": true, 00:16:13.764 "unmap": true, 00:16:13.764 "flush": true, 00:16:13.764 "reset": true, 00:16:13.764 "nvme_admin": false, 00:16:13.764 "nvme_io": false, 00:16:13.764 "nvme_io_md": false, 00:16:13.764 "write_zeroes": true, 00:16:13.764 "zcopy": true, 00:16:13.764 "get_zone_info": false, 00:16:13.764 "zone_management": false, 00:16:13.764 "zone_append": false, 00:16:13.764 "compare": false, 00:16:13.764 "compare_and_write": false, 00:16:13.764 "abort": true, 00:16:13.764 "seek_hole": false, 00:16:13.764 "seek_data": false, 00:16:13.764 "copy": true, 00:16:13.764 "nvme_iov_md": false 00:16:13.764 }, 00:16:13.764 "memory_domains": [ 00:16:13.764 { 00:16:13.764 "dma_device_id": "system", 00:16:13.764 "dma_device_type": 1 00:16:13.764 }, 00:16:13.764 { 00:16:13.764 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:13.764 "dma_device_type": 2 00:16:13.764 } 00:16:13.764 ], 00:16:13.764 "driver_specific": {} 00:16:13.764 }' 00:16:13.764 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:13.764 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:14.023 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:14.023 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:14.023 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:14.023 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:14.023 00:11:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:14.023 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:14.023 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:14.023 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:14.282 00:11:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:14.282 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:14.282 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:14.282 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:14.282 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:14.541 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:14.541 "name": "BaseBdev2", 00:16:14.541 "aliases": [ 00:16:14.541 "04df4eff-83ca-4bd4-b3ca-cf4d0943858f" 00:16:14.541 ], 00:16:14.541 "product_name": "Malloc disk", 00:16:14.541 "block_size": 512, 00:16:14.541 "num_blocks": 65536, 00:16:14.541 "uuid": "04df4eff-83ca-4bd4-b3ca-cf4d0943858f", 00:16:14.541 "assigned_rate_limits": { 00:16:14.541 "rw_ios_per_sec": 0, 00:16:14.541 "rw_mbytes_per_sec": 0, 00:16:14.541 "r_mbytes_per_sec": 0, 00:16:14.541 "w_mbytes_per_sec": 0 00:16:14.541 }, 00:16:14.541 "claimed": true, 00:16:14.541 "claim_type": "exclusive_write", 00:16:14.541 "zoned": false, 00:16:14.541 "supported_io_types": { 00:16:14.541 "read": true, 00:16:14.541 "write": true, 00:16:14.541 "unmap": true, 00:16:14.541 "flush": true, 00:16:14.541 "reset": true, 00:16:14.541 "nvme_admin": false, 00:16:14.541 "nvme_io": false, 00:16:14.541 "nvme_io_md": false, 
00:16:14.541 "write_zeroes": true, 00:16:14.541 "zcopy": true, 00:16:14.541 "get_zone_info": false, 00:16:14.541 "zone_management": false, 00:16:14.541 "zone_append": false, 00:16:14.541 "compare": false, 00:16:14.541 "compare_and_write": false, 00:16:14.541 "abort": true, 00:16:14.541 "seek_hole": false, 00:16:14.541 "seek_data": false, 00:16:14.541 "copy": true, 00:16:14.541 "nvme_iov_md": false 00:16:14.541 }, 00:16:14.541 "memory_domains": [ 00:16:14.541 { 00:16:14.541 "dma_device_id": "system", 00:16:14.541 "dma_device_type": 1 00:16:14.541 }, 00:16:14.541 { 00:16:14.541 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.541 "dma_device_type": 2 00:16:14.541 } 00:16:14.541 ], 00:16:14.541 "driver_specific": {} 00:16:14.541 }' 00:16:14.541 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:14.541 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:14.541 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:14.541 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:14.541 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:14.541 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:14.541 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:14.541 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:14.800 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:14.800 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:14.800 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:14.800 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:14.800 00:11:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:14.800 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:14.800 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:15.059 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:15.059 "name": "BaseBdev3", 00:16:15.059 "aliases": [ 00:16:15.059 "9366640b-074b-43c1-9bb7-2080706521fd" 00:16:15.059 ], 00:16:15.059 "product_name": "Malloc disk", 00:16:15.059 "block_size": 512, 00:16:15.059 "num_blocks": 65536, 00:16:15.059 "uuid": "9366640b-074b-43c1-9bb7-2080706521fd", 00:16:15.059 "assigned_rate_limits": { 00:16:15.059 "rw_ios_per_sec": 0, 00:16:15.059 "rw_mbytes_per_sec": 0, 00:16:15.059 "r_mbytes_per_sec": 0, 00:16:15.059 "w_mbytes_per_sec": 0 00:16:15.059 }, 00:16:15.059 "claimed": true, 00:16:15.059 "claim_type": "exclusive_write", 00:16:15.059 "zoned": false, 00:16:15.059 "supported_io_types": { 00:16:15.059 "read": true, 00:16:15.059 "write": true, 00:16:15.059 "unmap": true, 00:16:15.060 "flush": true, 00:16:15.060 "reset": true, 00:16:15.060 "nvme_admin": false, 00:16:15.060 "nvme_io": false, 00:16:15.060 "nvme_io_md": false, 00:16:15.060 "write_zeroes": true, 00:16:15.060 "zcopy": true, 00:16:15.060 "get_zone_info": false, 00:16:15.060 "zone_management": false, 00:16:15.060 "zone_append": false, 00:16:15.060 "compare": false, 00:16:15.060 "compare_and_write": false, 00:16:15.060 "abort": true, 00:16:15.060 "seek_hole": false, 00:16:15.060 "seek_data": false, 00:16:15.060 "copy": true, 00:16:15.060 "nvme_iov_md": false 00:16:15.060 }, 00:16:15.060 "memory_domains": [ 00:16:15.060 { 00:16:15.060 "dma_device_id": "system", 00:16:15.060 "dma_device_type": 1 00:16:15.060 }, 00:16:15.060 { 00:16:15.060 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:16:15.060 "dma_device_type": 2 00:16:15.060 } 00:16:15.060 ], 00:16:15.060 "driver_specific": {} 00:16:15.060 }' 00:16:15.060 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:15.060 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:15.060 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:15.060 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:15.060 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:15.060 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:15.060 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:15.060 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:15.060 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:15.060 00:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:15.319 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:15.319 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:15.319 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:15.578 [2024-07-16 00:11:02.286317] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:15.578 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:15.578 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:15.578 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 
in 00:16:15.578 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:15.578 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:15.578 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:15.578 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:15.578 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:15.578 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:15.578 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:15.578 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:15.578 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:15.578 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:15.578 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:15.578 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:15.578 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.578 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:15.837 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:15.837 "name": "Existed_Raid", 00:16:15.837 "uuid": "1d4a4185-4dc2-469c-80fe-f6f6f6155520", 00:16:15.837 "strip_size_kb": 0, 00:16:15.837 "state": "online", 00:16:15.837 "raid_level": "raid1", 
00:16:15.837 "superblock": false, 00:16:15.837 "num_base_bdevs": 3, 00:16:15.837 "num_base_bdevs_discovered": 2, 00:16:15.837 "num_base_bdevs_operational": 2, 00:16:15.837 "base_bdevs_list": [ 00:16:15.837 { 00:16:15.837 "name": null, 00:16:15.837 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:15.837 "is_configured": false, 00:16:15.837 "data_offset": 0, 00:16:15.837 "data_size": 65536 00:16:15.837 }, 00:16:15.837 { 00:16:15.837 "name": "BaseBdev2", 00:16:15.837 "uuid": "04df4eff-83ca-4bd4-b3ca-cf4d0943858f", 00:16:15.837 "is_configured": true, 00:16:15.837 "data_offset": 0, 00:16:15.837 "data_size": 65536 00:16:15.837 }, 00:16:15.837 { 00:16:15.837 "name": "BaseBdev3", 00:16:15.837 "uuid": "9366640b-074b-43c1-9bb7-2080706521fd", 00:16:15.837 "is_configured": true, 00:16:15.837 "data_offset": 0, 00:16:15.837 "data_size": 65536 00:16:15.837 } 00:16:15.837 ] 00:16:15.837 }' 00:16:15.837 00:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:15.837 00:11:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:16.402 00:11:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:16.402 00:11:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:16.402 00:11:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:16.402 00:11:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.660 00:11:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:16.660 00:11:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:16.660 00:11:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:17.227 [2024-07-16 00:11:03.911682] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:17.227 00:11:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:17.227 00:11:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:17.227 00:11:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.227 00:11:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:17.485 00:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:17.485 00:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:17.485 00:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:17.485 [2024-07-16 00:11:04.425481] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:17.486 [2024-07-16 00:11:04.425563] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:17.744 [2024-07-16 00:11:04.438068] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:17.744 [2024-07-16 00:11:04.438100] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:17.744 [2024-07-16 00:11:04.438111] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24c9400 name Existed_Raid, state offline 00:16:17.744 00:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:17.744 00:11:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:17.744 00:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.744 00:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:18.003 00:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:18.003 00:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:18.003 00:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:18.003 00:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:18.003 00:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:18.003 00:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:18.262 BaseBdev2 00:16:18.520 00:11:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:18.520 00:11:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:18.520 00:11:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:18.520 00:11:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:18.520 00:11:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:18.520 00:11:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:18.520 00:11:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:18.520 00:11:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:19.087 [ 00:16:19.087 { 00:16:19.087 "name": "BaseBdev2", 00:16:19.087 "aliases": [ 00:16:19.087 "fca72df9-71df-4e6f-8817-39e484a06478" 00:16:19.087 ], 00:16:19.087 "product_name": "Malloc disk", 00:16:19.087 "block_size": 512, 00:16:19.087 "num_blocks": 65536, 00:16:19.087 "uuid": "fca72df9-71df-4e6f-8817-39e484a06478", 00:16:19.087 "assigned_rate_limits": { 00:16:19.087 "rw_ios_per_sec": 0, 00:16:19.087 "rw_mbytes_per_sec": 0, 00:16:19.087 "r_mbytes_per_sec": 0, 00:16:19.087 "w_mbytes_per_sec": 0 00:16:19.087 }, 00:16:19.087 "claimed": false, 00:16:19.087 "zoned": false, 00:16:19.087 "supported_io_types": { 00:16:19.087 "read": true, 00:16:19.087 "write": true, 00:16:19.087 "unmap": true, 00:16:19.087 "flush": true, 00:16:19.087 "reset": true, 00:16:19.087 "nvme_admin": false, 00:16:19.087 "nvme_io": false, 00:16:19.087 "nvme_io_md": false, 00:16:19.087 "write_zeroes": true, 00:16:19.087 "zcopy": true, 00:16:19.087 "get_zone_info": false, 00:16:19.087 "zone_management": false, 00:16:19.087 "zone_append": false, 00:16:19.087 "compare": false, 00:16:19.087 "compare_and_write": false, 00:16:19.087 "abort": true, 00:16:19.087 "seek_hole": false, 00:16:19.087 "seek_data": false, 00:16:19.087 "copy": true, 00:16:19.087 "nvme_iov_md": false 00:16:19.087 }, 00:16:19.087 "memory_domains": [ 00:16:19.087 { 00:16:19.087 "dma_device_id": "system", 00:16:19.087 "dma_device_type": 1 00:16:19.087 }, 00:16:19.087 { 00:16:19.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:19.087 "dma_device_type": 2 00:16:19.087 } 00:16:19.087 ], 00:16:19.087 "driver_specific": {} 00:16:19.087 } 00:16:19.087 ] 00:16:19.087 00:11:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:19.087 
00:11:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:19.087 00:11:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:19.087 00:11:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:19.346 BaseBdev3 00:16:19.346 00:11:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:19.346 00:11:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:19.346 00:11:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:19.346 00:11:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:19.346 00:11:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:19.346 00:11:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:19.346 00:11:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:19.915 00:11:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:20.204 [ 00:16:20.204 { 00:16:20.204 "name": "BaseBdev3", 00:16:20.204 "aliases": [ 00:16:20.204 "1fdab84f-0dc4-476b-8228-a71164258c65" 00:16:20.204 ], 00:16:20.204 "product_name": "Malloc disk", 00:16:20.204 "block_size": 512, 00:16:20.204 "num_blocks": 65536, 00:16:20.204 "uuid": "1fdab84f-0dc4-476b-8228-a71164258c65", 00:16:20.204 "assigned_rate_limits": { 00:16:20.204 "rw_ios_per_sec": 0, 00:16:20.204 "rw_mbytes_per_sec": 0, 00:16:20.204 
"r_mbytes_per_sec": 0, 00:16:20.204 "w_mbytes_per_sec": 0 00:16:20.204 }, 00:16:20.204 "claimed": false, 00:16:20.204 "zoned": false, 00:16:20.204 "supported_io_types": { 00:16:20.204 "read": true, 00:16:20.204 "write": true, 00:16:20.204 "unmap": true, 00:16:20.204 "flush": true, 00:16:20.205 "reset": true, 00:16:20.205 "nvme_admin": false, 00:16:20.205 "nvme_io": false, 00:16:20.205 "nvme_io_md": false, 00:16:20.205 "write_zeroes": true, 00:16:20.205 "zcopy": true, 00:16:20.205 "get_zone_info": false, 00:16:20.205 "zone_management": false, 00:16:20.205 "zone_append": false, 00:16:20.205 "compare": false, 00:16:20.205 "compare_and_write": false, 00:16:20.205 "abort": true, 00:16:20.205 "seek_hole": false, 00:16:20.205 "seek_data": false, 00:16:20.205 "copy": true, 00:16:20.205 "nvme_iov_md": false 00:16:20.205 }, 00:16:20.205 "memory_domains": [ 00:16:20.205 { 00:16:20.205 "dma_device_id": "system", 00:16:20.205 "dma_device_type": 1 00:16:20.205 }, 00:16:20.205 { 00:16:20.205 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:20.205 "dma_device_type": 2 00:16:20.205 } 00:16:20.205 ], 00:16:20.205 "driver_specific": {} 00:16:20.205 } 00:16:20.205 ] 00:16:20.205 00:11:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:20.205 00:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:20.205 00:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:20.205 00:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:20.462 [2024-07-16 00:11:07.235231] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:20.463 [2024-07-16 00:11:07.235271] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 
00:16:20.463 [2024-07-16 00:11:07.235288] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:20.463 [2024-07-16 00:11:07.236602] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:20.463 00:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:20.463 00:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:20.463 00:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:20.463 00:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:20.463 00:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:20.463 00:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:20.463 00:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:20.463 00:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:20.463 00:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:20.463 00:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:20.463 00:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.463 00:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:20.721 00:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:20.721 "name": "Existed_Raid", 00:16:20.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:20.721 "strip_size_kb": 0, 00:16:20.721 "state": 
"configuring", 00:16:20.721 "raid_level": "raid1", 00:16:20.721 "superblock": false, 00:16:20.721 "num_base_bdevs": 3, 00:16:20.721 "num_base_bdevs_discovered": 2, 00:16:20.721 "num_base_bdevs_operational": 3, 00:16:20.721 "base_bdevs_list": [ 00:16:20.721 { 00:16:20.721 "name": "BaseBdev1", 00:16:20.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:20.721 "is_configured": false, 00:16:20.721 "data_offset": 0, 00:16:20.721 "data_size": 0 00:16:20.721 }, 00:16:20.721 { 00:16:20.721 "name": "BaseBdev2", 00:16:20.721 "uuid": "fca72df9-71df-4e6f-8817-39e484a06478", 00:16:20.721 "is_configured": true, 00:16:20.721 "data_offset": 0, 00:16:20.721 "data_size": 65536 00:16:20.721 }, 00:16:20.721 { 00:16:20.721 "name": "BaseBdev3", 00:16:20.721 "uuid": "1fdab84f-0dc4-476b-8228-a71164258c65", 00:16:20.721 "is_configured": true, 00:16:20.721 "data_offset": 0, 00:16:20.721 "data_size": 65536 00:16:20.721 } 00:16:20.721 ] 00:16:20.721 }' 00:16:20.721 00:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:20.721 00:11:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:21.288 00:11:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:21.547 [2024-07-16 00:11:08.350173] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:21.547 00:11:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:21.547 00:11:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:21.547 00:11:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:21.547 00:11:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:21.547 00:11:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:21.547 00:11:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:21.547 00:11:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:21.547 00:11:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:21.547 00:11:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:21.547 00:11:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:21.547 00:11:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.547 00:11:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:21.806 00:11:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:21.806 "name": "Existed_Raid", 00:16:21.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:21.806 "strip_size_kb": 0, 00:16:21.806 "state": "configuring", 00:16:21.806 "raid_level": "raid1", 00:16:21.806 "superblock": false, 00:16:21.806 "num_base_bdevs": 3, 00:16:21.806 "num_base_bdevs_discovered": 1, 00:16:21.806 "num_base_bdevs_operational": 3, 00:16:21.806 "base_bdevs_list": [ 00:16:21.806 { 00:16:21.806 "name": "BaseBdev1", 00:16:21.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:21.806 "is_configured": false, 00:16:21.806 "data_offset": 0, 00:16:21.806 "data_size": 0 00:16:21.806 }, 00:16:21.806 { 00:16:21.806 "name": null, 00:16:21.806 "uuid": "fca72df9-71df-4e6f-8817-39e484a06478", 00:16:21.806 "is_configured": false, 00:16:21.806 "data_offset": 0, 00:16:21.806 "data_size": 65536 00:16:21.806 }, 00:16:21.806 { 00:16:21.806 "name": "BaseBdev3", 00:16:21.806 "uuid": 
"1fdab84f-0dc4-476b-8228-a71164258c65", 00:16:21.806 "is_configured": true, 00:16:21.806 "data_offset": 0, 00:16:21.806 "data_size": 65536 00:16:21.806 } 00:16:21.806 ] 00:16:21.806 }' 00:16:21.806 00:11:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:21.806 00:11:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:22.373 00:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.373 00:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:22.631 00:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:22.631 00:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:22.890 [2024-07-16 00:11:09.709148] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:22.890 BaseBdev1 00:16:22.890 00:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:22.890 00:11:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:22.890 00:11:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:22.890 00:11:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:22.890 00:11:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:22.890 00:11:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:22.890 00:11:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:23.147 00:11:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:23.405 [ 00:16:23.405 { 00:16:23.405 "name": "BaseBdev1", 00:16:23.405 "aliases": [ 00:16:23.405 "d39cb5be-e797-46bb-b934-1e8af75363da" 00:16:23.405 ], 00:16:23.405 "product_name": "Malloc disk", 00:16:23.405 "block_size": 512, 00:16:23.405 "num_blocks": 65536, 00:16:23.405 "uuid": "d39cb5be-e797-46bb-b934-1e8af75363da", 00:16:23.405 "assigned_rate_limits": { 00:16:23.405 "rw_ios_per_sec": 0, 00:16:23.405 "rw_mbytes_per_sec": 0, 00:16:23.405 "r_mbytes_per_sec": 0, 00:16:23.405 "w_mbytes_per_sec": 0 00:16:23.405 }, 00:16:23.405 "claimed": true, 00:16:23.405 "claim_type": "exclusive_write", 00:16:23.405 "zoned": false, 00:16:23.405 "supported_io_types": { 00:16:23.405 "read": true, 00:16:23.405 "write": true, 00:16:23.405 "unmap": true, 00:16:23.405 "flush": true, 00:16:23.405 "reset": true, 00:16:23.405 "nvme_admin": false, 00:16:23.405 "nvme_io": false, 00:16:23.405 "nvme_io_md": false, 00:16:23.405 "write_zeroes": true, 00:16:23.405 "zcopy": true, 00:16:23.405 "get_zone_info": false, 00:16:23.405 "zone_management": false, 00:16:23.405 "zone_append": false, 00:16:23.405 "compare": false, 00:16:23.405 "compare_and_write": false, 00:16:23.405 "abort": true, 00:16:23.405 "seek_hole": false, 00:16:23.405 "seek_data": false, 00:16:23.405 "copy": true, 00:16:23.405 "nvme_iov_md": false 00:16:23.405 }, 00:16:23.405 "memory_domains": [ 00:16:23.405 { 00:16:23.405 "dma_device_id": "system", 00:16:23.405 "dma_device_type": 1 00:16:23.405 }, 00:16:23.405 { 00:16:23.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.405 "dma_device_type": 2 00:16:23.405 } 00:16:23.405 ], 00:16:23.405 "driver_specific": {} 00:16:23.405 } 00:16:23.405 ] 
00:16:23.405 00:11:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:23.405 00:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:23.405 00:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:23.405 00:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:23.405 00:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:23.406 00:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:23.406 00:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:23.406 00:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:23.406 00:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:23.406 00:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:23.406 00:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:23.406 00:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:23.406 00:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:23.664 00:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:23.664 "name": "Existed_Raid", 00:16:23.664 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:23.664 "strip_size_kb": 0, 00:16:23.664 "state": "configuring", 00:16:23.664 "raid_level": "raid1", 00:16:23.664 "superblock": false, 00:16:23.664 "num_base_bdevs": 3, 00:16:23.664 
"num_base_bdevs_discovered": 2, 00:16:23.664 "num_base_bdevs_operational": 3, 00:16:23.664 "base_bdevs_list": [ 00:16:23.664 { 00:16:23.664 "name": "BaseBdev1", 00:16:23.664 "uuid": "d39cb5be-e797-46bb-b934-1e8af75363da", 00:16:23.664 "is_configured": true, 00:16:23.664 "data_offset": 0, 00:16:23.664 "data_size": 65536 00:16:23.664 }, 00:16:23.664 { 00:16:23.664 "name": null, 00:16:23.664 "uuid": "fca72df9-71df-4e6f-8817-39e484a06478", 00:16:23.664 "is_configured": false, 00:16:23.664 "data_offset": 0, 00:16:23.664 "data_size": 65536 00:16:23.664 }, 00:16:23.664 { 00:16:23.664 "name": "BaseBdev3", 00:16:23.664 "uuid": "1fdab84f-0dc4-476b-8228-a71164258c65", 00:16:23.664 "is_configured": true, 00:16:23.664 "data_offset": 0, 00:16:23.664 "data_size": 65536 00:16:23.664 } 00:16:23.664 ] 00:16:23.664 }' 00:16:23.664 00:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:23.664 00:11:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:24.600 00:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:24.600 00:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.858 00:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:24.858 00:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:25.117 [2024-07-16 00:11:11.834994] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:25.117 00:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:25.117 00:11:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:25.117 00:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:25.117 00:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:25.117 00:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:25.117 00:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:25.117 00:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:25.117 00:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:25.117 00:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:25.117 00:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:25.117 00:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:25.117 00:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:25.375 00:11:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:25.375 "name": "Existed_Raid", 00:16:25.375 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:25.375 "strip_size_kb": 0, 00:16:25.375 "state": "configuring", 00:16:25.375 "raid_level": "raid1", 00:16:25.375 "superblock": false, 00:16:25.375 "num_base_bdevs": 3, 00:16:25.375 "num_base_bdevs_discovered": 1, 00:16:25.375 "num_base_bdevs_operational": 3, 00:16:25.375 "base_bdevs_list": [ 00:16:25.375 { 00:16:25.375 "name": "BaseBdev1", 00:16:25.375 "uuid": "d39cb5be-e797-46bb-b934-1e8af75363da", 00:16:25.375 "is_configured": true, 00:16:25.375 "data_offset": 0, 00:16:25.375 "data_size": 65536 
00:16:25.375 }, 00:16:25.375 { 00:16:25.375 "name": null, 00:16:25.375 "uuid": "fca72df9-71df-4e6f-8817-39e484a06478", 00:16:25.375 "is_configured": false, 00:16:25.375 "data_offset": 0, 00:16:25.375 "data_size": 65536 00:16:25.375 }, 00:16:25.375 { 00:16:25.375 "name": null, 00:16:25.375 "uuid": "1fdab84f-0dc4-476b-8228-a71164258c65", 00:16:25.375 "is_configured": false, 00:16:25.375 "data_offset": 0, 00:16:25.375 "data_size": 65536 00:16:25.375 } 00:16:25.375 ] 00:16:25.375 }' 00:16:25.375 00:11:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:25.375 00:11:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:25.941 00:11:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:25.941 00:11:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:26.201 00:11:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:26.201 00:11:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:26.460 [2024-07-16 00:11:13.166536] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:26.460 00:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:26.460 00:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:26.460 00:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:26.460 00:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:26.460 00:11:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:26.460 00:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:26.460 00:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:26.460 00:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:26.460 00:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:26.460 00:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:26.460 00:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:26.460 00:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:26.717 00:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:26.717 "name": "Existed_Raid", 00:16:26.717 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:26.717 "strip_size_kb": 0, 00:16:26.717 "state": "configuring", 00:16:26.717 "raid_level": "raid1", 00:16:26.717 "superblock": false, 00:16:26.717 "num_base_bdevs": 3, 00:16:26.717 "num_base_bdevs_discovered": 2, 00:16:26.717 "num_base_bdevs_operational": 3, 00:16:26.717 "base_bdevs_list": [ 00:16:26.717 { 00:16:26.717 "name": "BaseBdev1", 00:16:26.717 "uuid": "d39cb5be-e797-46bb-b934-1e8af75363da", 00:16:26.717 "is_configured": true, 00:16:26.717 "data_offset": 0, 00:16:26.717 "data_size": 65536 00:16:26.717 }, 00:16:26.717 { 00:16:26.717 "name": null, 00:16:26.717 "uuid": "fca72df9-71df-4e6f-8817-39e484a06478", 00:16:26.717 "is_configured": false, 00:16:26.717 "data_offset": 0, 00:16:26.717 "data_size": 65536 00:16:26.717 }, 00:16:26.717 { 00:16:26.717 "name": "BaseBdev3", 00:16:26.717 "uuid": 
"1fdab84f-0dc4-476b-8228-a71164258c65", 00:16:26.717 "is_configured": true, 00:16:26.717 "data_offset": 0, 00:16:26.717 "data_size": 65536 00:16:26.717 } 00:16:26.717 ] 00:16:26.717 }' 00:16:26.717 00:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:26.717 00:11:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:27.282 00:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:27.282 00:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:27.540 00:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:27.540 00:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:27.799 [2024-07-16 00:11:14.498087] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:27.799 00:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:27.799 00:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:27.799 00:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:27.799 00:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:27.799 00:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:27.799 00:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:27.799 00:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:27.799 00:11:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:27.799 00:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:27.799 00:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:27.799 00:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:27.799 00:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:28.059 00:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:28.059 "name": "Existed_Raid", 00:16:28.059 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.059 "strip_size_kb": 0, 00:16:28.059 "state": "configuring", 00:16:28.059 "raid_level": "raid1", 00:16:28.059 "superblock": false, 00:16:28.059 "num_base_bdevs": 3, 00:16:28.059 "num_base_bdevs_discovered": 1, 00:16:28.059 "num_base_bdevs_operational": 3, 00:16:28.059 "base_bdevs_list": [ 00:16:28.059 { 00:16:28.059 "name": null, 00:16:28.059 "uuid": "d39cb5be-e797-46bb-b934-1e8af75363da", 00:16:28.059 "is_configured": false, 00:16:28.059 "data_offset": 0, 00:16:28.059 "data_size": 65536 00:16:28.059 }, 00:16:28.059 { 00:16:28.059 "name": null, 00:16:28.059 "uuid": "fca72df9-71df-4e6f-8817-39e484a06478", 00:16:28.059 "is_configured": false, 00:16:28.059 "data_offset": 0, 00:16:28.059 "data_size": 65536 00:16:28.059 }, 00:16:28.059 { 00:16:28.059 "name": "BaseBdev3", 00:16:28.059 "uuid": "1fdab84f-0dc4-476b-8228-a71164258c65", 00:16:28.059 "is_configured": true, 00:16:28.059 "data_offset": 0, 00:16:28.059 "data_size": 65536 00:16:28.059 } 00:16:28.059 ] 00:16:28.059 }' 00:16:28.059 00:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:28.059 00:11:14 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:16:28.627 00:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:28.627 00:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:28.627 00:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:28.627 00:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:28.886 [2024-07-16 00:11:15.793950] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:28.886 00:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:28.886 00:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:28.886 00:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:28.886 00:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:28.886 00:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:28.886 00:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:28.886 00:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:28.886 00:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:28.886 00:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:28.886 00:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:16:28.886 00:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:28.886 00:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:29.145 00:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:29.145 "name": "Existed_Raid", 00:16:29.145 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.145 "strip_size_kb": 0, 00:16:29.145 "state": "configuring", 00:16:29.145 "raid_level": "raid1", 00:16:29.145 "superblock": false, 00:16:29.145 "num_base_bdevs": 3, 00:16:29.145 "num_base_bdevs_discovered": 2, 00:16:29.145 "num_base_bdevs_operational": 3, 00:16:29.145 "base_bdevs_list": [ 00:16:29.145 { 00:16:29.145 "name": null, 00:16:29.145 "uuid": "d39cb5be-e797-46bb-b934-1e8af75363da", 00:16:29.145 "is_configured": false, 00:16:29.145 "data_offset": 0, 00:16:29.145 "data_size": 65536 00:16:29.145 }, 00:16:29.145 { 00:16:29.145 "name": "BaseBdev2", 00:16:29.145 "uuid": "fca72df9-71df-4e6f-8817-39e484a06478", 00:16:29.145 "is_configured": true, 00:16:29.145 "data_offset": 0, 00:16:29.145 "data_size": 65536 00:16:29.145 }, 00:16:29.145 { 00:16:29.145 "name": "BaseBdev3", 00:16:29.145 "uuid": "1fdab84f-0dc4-476b-8228-a71164258c65", 00:16:29.145 "is_configured": true, 00:16:29.145 "data_offset": 0, 00:16:29.145 "data_size": 65536 00:16:29.145 } 00:16:29.145 ] 00:16:29.145 }' 00:16:29.145 00:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:29.145 00:11:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:29.713 00:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:29.713 00:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:29.971 00:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:29.971 00:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:29.971 00:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:30.229 00:11:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u d39cb5be-e797-46bb-b934-1e8af75363da 00:16:30.488 [2024-07-16 00:11:17.361430] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:30.488 [2024-07-16 00:11:17.361468] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24cce40 00:16:30.488 [2024-07-16 00:11:17.361476] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:16:30.488 [2024-07-16 00:11:17.361661] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24c9e60 00:16:30.488 [2024-07-16 00:11:17.361780] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24cce40 00:16:30.488 [2024-07-16 00:11:17.361790] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x24cce40 00:16:30.488 [2024-07-16 00:11:17.361963] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:30.488 NewBaseBdev 00:16:30.488 00:11:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:30.488 00:11:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:30.488 00:11:17 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:30.488 00:11:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:30.488 00:11:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:30.488 00:11:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:30.488 00:11:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:30.746 00:11:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:31.005 [ 00:16:31.005 { 00:16:31.005 "name": "NewBaseBdev", 00:16:31.005 "aliases": [ 00:16:31.005 "d39cb5be-e797-46bb-b934-1e8af75363da" 00:16:31.005 ], 00:16:31.005 "product_name": "Malloc disk", 00:16:31.005 "block_size": 512, 00:16:31.005 "num_blocks": 65536, 00:16:31.005 "uuid": "d39cb5be-e797-46bb-b934-1e8af75363da", 00:16:31.005 "assigned_rate_limits": { 00:16:31.005 "rw_ios_per_sec": 0, 00:16:31.005 "rw_mbytes_per_sec": 0, 00:16:31.005 "r_mbytes_per_sec": 0, 00:16:31.005 "w_mbytes_per_sec": 0 00:16:31.005 }, 00:16:31.005 "claimed": true, 00:16:31.005 "claim_type": "exclusive_write", 00:16:31.005 "zoned": false, 00:16:31.005 "supported_io_types": { 00:16:31.005 "read": true, 00:16:31.005 "write": true, 00:16:31.005 "unmap": true, 00:16:31.005 "flush": true, 00:16:31.005 "reset": true, 00:16:31.005 "nvme_admin": false, 00:16:31.005 "nvme_io": false, 00:16:31.005 "nvme_io_md": false, 00:16:31.005 "write_zeroes": true, 00:16:31.005 "zcopy": true, 00:16:31.005 "get_zone_info": false, 00:16:31.005 "zone_management": false, 00:16:31.005 "zone_append": false, 00:16:31.005 "compare": false, 00:16:31.005 "compare_and_write": false, 
00:16:31.005 "abort": true, 00:16:31.005 "seek_hole": false, 00:16:31.005 "seek_data": false, 00:16:31.005 "copy": true, 00:16:31.005 "nvme_iov_md": false 00:16:31.005 }, 00:16:31.005 "memory_domains": [ 00:16:31.005 { 00:16:31.005 "dma_device_id": "system", 00:16:31.005 "dma_device_type": 1 00:16:31.005 }, 00:16:31.005 { 00:16:31.005 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:31.005 "dma_device_type": 2 00:16:31.005 } 00:16:31.005 ], 00:16:31.005 "driver_specific": {} 00:16:31.005 } 00:16:31.005 ] 00:16:31.005 00:11:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:31.005 00:11:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:31.005 00:11:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:31.005 00:11:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:31.005 00:11:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:31.005 00:11:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:31.005 00:11:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:31.005 00:11:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:31.005 00:11:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:31.005 00:11:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:31.005 00:11:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:31.005 00:11:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:31.005 00:11:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:31.264 00:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:31.264 "name": "Existed_Raid", 00:16:31.264 "uuid": "6c6fb497-daa2-4ea0-a85f-741ce65cd71f", 00:16:31.264 "strip_size_kb": 0, 00:16:31.264 "state": "online", 00:16:31.264 "raid_level": "raid1", 00:16:31.264 "superblock": false, 00:16:31.264 "num_base_bdevs": 3, 00:16:31.264 "num_base_bdevs_discovered": 3, 00:16:31.264 "num_base_bdevs_operational": 3, 00:16:31.264 "base_bdevs_list": [ 00:16:31.264 { 00:16:31.264 "name": "NewBaseBdev", 00:16:31.264 "uuid": "d39cb5be-e797-46bb-b934-1e8af75363da", 00:16:31.264 "is_configured": true, 00:16:31.264 "data_offset": 0, 00:16:31.264 "data_size": 65536 00:16:31.264 }, 00:16:31.264 { 00:16:31.264 "name": "BaseBdev2", 00:16:31.264 "uuid": "fca72df9-71df-4e6f-8817-39e484a06478", 00:16:31.264 "is_configured": true, 00:16:31.264 "data_offset": 0, 00:16:31.264 "data_size": 65536 00:16:31.264 }, 00:16:31.264 { 00:16:31.264 "name": "BaseBdev3", 00:16:31.264 "uuid": "1fdab84f-0dc4-476b-8228-a71164258c65", 00:16:31.264 "is_configured": true, 00:16:31.264 "data_offset": 0, 00:16:31.264 "data_size": 65536 00:16:31.264 } 00:16:31.264 ] 00:16:31.264 }' 00:16:31.264 00:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:31.264 00:11:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:31.831 00:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:31.831 00:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:31.831 00:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:31.831 00:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:31.831 
00:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:31.831 00:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:31.831 00:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:31.831 00:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:32.089 [2024-07-16 00:11:18.945950] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:32.089 00:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:32.089 "name": "Existed_Raid", 00:16:32.089 "aliases": [ 00:16:32.090 "6c6fb497-daa2-4ea0-a85f-741ce65cd71f" 00:16:32.090 ], 00:16:32.090 "product_name": "Raid Volume", 00:16:32.090 "block_size": 512, 00:16:32.090 "num_blocks": 65536, 00:16:32.090 "uuid": "6c6fb497-daa2-4ea0-a85f-741ce65cd71f", 00:16:32.090 "assigned_rate_limits": { 00:16:32.090 "rw_ios_per_sec": 0, 00:16:32.090 "rw_mbytes_per_sec": 0, 00:16:32.090 "r_mbytes_per_sec": 0, 00:16:32.090 "w_mbytes_per_sec": 0 00:16:32.090 }, 00:16:32.090 "claimed": false, 00:16:32.090 "zoned": false, 00:16:32.090 "supported_io_types": { 00:16:32.090 "read": true, 00:16:32.090 "write": true, 00:16:32.090 "unmap": false, 00:16:32.090 "flush": false, 00:16:32.090 "reset": true, 00:16:32.090 "nvme_admin": false, 00:16:32.090 "nvme_io": false, 00:16:32.090 "nvme_io_md": false, 00:16:32.090 "write_zeroes": true, 00:16:32.090 "zcopy": false, 00:16:32.090 "get_zone_info": false, 00:16:32.090 "zone_management": false, 00:16:32.090 "zone_append": false, 00:16:32.090 "compare": false, 00:16:32.090 "compare_and_write": false, 00:16:32.090 "abort": false, 00:16:32.090 "seek_hole": false, 00:16:32.090 "seek_data": false, 00:16:32.090 "copy": false, 00:16:32.090 "nvme_iov_md": false 00:16:32.090 }, 00:16:32.090 
"memory_domains": [ 00:16:32.090 { 00:16:32.090 "dma_device_id": "system", 00:16:32.090 "dma_device_type": 1 00:16:32.090 }, 00:16:32.090 { 00:16:32.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.090 "dma_device_type": 2 00:16:32.090 }, 00:16:32.090 { 00:16:32.090 "dma_device_id": "system", 00:16:32.090 "dma_device_type": 1 00:16:32.090 }, 00:16:32.090 { 00:16:32.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.090 "dma_device_type": 2 00:16:32.090 }, 00:16:32.090 { 00:16:32.090 "dma_device_id": "system", 00:16:32.090 "dma_device_type": 1 00:16:32.090 }, 00:16:32.090 { 00:16:32.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.090 "dma_device_type": 2 00:16:32.090 } 00:16:32.090 ], 00:16:32.090 "driver_specific": { 00:16:32.090 "raid": { 00:16:32.090 "uuid": "6c6fb497-daa2-4ea0-a85f-741ce65cd71f", 00:16:32.090 "strip_size_kb": 0, 00:16:32.090 "state": "online", 00:16:32.090 "raid_level": "raid1", 00:16:32.090 "superblock": false, 00:16:32.090 "num_base_bdevs": 3, 00:16:32.090 "num_base_bdevs_discovered": 3, 00:16:32.090 "num_base_bdevs_operational": 3, 00:16:32.090 "base_bdevs_list": [ 00:16:32.090 { 00:16:32.090 "name": "NewBaseBdev", 00:16:32.090 "uuid": "d39cb5be-e797-46bb-b934-1e8af75363da", 00:16:32.090 "is_configured": true, 00:16:32.090 "data_offset": 0, 00:16:32.090 "data_size": 65536 00:16:32.090 }, 00:16:32.090 { 00:16:32.090 "name": "BaseBdev2", 00:16:32.090 "uuid": "fca72df9-71df-4e6f-8817-39e484a06478", 00:16:32.090 "is_configured": true, 00:16:32.090 "data_offset": 0, 00:16:32.090 "data_size": 65536 00:16:32.090 }, 00:16:32.090 { 00:16:32.090 "name": "BaseBdev3", 00:16:32.090 "uuid": "1fdab84f-0dc4-476b-8228-a71164258c65", 00:16:32.090 "is_configured": true, 00:16:32.090 "data_offset": 0, 00:16:32.090 "data_size": 65536 00:16:32.090 } 00:16:32.090 ] 00:16:32.090 } 00:16:32.090 } 00:16:32.090 }' 00:16:32.090 00:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | 
select(.is_configured == true).name' 00:16:32.090 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:32.090 BaseBdev2 00:16:32.090 BaseBdev3' 00:16:32.090 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:32.090 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:32.090 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:32.349 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:32.349 "name": "NewBaseBdev", 00:16:32.349 "aliases": [ 00:16:32.349 "d39cb5be-e797-46bb-b934-1e8af75363da" 00:16:32.349 ], 00:16:32.349 "product_name": "Malloc disk", 00:16:32.349 "block_size": 512, 00:16:32.349 "num_blocks": 65536, 00:16:32.349 "uuid": "d39cb5be-e797-46bb-b934-1e8af75363da", 00:16:32.349 "assigned_rate_limits": { 00:16:32.349 "rw_ios_per_sec": 0, 00:16:32.349 "rw_mbytes_per_sec": 0, 00:16:32.349 "r_mbytes_per_sec": 0, 00:16:32.349 "w_mbytes_per_sec": 0 00:16:32.349 }, 00:16:32.349 "claimed": true, 00:16:32.349 "claim_type": "exclusive_write", 00:16:32.349 "zoned": false, 00:16:32.349 "supported_io_types": { 00:16:32.349 "read": true, 00:16:32.349 "write": true, 00:16:32.349 "unmap": true, 00:16:32.349 "flush": true, 00:16:32.349 "reset": true, 00:16:32.349 "nvme_admin": false, 00:16:32.349 "nvme_io": false, 00:16:32.349 "nvme_io_md": false, 00:16:32.349 "write_zeroes": true, 00:16:32.349 "zcopy": true, 00:16:32.349 "get_zone_info": false, 00:16:32.349 "zone_management": false, 00:16:32.349 "zone_append": false, 00:16:32.349 "compare": false, 00:16:32.349 "compare_and_write": false, 00:16:32.349 "abort": true, 00:16:32.349 "seek_hole": false, 00:16:32.349 "seek_data": false, 00:16:32.349 "copy": true, 00:16:32.349 "nvme_iov_md": 
false 00:16:32.349 }, 00:16:32.349 "memory_domains": [ 00:16:32.349 { 00:16:32.349 "dma_device_id": "system", 00:16:32.349 "dma_device_type": 1 00:16:32.349 }, 00:16:32.349 { 00:16:32.349 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.349 "dma_device_type": 2 00:16:32.349 } 00:16:32.349 ], 00:16:32.349 "driver_specific": {} 00:16:32.349 }' 00:16:32.349 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:32.607 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:32.607 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:32.607 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:32.607 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:32.607 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:32.607 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:32.607 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:32.607 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:32.607 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:32.864 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:32.864 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:32.864 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:32.864 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:32.864 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:16:33.125 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:33.125 "name": "BaseBdev2", 00:16:33.125 "aliases": [ 00:16:33.125 "fca72df9-71df-4e6f-8817-39e484a06478" 00:16:33.125 ], 00:16:33.125 "product_name": "Malloc disk", 00:16:33.125 "block_size": 512, 00:16:33.125 "num_blocks": 65536, 00:16:33.125 "uuid": "fca72df9-71df-4e6f-8817-39e484a06478", 00:16:33.125 "assigned_rate_limits": { 00:16:33.125 "rw_ios_per_sec": 0, 00:16:33.125 "rw_mbytes_per_sec": 0, 00:16:33.125 "r_mbytes_per_sec": 0, 00:16:33.125 "w_mbytes_per_sec": 0 00:16:33.125 }, 00:16:33.125 "claimed": true, 00:16:33.125 "claim_type": "exclusive_write", 00:16:33.125 "zoned": false, 00:16:33.125 "supported_io_types": { 00:16:33.125 "read": true, 00:16:33.125 "write": true, 00:16:33.125 "unmap": true, 00:16:33.125 "flush": true, 00:16:33.125 "reset": true, 00:16:33.125 "nvme_admin": false, 00:16:33.125 "nvme_io": false, 00:16:33.125 "nvme_io_md": false, 00:16:33.125 "write_zeroes": true, 00:16:33.125 "zcopy": true, 00:16:33.125 "get_zone_info": false, 00:16:33.125 "zone_management": false, 00:16:33.125 "zone_append": false, 00:16:33.125 "compare": false, 00:16:33.125 "compare_and_write": false, 00:16:33.125 "abort": true, 00:16:33.125 "seek_hole": false, 00:16:33.125 "seek_data": false, 00:16:33.125 "copy": true, 00:16:33.125 "nvme_iov_md": false 00:16:33.125 }, 00:16:33.125 "memory_domains": [ 00:16:33.125 { 00:16:33.125 "dma_device_id": "system", 00:16:33.125 "dma_device_type": 1 00:16:33.125 }, 00:16:33.125 { 00:16:33.125 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:33.125 "dma_device_type": 2 00:16:33.125 } 00:16:33.125 ], 00:16:33.125 "driver_specific": {} 00:16:33.125 }' 00:16:33.125 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.125 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.125 00:11:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:33.125 00:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.125 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.125 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:33.125 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:33.384 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:33.384 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:33.384 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:33.384 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:33.384 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:33.384 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:33.384 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:33.384 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:33.643 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:33.643 "name": "BaseBdev3", 00:16:33.643 "aliases": [ 00:16:33.643 "1fdab84f-0dc4-476b-8228-a71164258c65" 00:16:33.643 ], 00:16:33.643 "product_name": "Malloc disk", 00:16:33.643 "block_size": 512, 00:16:33.643 "num_blocks": 65536, 00:16:33.643 "uuid": "1fdab84f-0dc4-476b-8228-a71164258c65", 00:16:33.643 "assigned_rate_limits": { 00:16:33.643 "rw_ios_per_sec": 0, 00:16:33.643 "rw_mbytes_per_sec": 0, 00:16:33.643 "r_mbytes_per_sec": 0, 00:16:33.643 "w_mbytes_per_sec": 0 00:16:33.643 }, 
00:16:33.643 "claimed": true, 00:16:33.643 "claim_type": "exclusive_write", 00:16:33.643 "zoned": false, 00:16:33.643 "supported_io_types": { 00:16:33.643 "read": true, 00:16:33.643 "write": true, 00:16:33.643 "unmap": true, 00:16:33.643 "flush": true, 00:16:33.643 "reset": true, 00:16:33.643 "nvme_admin": false, 00:16:33.643 "nvme_io": false, 00:16:33.643 "nvme_io_md": false, 00:16:33.643 "write_zeroes": true, 00:16:33.643 "zcopy": true, 00:16:33.643 "get_zone_info": false, 00:16:33.643 "zone_management": false, 00:16:33.643 "zone_append": false, 00:16:33.643 "compare": false, 00:16:33.643 "compare_and_write": false, 00:16:33.643 "abort": true, 00:16:33.643 "seek_hole": false, 00:16:33.643 "seek_data": false, 00:16:33.643 "copy": true, 00:16:33.643 "nvme_iov_md": false 00:16:33.643 }, 00:16:33.643 "memory_domains": [ 00:16:33.643 { 00:16:33.643 "dma_device_id": "system", 00:16:33.643 "dma_device_type": 1 00:16:33.643 }, 00:16:33.643 { 00:16:33.643 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:33.643 "dma_device_type": 2 00:16:33.643 } 00:16:33.643 ], 00:16:33.643 "driver_specific": {} 00:16:33.643 }' 00:16:33.643 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.643 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.643 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:33.643 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.643 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.902 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:33.902 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:33.902 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:33.902 00:11:20 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:33.902 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:33.902 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:33.902 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:33.902 00:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:34.161 [2024-07-16 00:11:21.007165] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:34.161 [2024-07-16 00:11:21.007193] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:34.161 [2024-07-16 00:11:21.007246] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:34.161 [2024-07-16 00:11:21.007502] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:34.161 [2024-07-16 00:11:21.007513] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24cce40 name Existed_Raid, state offline 00:16:34.161 00:11:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 3535359 00:16:34.161 00:11:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 3535359 ']' 00:16:34.161 00:11:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 3535359 00:16:34.161 00:11:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:16:34.161 00:11:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:34.161 00:11:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3535359 00:16:34.161 00:11:21 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:34.161 00:11:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:34.161 00:11:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3535359' 00:16:34.161 killing process with pid 3535359 00:16:34.162 00:11:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 3535359 00:16:34.162 [2024-07-16 00:11:21.081645] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:34.162 00:11:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 3535359 00:16:34.162 [2024-07-16 00:11:21.108471] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:34.421 00:11:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:16:34.421 00:16:34.421 real 0m30.975s 00:16:34.421 user 0m56.852s 00:16:34.421 sys 0m5.499s 00:16:34.421 00:11:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:34.421 00:11:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:34.421 ************************************ 00:16:34.421 END TEST raid_state_function_test 00:16:34.421 ************************************ 00:16:34.681 00:11:21 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:34.681 00:11:21 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:16:34.681 00:11:21 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:34.681 00:11:21 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:34.681 00:11:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:34.681 ************************************ 00:16:34.681 START TEST raid_state_function_test_sb 00:16:34.681 ************************************ 00:16:34.681 00:11:21 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 true 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:34.681 
00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=3539908 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3539908' 00:16:34.681 Process raid pid: 3539908 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 3539908 /var/tmp/spdk-raid.sock 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3539908 ']' 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting 
for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:34.681 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:34.681 00:11:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:34.681 [2024-07-16 00:11:21.493609] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:16:34.681 [2024-07-16 00:11:21.493684] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:34.681 [2024-07-16 00:11:21.623307] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:34.941 [2024-07-16 00:11:21.730960] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:34.941 [2024-07-16 00:11:21.795386] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:34.941 [2024-07-16 00:11:21.795416] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:35.509 00:11:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:35.509 00:11:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:16:35.509 00:11:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:35.768 [2024-07-16 00:11:22.642790] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:35.768 [2024-07-16 00:11:22.642832] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:35.768 
[2024-07-16 00:11:22.642843] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:35.768 [2024-07-16 00:11:22.642855] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:35.768 [2024-07-16 00:11:22.642864] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:35.768 [2024-07-16 00:11:22.642875] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:35.768 00:11:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:35.768 00:11:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:35.768 00:11:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:35.768 00:11:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:35.768 00:11:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:35.768 00:11:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:35.768 00:11:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:35.768 00:11:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:35.768 00:11:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:35.768 00:11:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:35.768 00:11:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.768 00:11:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r 
'.[] | select(.name == "Existed_Raid")' 00:16:36.027 00:11:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:36.027 "name": "Existed_Raid", 00:16:36.027 "uuid": "53b5c0f2-4497-4ca2-88f1-2304f5252e31", 00:16:36.027 "strip_size_kb": 0, 00:16:36.027 "state": "configuring", 00:16:36.027 "raid_level": "raid1", 00:16:36.027 "superblock": true, 00:16:36.027 "num_base_bdevs": 3, 00:16:36.027 "num_base_bdevs_discovered": 0, 00:16:36.027 "num_base_bdevs_operational": 3, 00:16:36.027 "base_bdevs_list": [ 00:16:36.027 { 00:16:36.027 "name": "BaseBdev1", 00:16:36.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:36.027 "is_configured": false, 00:16:36.027 "data_offset": 0, 00:16:36.027 "data_size": 0 00:16:36.027 }, 00:16:36.027 { 00:16:36.027 "name": "BaseBdev2", 00:16:36.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:36.027 "is_configured": false, 00:16:36.027 "data_offset": 0, 00:16:36.027 "data_size": 0 00:16:36.027 }, 00:16:36.027 { 00:16:36.027 "name": "BaseBdev3", 00:16:36.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:36.027 "is_configured": false, 00:16:36.027 "data_offset": 0, 00:16:36.027 "data_size": 0 00:16:36.027 } 00:16:36.027 ] 00:16:36.027 }' 00:16:36.027 00:11:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:36.027 00:11:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:36.599 00:11:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:36.929 [2024-07-16 00:11:23.713488] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:36.929 [2024-07-16 00:11:23.713518] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1011a80 name Existed_Raid, state configuring 00:16:36.929 00:11:23 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:37.187 [2024-07-16 00:11:23.958159] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:37.187 [2024-07-16 00:11:23.958186] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:37.187 [2024-07-16 00:11:23.958196] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:37.187 [2024-07-16 00:11:23.958207] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:37.187 [2024-07-16 00:11:23.958216] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:37.187 [2024-07-16 00:11:23.958227] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:37.187 00:11:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:37.445 [2024-07-16 00:11:24.212661] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:37.445 BaseBdev1 00:16:37.445 00:11:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:37.445 00:11:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:37.445 00:11:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:37.445 00:11:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:37.445 00:11:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:37.445 00:11:24 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:37.445 00:11:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:37.703 00:11:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:37.961 [ 00:16:37.961 { 00:16:37.961 "name": "BaseBdev1", 00:16:37.961 "aliases": [ 00:16:37.961 "1be57119-2d27-4cbc-ada2-3aa779365451" 00:16:37.961 ], 00:16:37.961 "product_name": "Malloc disk", 00:16:37.961 "block_size": 512, 00:16:37.961 "num_blocks": 65536, 00:16:37.961 "uuid": "1be57119-2d27-4cbc-ada2-3aa779365451", 00:16:37.961 "assigned_rate_limits": { 00:16:37.961 "rw_ios_per_sec": 0, 00:16:37.961 "rw_mbytes_per_sec": 0, 00:16:37.961 "r_mbytes_per_sec": 0, 00:16:37.961 "w_mbytes_per_sec": 0 00:16:37.961 }, 00:16:37.961 "claimed": true, 00:16:37.961 "claim_type": "exclusive_write", 00:16:37.961 "zoned": false, 00:16:37.961 "supported_io_types": { 00:16:37.961 "read": true, 00:16:37.961 "write": true, 00:16:37.961 "unmap": true, 00:16:37.961 "flush": true, 00:16:37.961 "reset": true, 00:16:37.961 "nvme_admin": false, 00:16:37.961 "nvme_io": false, 00:16:37.961 "nvme_io_md": false, 00:16:37.961 "write_zeroes": true, 00:16:37.961 "zcopy": true, 00:16:37.961 "get_zone_info": false, 00:16:37.961 "zone_management": false, 00:16:37.961 "zone_append": false, 00:16:37.961 "compare": false, 00:16:37.961 "compare_and_write": false, 00:16:37.961 "abort": true, 00:16:37.961 "seek_hole": false, 00:16:37.961 "seek_data": false, 00:16:37.961 "copy": true, 00:16:37.961 "nvme_iov_md": false 00:16:37.961 }, 00:16:37.961 "memory_domains": [ 00:16:37.961 { 00:16:37.961 "dma_device_id": "system", 00:16:37.961 "dma_device_type": 1 00:16:37.961 }, 00:16:37.961 { 00:16:37.961 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:16:37.961 "dma_device_type": 2 00:16:37.961 } 00:16:37.961 ], 00:16:37.961 "driver_specific": {} 00:16:37.961 } 00:16:37.961 ] 00:16:37.961 00:11:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:37.961 00:11:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:37.961 00:11:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:37.961 00:11:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:37.961 00:11:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:37.961 00:11:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:37.961 00:11:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:37.961 00:11:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:37.962 00:11:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:37.962 00:11:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:37.962 00:11:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:37.962 00:11:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.962 00:11:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:37.962 00:11:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:37.962 "name": "Existed_Raid", 00:16:37.962 "uuid": "2af1a2c6-9fe2-4deb-9c3c-d9d892e575ec", 
00:16:37.962 "strip_size_kb": 0, 00:16:37.962 "state": "configuring", 00:16:37.962 "raid_level": "raid1", 00:16:37.962 "superblock": true, 00:16:37.962 "num_base_bdevs": 3, 00:16:37.962 "num_base_bdevs_discovered": 1, 00:16:37.962 "num_base_bdevs_operational": 3, 00:16:37.962 "base_bdevs_list": [ 00:16:37.962 { 00:16:37.962 "name": "BaseBdev1", 00:16:37.962 "uuid": "1be57119-2d27-4cbc-ada2-3aa779365451", 00:16:37.962 "is_configured": true, 00:16:37.962 "data_offset": 2048, 00:16:37.962 "data_size": 63488 00:16:37.962 }, 00:16:37.962 { 00:16:37.962 "name": "BaseBdev2", 00:16:37.962 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.962 "is_configured": false, 00:16:37.962 "data_offset": 0, 00:16:37.962 "data_size": 0 00:16:37.962 }, 00:16:37.962 { 00:16:37.962 "name": "BaseBdev3", 00:16:37.962 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.962 "is_configured": false, 00:16:37.962 "data_offset": 0, 00:16:37.962 "data_size": 0 00:16:37.962 } 00:16:37.962 ] 00:16:37.962 }' 00:16:37.962 00:11:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:38.220 00:11:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:38.784 00:11:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:38.784 [2024-07-16 00:11:25.716767] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:38.784 [2024-07-16 00:11:25.716802] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1011310 name Existed_Raid, state configuring 00:16:39.042 00:11:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:39.042 [2024-07-16 00:11:25.965458] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:39.042 [2024-07-16 00:11:25.966880] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:39.042 [2024-07-16 00:11:25.966913] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:39.042 [2024-07-16 00:11:25.966923] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:39.042 [2024-07-16 00:11:25.966941] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:39.042 00:11:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:39.042 00:11:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:39.042 00:11:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:39.042 00:11:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:39.042 00:11:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:39.042 00:11:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:39.042 00:11:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:39.042 00:11:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:39.042 00:11:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:39.042 00:11:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:39.042 00:11:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:39.042 00:11:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:16:39.300 00:11:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:39.300 00:11:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:39.300 00:11:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:39.300 "name": "Existed_Raid", 00:16:39.300 "uuid": "e005d7e8-28b2-4575-a8b8-0f2854426145", 00:16:39.300 "strip_size_kb": 0, 00:16:39.300 "state": "configuring", 00:16:39.300 "raid_level": "raid1", 00:16:39.300 "superblock": true, 00:16:39.300 "num_base_bdevs": 3, 00:16:39.300 "num_base_bdevs_discovered": 1, 00:16:39.300 "num_base_bdevs_operational": 3, 00:16:39.300 "base_bdevs_list": [ 00:16:39.300 { 00:16:39.300 "name": "BaseBdev1", 00:16:39.300 "uuid": "1be57119-2d27-4cbc-ada2-3aa779365451", 00:16:39.300 "is_configured": true, 00:16:39.300 "data_offset": 2048, 00:16:39.300 "data_size": 63488 00:16:39.300 }, 00:16:39.300 { 00:16:39.300 "name": "BaseBdev2", 00:16:39.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:39.300 "is_configured": false, 00:16:39.300 "data_offset": 0, 00:16:39.300 "data_size": 0 00:16:39.300 }, 00:16:39.300 { 00:16:39.300 "name": "BaseBdev3", 00:16:39.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:39.300 "is_configured": false, 00:16:39.300 "data_offset": 0, 00:16:39.300 "data_size": 0 00:16:39.300 } 00:16:39.300 ] 00:16:39.300 }' 00:16:39.300 00:11:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:39.300 00:11:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:40.234 00:11:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:40.234 
[2024-07-16 00:11:27.067851] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:40.234 BaseBdev2 00:16:40.234 00:11:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:40.234 00:11:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:40.234 00:11:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:40.234 00:11:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:40.234 00:11:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:40.234 00:11:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:40.234 00:11:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:40.493 00:11:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:40.753 [ 00:16:40.753 { 00:16:40.753 "name": "BaseBdev2", 00:16:40.753 "aliases": [ 00:16:40.753 "820e450b-eff2-4b2d-8909-a9c6c2f2a78b" 00:16:40.753 ], 00:16:40.753 "product_name": "Malloc disk", 00:16:40.753 "block_size": 512, 00:16:40.753 "num_blocks": 65536, 00:16:40.753 "uuid": "820e450b-eff2-4b2d-8909-a9c6c2f2a78b", 00:16:40.753 "assigned_rate_limits": { 00:16:40.753 "rw_ios_per_sec": 0, 00:16:40.753 "rw_mbytes_per_sec": 0, 00:16:40.753 "r_mbytes_per_sec": 0, 00:16:40.753 "w_mbytes_per_sec": 0 00:16:40.753 }, 00:16:40.753 "claimed": true, 00:16:40.753 "claim_type": "exclusive_write", 00:16:40.753 "zoned": false, 00:16:40.753 "supported_io_types": { 00:16:40.753 "read": true, 00:16:40.753 "write": true, 00:16:40.753 "unmap": 
true, 00:16:40.753 "flush": true, 00:16:40.753 "reset": true, 00:16:40.753 "nvme_admin": false, 00:16:40.753 "nvme_io": false, 00:16:40.753 "nvme_io_md": false, 00:16:40.753 "write_zeroes": true, 00:16:40.753 "zcopy": true, 00:16:40.753 "get_zone_info": false, 00:16:40.753 "zone_management": false, 00:16:40.753 "zone_append": false, 00:16:40.753 "compare": false, 00:16:40.753 "compare_and_write": false, 00:16:40.753 "abort": true, 00:16:40.753 "seek_hole": false, 00:16:40.753 "seek_data": false, 00:16:40.753 "copy": true, 00:16:40.753 "nvme_iov_md": false 00:16:40.753 }, 00:16:40.753 "memory_domains": [ 00:16:40.753 { 00:16:40.753 "dma_device_id": "system", 00:16:40.753 "dma_device_type": 1 00:16:40.753 }, 00:16:40.753 { 00:16:40.753 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:40.753 "dma_device_type": 2 00:16:40.753 } 00:16:40.753 ], 00:16:40.753 "driver_specific": {} 00:16:40.753 } 00:16:40.753 ] 00:16:40.753 00:11:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:40.753 00:11:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:40.753 00:11:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:40.753 00:11:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:40.753 00:11:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:40.753 00:11:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:40.753 00:11:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:40.753 00:11:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:40.753 00:11:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:40.753 
00:11:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:40.753 00:11:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:40.753 00:11:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:40.753 00:11:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:40.753 00:11:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.753 00:11:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:41.012 00:11:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:41.012 "name": "Existed_Raid", 00:16:41.012 "uuid": "e005d7e8-28b2-4575-a8b8-0f2854426145", 00:16:41.012 "strip_size_kb": 0, 00:16:41.012 "state": "configuring", 00:16:41.012 "raid_level": "raid1", 00:16:41.012 "superblock": true, 00:16:41.012 "num_base_bdevs": 3, 00:16:41.012 "num_base_bdevs_discovered": 2, 00:16:41.012 "num_base_bdevs_operational": 3, 00:16:41.012 "base_bdevs_list": [ 00:16:41.012 { 00:16:41.012 "name": "BaseBdev1", 00:16:41.012 "uuid": "1be57119-2d27-4cbc-ada2-3aa779365451", 00:16:41.012 "is_configured": true, 00:16:41.012 "data_offset": 2048, 00:16:41.012 "data_size": 63488 00:16:41.012 }, 00:16:41.012 { 00:16:41.012 "name": "BaseBdev2", 00:16:41.012 "uuid": "820e450b-eff2-4b2d-8909-a9c6c2f2a78b", 00:16:41.012 "is_configured": true, 00:16:41.012 "data_offset": 2048, 00:16:41.012 "data_size": 63488 00:16:41.012 }, 00:16:41.012 { 00:16:41.012 "name": "BaseBdev3", 00:16:41.012 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:41.012 "is_configured": false, 00:16:41.012 "data_offset": 0, 00:16:41.012 "data_size": 0 00:16:41.012 } 00:16:41.012 ] 00:16:41.012 }' 00:16:41.012 
00:11:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:41.012 00:11:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:41.580 00:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:41.839 [2024-07-16 00:11:28.615405] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:41.839 [2024-07-16 00:11:28.615561] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1012400 00:16:41.839 [2024-07-16 00:11:28.615575] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:41.839 [2024-07-16 00:11:28.615748] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1011ef0 00:16:41.839 [2024-07-16 00:11:28.615864] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1012400 00:16:41.839 [2024-07-16 00:11:28.615874] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1012400 00:16:41.839 [2024-07-16 00:11:28.615971] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:41.839 BaseBdev3 00:16:41.839 00:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:41.839 00:11:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:41.839 00:11:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:41.839 00:11:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:41.839 00:11:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:41.839 00:11:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:16:41.839 00:11:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:42.098 00:11:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:42.357 [ 00:16:42.357 { 00:16:42.357 "name": "BaseBdev3", 00:16:42.357 "aliases": [ 00:16:42.357 "0ad1d1e9-1643-4f06-8846-6e8aefd9cde0" 00:16:42.357 ], 00:16:42.357 "product_name": "Malloc disk", 00:16:42.357 "block_size": 512, 00:16:42.357 "num_blocks": 65536, 00:16:42.357 "uuid": "0ad1d1e9-1643-4f06-8846-6e8aefd9cde0", 00:16:42.357 "assigned_rate_limits": { 00:16:42.357 "rw_ios_per_sec": 0, 00:16:42.357 "rw_mbytes_per_sec": 0, 00:16:42.357 "r_mbytes_per_sec": 0, 00:16:42.357 "w_mbytes_per_sec": 0 00:16:42.357 }, 00:16:42.357 "claimed": true, 00:16:42.357 "claim_type": "exclusive_write", 00:16:42.357 "zoned": false, 00:16:42.357 "supported_io_types": { 00:16:42.357 "read": true, 00:16:42.357 "write": true, 00:16:42.357 "unmap": true, 00:16:42.357 "flush": true, 00:16:42.357 "reset": true, 00:16:42.357 "nvme_admin": false, 00:16:42.357 "nvme_io": false, 00:16:42.357 "nvme_io_md": false, 00:16:42.357 "write_zeroes": true, 00:16:42.357 "zcopy": true, 00:16:42.357 "get_zone_info": false, 00:16:42.357 "zone_management": false, 00:16:42.357 "zone_append": false, 00:16:42.357 "compare": false, 00:16:42.357 "compare_and_write": false, 00:16:42.357 "abort": true, 00:16:42.358 "seek_hole": false, 00:16:42.358 "seek_data": false, 00:16:42.358 "copy": true, 00:16:42.358 "nvme_iov_md": false 00:16:42.358 }, 00:16:42.358 "memory_domains": [ 00:16:42.358 { 00:16:42.358 "dma_device_id": "system", 00:16:42.358 "dma_device_type": 1 00:16:42.358 }, 00:16:42.358 { 00:16:42.358 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:42.358 
"dma_device_type": 2 00:16:42.358 } 00:16:42.358 ], 00:16:42.358 "driver_specific": {} 00:16:42.358 } 00:16:42.358 ] 00:16:42.358 00:11:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:42.358 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:42.358 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:42.358 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:42.358 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:42.358 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:42.358 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:42.358 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:42.358 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:42.358 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:42.358 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:42.358 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:42.358 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:42.358 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.358 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:42.617 00:11:29 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:42.617 "name": "Existed_Raid", 00:16:42.617 "uuid": "e005d7e8-28b2-4575-a8b8-0f2854426145", 00:16:42.617 "strip_size_kb": 0, 00:16:42.617 "state": "online", 00:16:42.617 "raid_level": "raid1", 00:16:42.617 "superblock": true, 00:16:42.617 "num_base_bdevs": 3, 00:16:42.617 "num_base_bdevs_discovered": 3, 00:16:42.617 "num_base_bdevs_operational": 3, 00:16:42.617 "base_bdevs_list": [ 00:16:42.617 { 00:16:42.617 "name": "BaseBdev1", 00:16:42.617 "uuid": "1be57119-2d27-4cbc-ada2-3aa779365451", 00:16:42.617 "is_configured": true, 00:16:42.617 "data_offset": 2048, 00:16:42.617 "data_size": 63488 00:16:42.618 }, 00:16:42.618 { 00:16:42.618 "name": "BaseBdev2", 00:16:42.618 "uuid": "820e450b-eff2-4b2d-8909-a9c6c2f2a78b", 00:16:42.618 "is_configured": true, 00:16:42.618 "data_offset": 2048, 00:16:42.618 "data_size": 63488 00:16:42.618 }, 00:16:42.618 { 00:16:42.618 "name": "BaseBdev3", 00:16:42.618 "uuid": "0ad1d1e9-1643-4f06-8846-6e8aefd9cde0", 00:16:42.618 "is_configured": true, 00:16:42.618 "data_offset": 2048, 00:16:42.618 "data_size": 63488 00:16:42.618 } 00:16:42.618 ] 00:16:42.618 }' 00:16:42.618 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:42.618 00:11:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:43.186 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:43.186 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:43.186 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:43.186 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:43.186 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:16:43.186 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:43.186 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:43.186 00:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:43.445 [2024-07-16 00:11:30.139772] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:43.445 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:43.445 "name": "Existed_Raid", 00:16:43.445 "aliases": [ 00:16:43.445 "e005d7e8-28b2-4575-a8b8-0f2854426145" 00:16:43.445 ], 00:16:43.445 "product_name": "Raid Volume", 00:16:43.445 "block_size": 512, 00:16:43.445 "num_blocks": 63488, 00:16:43.445 "uuid": "e005d7e8-28b2-4575-a8b8-0f2854426145", 00:16:43.445 "assigned_rate_limits": { 00:16:43.445 "rw_ios_per_sec": 0, 00:16:43.445 "rw_mbytes_per_sec": 0, 00:16:43.445 "r_mbytes_per_sec": 0, 00:16:43.445 "w_mbytes_per_sec": 0 00:16:43.445 }, 00:16:43.445 "claimed": false, 00:16:43.445 "zoned": false, 00:16:43.445 "supported_io_types": { 00:16:43.445 "read": true, 00:16:43.445 "write": true, 00:16:43.445 "unmap": false, 00:16:43.445 "flush": false, 00:16:43.445 "reset": true, 00:16:43.445 "nvme_admin": false, 00:16:43.445 "nvme_io": false, 00:16:43.445 "nvme_io_md": false, 00:16:43.445 "write_zeroes": true, 00:16:43.445 "zcopy": false, 00:16:43.445 "get_zone_info": false, 00:16:43.445 "zone_management": false, 00:16:43.445 "zone_append": false, 00:16:43.445 "compare": false, 00:16:43.445 "compare_and_write": false, 00:16:43.445 "abort": false, 00:16:43.445 "seek_hole": false, 00:16:43.445 "seek_data": false, 00:16:43.445 "copy": false, 00:16:43.445 "nvme_iov_md": false 00:16:43.445 }, 00:16:43.445 "memory_domains": [ 00:16:43.445 { 00:16:43.445 "dma_device_id": "system", 00:16:43.445 
"dma_device_type": 1 00:16:43.445 }, 00:16:43.445 { 00:16:43.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.445 "dma_device_type": 2 00:16:43.445 }, 00:16:43.445 { 00:16:43.445 "dma_device_id": "system", 00:16:43.445 "dma_device_type": 1 00:16:43.445 }, 00:16:43.445 { 00:16:43.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.445 "dma_device_type": 2 00:16:43.445 }, 00:16:43.445 { 00:16:43.445 "dma_device_id": "system", 00:16:43.445 "dma_device_type": 1 00:16:43.445 }, 00:16:43.445 { 00:16:43.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.445 "dma_device_type": 2 00:16:43.445 } 00:16:43.445 ], 00:16:43.445 "driver_specific": { 00:16:43.445 "raid": { 00:16:43.445 "uuid": "e005d7e8-28b2-4575-a8b8-0f2854426145", 00:16:43.445 "strip_size_kb": 0, 00:16:43.445 "state": "online", 00:16:43.445 "raid_level": "raid1", 00:16:43.445 "superblock": true, 00:16:43.445 "num_base_bdevs": 3, 00:16:43.445 "num_base_bdevs_discovered": 3, 00:16:43.445 "num_base_bdevs_operational": 3, 00:16:43.445 "base_bdevs_list": [ 00:16:43.445 { 00:16:43.445 "name": "BaseBdev1", 00:16:43.445 "uuid": "1be57119-2d27-4cbc-ada2-3aa779365451", 00:16:43.445 "is_configured": true, 00:16:43.445 "data_offset": 2048, 00:16:43.445 "data_size": 63488 00:16:43.445 }, 00:16:43.445 { 00:16:43.445 "name": "BaseBdev2", 00:16:43.445 "uuid": "820e450b-eff2-4b2d-8909-a9c6c2f2a78b", 00:16:43.445 "is_configured": true, 00:16:43.445 "data_offset": 2048, 00:16:43.445 "data_size": 63488 00:16:43.445 }, 00:16:43.445 { 00:16:43.445 "name": "BaseBdev3", 00:16:43.445 "uuid": "0ad1d1e9-1643-4f06-8846-6e8aefd9cde0", 00:16:43.445 "is_configured": true, 00:16:43.445 "data_offset": 2048, 00:16:43.445 "data_size": 63488 00:16:43.445 } 00:16:43.445 ] 00:16:43.445 } 00:16:43.445 } 00:16:43.445 }' 00:16:43.446 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:43.446 00:11:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:43.446 BaseBdev2 00:16:43.446 BaseBdev3' 00:16:43.446 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:43.446 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:43.446 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:43.705 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:43.705 "name": "BaseBdev1", 00:16:43.705 "aliases": [ 00:16:43.705 "1be57119-2d27-4cbc-ada2-3aa779365451" 00:16:43.705 ], 00:16:43.705 "product_name": "Malloc disk", 00:16:43.705 "block_size": 512, 00:16:43.705 "num_blocks": 65536, 00:16:43.705 "uuid": "1be57119-2d27-4cbc-ada2-3aa779365451", 00:16:43.705 "assigned_rate_limits": { 00:16:43.705 "rw_ios_per_sec": 0, 00:16:43.705 "rw_mbytes_per_sec": 0, 00:16:43.705 "r_mbytes_per_sec": 0, 00:16:43.705 "w_mbytes_per_sec": 0 00:16:43.705 }, 00:16:43.705 "claimed": true, 00:16:43.705 "claim_type": "exclusive_write", 00:16:43.705 "zoned": false, 00:16:43.705 "supported_io_types": { 00:16:43.705 "read": true, 00:16:43.705 "write": true, 00:16:43.705 "unmap": true, 00:16:43.705 "flush": true, 00:16:43.705 "reset": true, 00:16:43.705 "nvme_admin": false, 00:16:43.705 "nvme_io": false, 00:16:43.705 "nvme_io_md": false, 00:16:43.705 "write_zeroes": true, 00:16:43.705 "zcopy": true, 00:16:43.705 "get_zone_info": false, 00:16:43.705 "zone_management": false, 00:16:43.705 "zone_append": false, 00:16:43.705 "compare": false, 00:16:43.705 "compare_and_write": false, 00:16:43.705 "abort": true, 00:16:43.705 "seek_hole": false, 00:16:43.705 "seek_data": false, 00:16:43.705 "copy": true, 00:16:43.705 "nvme_iov_md": false 00:16:43.705 }, 00:16:43.705 "memory_domains": 
[ 00:16:43.705 { 00:16:43.705 "dma_device_id": "system", 00:16:43.705 "dma_device_type": 1 00:16:43.705 }, 00:16:43.705 { 00:16:43.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.705 "dma_device_type": 2 00:16:43.705 } 00:16:43.705 ], 00:16:43.705 "driver_specific": {} 00:16:43.705 }' 00:16:43.705 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:43.705 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:43.705 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:43.705 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:43.705 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:43.705 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:43.705 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:43.964 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:43.964 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:43.964 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:43.964 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:43.964 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:43.964 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:43.964 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:43.964 00:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:16:44.222 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:44.222 "name": "BaseBdev2", 00:16:44.222 "aliases": [ 00:16:44.222 "820e450b-eff2-4b2d-8909-a9c6c2f2a78b" 00:16:44.222 ], 00:16:44.222 "product_name": "Malloc disk", 00:16:44.222 "block_size": 512, 00:16:44.222 "num_blocks": 65536, 00:16:44.222 "uuid": "820e450b-eff2-4b2d-8909-a9c6c2f2a78b", 00:16:44.222 "assigned_rate_limits": { 00:16:44.222 "rw_ios_per_sec": 0, 00:16:44.222 "rw_mbytes_per_sec": 0, 00:16:44.222 "r_mbytes_per_sec": 0, 00:16:44.222 "w_mbytes_per_sec": 0 00:16:44.222 }, 00:16:44.222 "claimed": true, 00:16:44.222 "claim_type": "exclusive_write", 00:16:44.222 "zoned": false, 00:16:44.222 "supported_io_types": { 00:16:44.222 "read": true, 00:16:44.222 "write": true, 00:16:44.222 "unmap": true, 00:16:44.222 "flush": true, 00:16:44.222 "reset": true, 00:16:44.222 "nvme_admin": false, 00:16:44.222 "nvme_io": false, 00:16:44.222 "nvme_io_md": false, 00:16:44.222 "write_zeroes": true, 00:16:44.222 "zcopy": true, 00:16:44.222 "get_zone_info": false, 00:16:44.222 "zone_management": false, 00:16:44.222 "zone_append": false, 00:16:44.222 "compare": false, 00:16:44.222 "compare_and_write": false, 00:16:44.222 "abort": true, 00:16:44.222 "seek_hole": false, 00:16:44.222 "seek_data": false, 00:16:44.222 "copy": true, 00:16:44.222 "nvme_iov_md": false 00:16:44.222 }, 00:16:44.222 "memory_domains": [ 00:16:44.222 { 00:16:44.222 "dma_device_id": "system", 00:16:44.222 "dma_device_type": 1 00:16:44.222 }, 00:16:44.222 { 00:16:44.222 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.222 "dma_device_type": 2 00:16:44.222 } 00:16:44.222 ], 00:16:44.222 "driver_specific": {} 00:16:44.222 }' 00:16:44.222 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:44.222 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:44.222 00:11:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:44.223 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:44.481 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:44.481 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:44.481 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:44.481 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:44.481 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:44.481 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:44.481 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:44.481 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:44.481 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:44.481 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:44.481 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:44.740 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:44.740 "name": "BaseBdev3", 00:16:44.740 "aliases": [ 00:16:44.740 "0ad1d1e9-1643-4f06-8846-6e8aefd9cde0" 00:16:44.740 ], 00:16:44.740 "product_name": "Malloc disk", 00:16:44.740 "block_size": 512, 00:16:44.740 "num_blocks": 65536, 00:16:44.740 "uuid": "0ad1d1e9-1643-4f06-8846-6e8aefd9cde0", 00:16:44.740 "assigned_rate_limits": { 00:16:44.740 "rw_ios_per_sec": 0, 00:16:44.740 "rw_mbytes_per_sec": 0, 00:16:44.740 "r_mbytes_per_sec": 0, 00:16:44.740 
"w_mbytes_per_sec": 0 00:16:44.740 }, 00:16:44.740 "claimed": true, 00:16:44.740 "claim_type": "exclusive_write", 00:16:44.740 "zoned": false, 00:16:44.740 "supported_io_types": { 00:16:44.740 "read": true, 00:16:44.740 "write": true, 00:16:44.740 "unmap": true, 00:16:44.740 "flush": true, 00:16:44.740 "reset": true, 00:16:44.740 "nvme_admin": false, 00:16:44.740 "nvme_io": false, 00:16:44.740 "nvme_io_md": false, 00:16:44.741 "write_zeroes": true, 00:16:44.741 "zcopy": true, 00:16:44.741 "get_zone_info": false, 00:16:44.741 "zone_management": false, 00:16:44.741 "zone_append": false, 00:16:44.741 "compare": false, 00:16:44.741 "compare_and_write": false, 00:16:44.741 "abort": true, 00:16:44.741 "seek_hole": false, 00:16:44.741 "seek_data": false, 00:16:44.741 "copy": true, 00:16:44.741 "nvme_iov_md": false 00:16:44.741 }, 00:16:44.741 "memory_domains": [ 00:16:44.741 { 00:16:44.741 "dma_device_id": "system", 00:16:44.741 "dma_device_type": 1 00:16:44.741 }, 00:16:44.741 { 00:16:44.741 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.741 "dma_device_type": 2 00:16:44.741 } 00:16:44.741 ], 00:16:44.741 "driver_specific": {} 00:16:44.741 }' 00:16:44.741 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:45.000 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:45.000 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:45.000 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:45.000 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:45.000 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:45.000 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:45.000 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:16:45.000 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:45.000 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:45.259 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:45.259 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:45.259 00:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:45.519 [2024-07-16 00:11:32.225091] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:45.519 00:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:45.519 00:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:45.519 00:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:45.519 00:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:16:45.519 00:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:45.519 00:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:45.519 00:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:45.519 00:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:45.519 00:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:45.519 00:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:45.519 00:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:16:45.519 00:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:45.519 00:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:45.519 00:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:45.519 00:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:45.519 00:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.519 00:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:45.778 00:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:45.778 "name": "Existed_Raid", 00:16:45.778 "uuid": "e005d7e8-28b2-4575-a8b8-0f2854426145", 00:16:45.778 "strip_size_kb": 0, 00:16:45.778 "state": "online", 00:16:45.778 "raid_level": "raid1", 00:16:45.778 "superblock": true, 00:16:45.778 "num_base_bdevs": 3, 00:16:45.778 "num_base_bdevs_discovered": 2, 00:16:45.778 "num_base_bdevs_operational": 2, 00:16:45.778 "base_bdevs_list": [ 00:16:45.778 { 00:16:45.778 "name": null, 00:16:45.778 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:45.778 "is_configured": false, 00:16:45.778 "data_offset": 2048, 00:16:45.778 "data_size": 63488 00:16:45.778 }, 00:16:45.778 { 00:16:45.778 "name": "BaseBdev2", 00:16:45.778 "uuid": "820e450b-eff2-4b2d-8909-a9c6c2f2a78b", 00:16:45.778 "is_configured": true, 00:16:45.778 "data_offset": 2048, 00:16:45.778 "data_size": 63488 00:16:45.778 }, 00:16:45.778 { 00:16:45.778 "name": "BaseBdev3", 00:16:45.778 "uuid": "0ad1d1e9-1643-4f06-8846-6e8aefd9cde0", 00:16:45.778 "is_configured": true, 00:16:45.778 "data_offset": 2048, 00:16:45.778 "data_size": 63488 00:16:45.778 } 
00:16:45.778 ] 00:16:45.778 }' 00:16:45.778 00:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:45.778 00:11:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:46.345 00:11:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:46.345 00:11:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:46.345 00:11:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:46.345 00:11:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.603 00:11:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:46.603 00:11:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:46.603 00:11:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:46.862 [2024-07-16 00:11:33.562438] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:46.862 00:11:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:46.862 00:11:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:46.862 00:11:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.862 00:11:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:47.121 00:11:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:47.121 00:11:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:47.121 00:11:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:47.121 [2024-07-16 00:11:34.068149] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:47.121 [2024-07-16 00:11:34.068231] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:47.380 [2024-07-16 00:11:34.080858] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:47.380 [2024-07-16 00:11:34.080889] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:47.380 [2024-07-16 00:11:34.080901] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1012400 name Existed_Raid, state offline 00:16:47.380 00:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:47.380 00:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:47.380 00:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.380 00:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:47.639 00:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:47.639 00:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:47.639 00:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:47.639 00:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:47.639 00:11:34 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:47.639 00:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:47.898 BaseBdev2 00:16:48.156 00:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:48.156 00:11:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:48.156 00:11:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:48.156 00:11:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:48.156 00:11:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:48.156 00:11:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:48.156 00:11:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:48.156 00:11:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:48.416 [ 00:16:48.416 { 00:16:48.416 "name": "BaseBdev2", 00:16:48.416 "aliases": [ 00:16:48.416 "fc9ae4b5-f649-48f2-9294-664550abab30" 00:16:48.416 ], 00:16:48.416 "product_name": "Malloc disk", 00:16:48.416 "block_size": 512, 00:16:48.416 "num_blocks": 65536, 00:16:48.416 "uuid": "fc9ae4b5-f649-48f2-9294-664550abab30", 00:16:48.416 "assigned_rate_limits": { 00:16:48.416 "rw_ios_per_sec": 0, 00:16:48.416 "rw_mbytes_per_sec": 0, 00:16:48.416 "r_mbytes_per_sec": 0, 00:16:48.416 "w_mbytes_per_sec": 0 00:16:48.416 }, 00:16:48.416 "claimed": false, 00:16:48.416 "zoned": false, 
00:16:48.416 "supported_io_types": { 00:16:48.416 "read": true, 00:16:48.416 "write": true, 00:16:48.416 "unmap": true, 00:16:48.416 "flush": true, 00:16:48.416 "reset": true, 00:16:48.416 "nvme_admin": false, 00:16:48.416 "nvme_io": false, 00:16:48.416 "nvme_io_md": false, 00:16:48.416 "write_zeroes": true, 00:16:48.416 "zcopy": true, 00:16:48.416 "get_zone_info": false, 00:16:48.416 "zone_management": false, 00:16:48.416 "zone_append": false, 00:16:48.416 "compare": false, 00:16:48.416 "compare_and_write": false, 00:16:48.416 "abort": true, 00:16:48.416 "seek_hole": false, 00:16:48.416 "seek_data": false, 00:16:48.416 "copy": true, 00:16:48.416 "nvme_iov_md": false 00:16:48.416 }, 00:16:48.416 "memory_domains": [ 00:16:48.416 { 00:16:48.416 "dma_device_id": "system", 00:16:48.416 "dma_device_type": 1 00:16:48.416 }, 00:16:48.416 { 00:16:48.416 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.416 "dma_device_type": 2 00:16:48.416 } 00:16:48.416 ], 00:16:48.416 "driver_specific": {} 00:16:48.416 } 00:16:48.416 ] 00:16:48.416 00:11:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:48.416 00:11:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:48.416 00:11:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:48.416 00:11:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:49.001 BaseBdev3 00:16:49.001 00:11:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:49.001 00:11:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:49.001 00:11:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:49.001 00:11:35 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:49.001 00:11:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:49.001 00:11:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:49.001 00:11:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:49.569 00:11:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:50.137 [ 00:16:50.137 { 00:16:50.137 "name": "BaseBdev3", 00:16:50.137 "aliases": [ 00:16:50.137 "3cf5b056-46d3-4adc-9977-c7fe6cfbb6ac" 00:16:50.137 ], 00:16:50.137 "product_name": "Malloc disk", 00:16:50.137 "block_size": 512, 00:16:50.137 "num_blocks": 65536, 00:16:50.137 "uuid": "3cf5b056-46d3-4adc-9977-c7fe6cfbb6ac", 00:16:50.137 "assigned_rate_limits": { 00:16:50.137 "rw_ios_per_sec": 0, 00:16:50.137 "rw_mbytes_per_sec": 0, 00:16:50.137 "r_mbytes_per_sec": 0, 00:16:50.137 "w_mbytes_per_sec": 0 00:16:50.137 }, 00:16:50.137 "claimed": false, 00:16:50.137 "zoned": false, 00:16:50.137 "supported_io_types": { 00:16:50.137 "read": true, 00:16:50.137 "write": true, 00:16:50.137 "unmap": true, 00:16:50.137 "flush": true, 00:16:50.137 "reset": true, 00:16:50.137 "nvme_admin": false, 00:16:50.137 "nvme_io": false, 00:16:50.137 "nvme_io_md": false, 00:16:50.137 "write_zeroes": true, 00:16:50.137 "zcopy": true, 00:16:50.137 "get_zone_info": false, 00:16:50.137 "zone_management": false, 00:16:50.137 "zone_append": false, 00:16:50.137 "compare": false, 00:16:50.137 "compare_and_write": false, 00:16:50.137 "abort": true, 00:16:50.137 "seek_hole": false, 00:16:50.137 "seek_data": false, 00:16:50.137 "copy": true, 00:16:50.137 "nvme_iov_md": 
false 00:16:50.137 }, 00:16:50.137 "memory_domains": [ 00:16:50.137 { 00:16:50.137 "dma_device_id": "system", 00:16:50.137 "dma_device_type": 1 00:16:50.137 }, 00:16:50.137 { 00:16:50.137 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.137 "dma_device_type": 2 00:16:50.137 } 00:16:50.137 ], 00:16:50.137 "driver_specific": {} 00:16:50.137 } 00:16:50.137 ] 00:16:50.137 00:11:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:50.137 00:11:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:50.137 00:11:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:50.137 00:11:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:50.705 [2024-07-16 00:11:37.408175] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:50.705 [2024-07-16 00:11:37.408228] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:50.705 [2024-07-16 00:11:37.408247] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:50.705 [2024-07-16 00:11:37.409937] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:50.705 00:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:50.705 00:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:50.705 00:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:50.705 00:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:50.705 00:11:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:50.705 00:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:50.705 00:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:50.705 00:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:50.705 00:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:50.705 00:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:50.705 00:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.705 00:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:50.964 00:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:50.964 "name": "Existed_Raid", 00:16:50.964 "uuid": "8e17f6e3-630f-4173-a0bc-3f3e01e9fc8d", 00:16:50.964 "strip_size_kb": 0, 00:16:50.964 "state": "configuring", 00:16:50.964 "raid_level": "raid1", 00:16:50.964 "superblock": true, 00:16:50.964 "num_base_bdevs": 3, 00:16:50.964 "num_base_bdevs_discovered": 2, 00:16:50.964 "num_base_bdevs_operational": 3, 00:16:50.964 "base_bdevs_list": [ 00:16:50.964 { 00:16:50.964 "name": "BaseBdev1", 00:16:50.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:50.964 "is_configured": false, 00:16:50.964 "data_offset": 0, 00:16:50.964 "data_size": 0 00:16:50.964 }, 00:16:50.964 { 00:16:50.964 "name": "BaseBdev2", 00:16:50.964 "uuid": "fc9ae4b5-f649-48f2-9294-664550abab30", 00:16:50.964 "is_configured": true, 00:16:50.964 "data_offset": 2048, 00:16:50.964 "data_size": 63488 00:16:50.964 }, 00:16:50.964 { 00:16:50.964 "name": "BaseBdev3", 
00:16:50.964 "uuid": "3cf5b056-46d3-4adc-9977-c7fe6cfbb6ac", 00:16:50.964 "is_configured": true, 00:16:50.964 "data_offset": 2048, 00:16:50.964 "data_size": 63488 00:16:50.964 } 00:16:50.964 ] 00:16:50.964 }' 00:16:50.964 00:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:50.964 00:11:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:51.552 00:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:51.811 [2024-07-16 00:11:38.535133] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:51.811 00:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:51.811 00:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:51.811 00:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:51.811 00:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:51.811 00:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:51.811 00:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:51.811 00:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:51.811 00:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:51.811 00:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:51.811 00:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:51.811 00:11:38 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.811 00:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:52.071 00:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:52.071 "name": "Existed_Raid", 00:16:52.071 "uuid": "8e17f6e3-630f-4173-a0bc-3f3e01e9fc8d", 00:16:52.071 "strip_size_kb": 0, 00:16:52.071 "state": "configuring", 00:16:52.071 "raid_level": "raid1", 00:16:52.071 "superblock": true, 00:16:52.071 "num_base_bdevs": 3, 00:16:52.071 "num_base_bdevs_discovered": 1, 00:16:52.071 "num_base_bdevs_operational": 3, 00:16:52.071 "base_bdevs_list": [ 00:16:52.071 { 00:16:52.071 "name": "BaseBdev1", 00:16:52.071 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:52.071 "is_configured": false, 00:16:52.071 "data_offset": 0, 00:16:52.071 "data_size": 0 00:16:52.071 }, 00:16:52.071 { 00:16:52.071 "name": null, 00:16:52.071 "uuid": "fc9ae4b5-f649-48f2-9294-664550abab30", 00:16:52.071 "is_configured": false, 00:16:52.071 "data_offset": 2048, 00:16:52.071 "data_size": 63488 00:16:52.071 }, 00:16:52.071 { 00:16:52.071 "name": "BaseBdev3", 00:16:52.071 "uuid": "3cf5b056-46d3-4adc-9977-c7fe6cfbb6ac", 00:16:52.071 "is_configured": true, 00:16:52.071 "data_offset": 2048, 00:16:52.071 "data_size": 63488 00:16:52.071 } 00:16:52.071 ] 00:16:52.071 }' 00:16:52.071 00:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:52.071 00:11:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:52.639 00:11:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:52.639 00:11:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:16:53.247 00:11:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:53.247 00:11:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:53.247 [2024-07-16 00:11:40.164683] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:53.247 BaseBdev1 00:16:53.247 00:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:53.247 00:11:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:53.247 00:11:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:53.247 00:11:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:53.247 00:11:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:53.247 00:11:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:53.247 00:11:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:53.526 00:11:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:53.786 [ 00:16:53.786 { 00:16:53.786 "name": "BaseBdev1", 00:16:53.786 "aliases": [ 00:16:53.786 "73dd70dc-2ad5-471e-96e6-c9508fc6934d" 00:16:53.786 ], 00:16:53.786 "product_name": "Malloc disk", 00:16:53.786 "block_size": 512, 00:16:53.786 "num_blocks": 65536, 00:16:53.786 "uuid": "73dd70dc-2ad5-471e-96e6-c9508fc6934d", 00:16:53.786 
"assigned_rate_limits": { 00:16:53.786 "rw_ios_per_sec": 0, 00:16:53.786 "rw_mbytes_per_sec": 0, 00:16:53.786 "r_mbytes_per_sec": 0, 00:16:53.786 "w_mbytes_per_sec": 0 00:16:53.786 }, 00:16:53.786 "claimed": true, 00:16:53.786 "claim_type": "exclusive_write", 00:16:53.786 "zoned": false, 00:16:53.786 "supported_io_types": { 00:16:53.786 "read": true, 00:16:53.786 "write": true, 00:16:53.786 "unmap": true, 00:16:53.786 "flush": true, 00:16:53.786 "reset": true, 00:16:53.786 "nvme_admin": false, 00:16:53.786 "nvme_io": false, 00:16:53.786 "nvme_io_md": false, 00:16:53.786 "write_zeroes": true, 00:16:53.786 "zcopy": true, 00:16:53.786 "get_zone_info": false, 00:16:53.786 "zone_management": false, 00:16:53.786 "zone_append": false, 00:16:53.786 "compare": false, 00:16:53.786 "compare_and_write": false, 00:16:53.786 "abort": true, 00:16:53.786 "seek_hole": false, 00:16:53.786 "seek_data": false, 00:16:53.786 "copy": true, 00:16:53.786 "nvme_iov_md": false 00:16:53.786 }, 00:16:53.786 "memory_domains": [ 00:16:53.786 { 00:16:53.786 "dma_device_id": "system", 00:16:53.786 "dma_device_type": 1 00:16:53.786 }, 00:16:53.786 { 00:16:53.786 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.786 "dma_device_type": 2 00:16:53.786 } 00:16:53.786 ], 00:16:53.786 "driver_specific": {} 00:16:53.786 } 00:16:53.786 ] 00:16:53.786 00:11:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:53.786 00:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:53.786 00:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:53.786 00:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:53.786 00:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:53.786 00:11:40 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:53.786 00:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:53.786 00:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:53.786 00:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:53.786 00:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:53.786 00:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:53.786 00:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:53.786 00:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:54.046 00:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:54.046 "name": "Existed_Raid", 00:16:54.046 "uuid": "8e17f6e3-630f-4173-a0bc-3f3e01e9fc8d", 00:16:54.046 "strip_size_kb": 0, 00:16:54.046 "state": "configuring", 00:16:54.046 "raid_level": "raid1", 00:16:54.046 "superblock": true, 00:16:54.046 "num_base_bdevs": 3, 00:16:54.046 "num_base_bdevs_discovered": 2, 00:16:54.046 "num_base_bdevs_operational": 3, 00:16:54.046 "base_bdevs_list": [ 00:16:54.046 { 00:16:54.046 "name": "BaseBdev1", 00:16:54.046 "uuid": "73dd70dc-2ad5-471e-96e6-c9508fc6934d", 00:16:54.046 "is_configured": true, 00:16:54.046 "data_offset": 2048, 00:16:54.046 "data_size": 63488 00:16:54.046 }, 00:16:54.046 { 00:16:54.046 "name": null, 00:16:54.046 "uuid": "fc9ae4b5-f649-48f2-9294-664550abab30", 00:16:54.046 "is_configured": false, 00:16:54.046 "data_offset": 2048, 00:16:54.046 "data_size": 63488 00:16:54.046 }, 00:16:54.046 { 00:16:54.046 "name": "BaseBdev3", 00:16:54.046 "uuid": 
"3cf5b056-46d3-4adc-9977-c7fe6cfbb6ac", 00:16:54.046 "is_configured": true, 00:16:54.046 "data_offset": 2048, 00:16:54.046 "data_size": 63488 00:16:54.046 } 00:16:54.046 ] 00:16:54.046 }' 00:16:54.046 00:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:54.046 00:11:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:54.615 00:11:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:54.615 00:11:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:54.874 00:11:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:54.874 00:11:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:55.133 [2024-07-16 00:11:42.037682] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:55.133 00:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:55.133 00:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:55.133 00:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:55.133 00:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:55.133 00:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:55.133 00:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:55.133 00:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:16:55.133 00:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:55.133 00:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:55.133 00:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:55.133 00:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.133 00:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:55.701 00:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:55.701 "name": "Existed_Raid", 00:16:55.701 "uuid": "8e17f6e3-630f-4173-a0bc-3f3e01e9fc8d", 00:16:55.701 "strip_size_kb": 0, 00:16:55.701 "state": "configuring", 00:16:55.701 "raid_level": "raid1", 00:16:55.701 "superblock": true, 00:16:55.701 "num_base_bdevs": 3, 00:16:55.701 "num_base_bdevs_discovered": 1, 00:16:55.701 "num_base_bdevs_operational": 3, 00:16:55.701 "base_bdevs_list": [ 00:16:55.701 { 00:16:55.701 "name": "BaseBdev1", 00:16:55.701 "uuid": "73dd70dc-2ad5-471e-96e6-c9508fc6934d", 00:16:55.701 "is_configured": true, 00:16:55.701 "data_offset": 2048, 00:16:55.701 "data_size": 63488 00:16:55.701 }, 00:16:55.701 { 00:16:55.701 "name": null, 00:16:55.701 "uuid": "fc9ae4b5-f649-48f2-9294-664550abab30", 00:16:55.701 "is_configured": false, 00:16:55.701 "data_offset": 2048, 00:16:55.701 "data_size": 63488 00:16:55.701 }, 00:16:55.701 { 00:16:55.701 "name": null, 00:16:55.701 "uuid": "3cf5b056-46d3-4adc-9977-c7fe6cfbb6ac", 00:16:55.701 "is_configured": false, 00:16:55.701 "data_offset": 2048, 00:16:55.701 "data_size": 63488 00:16:55.701 } 00:16:55.701 ] 00:16:55.701 }' 00:16:55.701 00:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:16:55.701 00:11:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:56.266 00:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:56.266 00:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:56.531 00:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:56.531 00:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:56.789 [2024-07-16 00:11:43.505603] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:56.789 00:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:56.789 00:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:56.789 00:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:56.789 00:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:56.789 00:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:56.789 00:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:56.789 00:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:56.789 00:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:56.789 00:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:16:56.789 00:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:56.789 00:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:56.789 00:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:57.049 00:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:57.049 "name": "Existed_Raid", 00:16:57.049 "uuid": "8e17f6e3-630f-4173-a0bc-3f3e01e9fc8d", 00:16:57.049 "strip_size_kb": 0, 00:16:57.049 "state": "configuring", 00:16:57.049 "raid_level": "raid1", 00:16:57.049 "superblock": true, 00:16:57.049 "num_base_bdevs": 3, 00:16:57.049 "num_base_bdevs_discovered": 2, 00:16:57.049 "num_base_bdevs_operational": 3, 00:16:57.050 "base_bdevs_list": [ 00:16:57.050 { 00:16:57.050 "name": "BaseBdev1", 00:16:57.050 "uuid": "73dd70dc-2ad5-471e-96e6-c9508fc6934d", 00:16:57.050 "is_configured": true, 00:16:57.050 "data_offset": 2048, 00:16:57.050 "data_size": 63488 00:16:57.050 }, 00:16:57.050 { 00:16:57.050 "name": null, 00:16:57.050 "uuid": "fc9ae4b5-f649-48f2-9294-664550abab30", 00:16:57.050 "is_configured": false, 00:16:57.050 "data_offset": 2048, 00:16:57.050 "data_size": 63488 00:16:57.050 }, 00:16:57.050 { 00:16:57.050 "name": "BaseBdev3", 00:16:57.050 "uuid": "3cf5b056-46d3-4adc-9977-c7fe6cfbb6ac", 00:16:57.050 "is_configured": true, 00:16:57.050 "data_offset": 2048, 00:16:57.050 "data_size": 63488 00:16:57.050 } 00:16:57.050 ] 00:16:57.050 }' 00:16:57.050 00:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:57.050 00:11:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:57.617 00:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.617 00:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:57.876 00:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:57.876 00:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:57.876 [2024-07-16 00:11:44.821128] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:58.134 00:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:58.134 00:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:58.134 00:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:58.134 00:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:58.134 00:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:58.134 00:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:58.134 00:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:58.134 00:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:58.134 00:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:58.134 00:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:58.134 00:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.134 00:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:58.392 00:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:58.392 "name": "Existed_Raid", 00:16:58.392 "uuid": "8e17f6e3-630f-4173-a0bc-3f3e01e9fc8d", 00:16:58.392 "strip_size_kb": 0, 00:16:58.392 "state": "configuring", 00:16:58.392 "raid_level": "raid1", 00:16:58.392 "superblock": true, 00:16:58.392 "num_base_bdevs": 3, 00:16:58.392 "num_base_bdevs_discovered": 1, 00:16:58.392 "num_base_bdevs_operational": 3, 00:16:58.392 "base_bdevs_list": [ 00:16:58.392 { 00:16:58.392 "name": null, 00:16:58.392 "uuid": "73dd70dc-2ad5-471e-96e6-c9508fc6934d", 00:16:58.392 "is_configured": false, 00:16:58.392 "data_offset": 2048, 00:16:58.393 "data_size": 63488 00:16:58.393 }, 00:16:58.393 { 00:16:58.393 "name": null, 00:16:58.393 "uuid": "fc9ae4b5-f649-48f2-9294-664550abab30", 00:16:58.393 "is_configured": false, 00:16:58.393 "data_offset": 2048, 00:16:58.393 "data_size": 63488 00:16:58.393 }, 00:16:58.393 { 00:16:58.393 "name": "BaseBdev3", 00:16:58.393 "uuid": "3cf5b056-46d3-4adc-9977-c7fe6cfbb6ac", 00:16:58.393 "is_configured": true, 00:16:58.393 "data_offset": 2048, 00:16:58.393 "data_size": 63488 00:16:58.393 } 00:16:58.393 ] 00:16:58.393 }' 00:16:58.393 00:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:58.393 00:11:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:58.959 00:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.959 00:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:16:59.218 00:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:59.218 00:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:59.477 [2024-07-16 00:11:46.177733] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:59.477 00:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:59.478 00:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:59.478 00:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:59.478 00:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:59.478 00:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:59.478 00:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:59.478 00:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:59.478 00:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:59.478 00:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:59.478 00:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:59.478 00:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.478 00:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:16:59.738 00:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:59.738 "name": "Existed_Raid", 00:16:59.738 "uuid": "8e17f6e3-630f-4173-a0bc-3f3e01e9fc8d", 00:16:59.738 "strip_size_kb": 0, 00:16:59.738 "state": "configuring", 00:16:59.738 "raid_level": "raid1", 00:16:59.738 "superblock": true, 00:16:59.738 "num_base_bdevs": 3, 00:16:59.738 "num_base_bdevs_discovered": 2, 00:16:59.738 "num_base_bdevs_operational": 3, 00:16:59.738 "base_bdevs_list": [ 00:16:59.738 { 00:16:59.738 "name": null, 00:16:59.738 "uuid": "73dd70dc-2ad5-471e-96e6-c9508fc6934d", 00:16:59.738 "is_configured": false, 00:16:59.738 "data_offset": 2048, 00:16:59.738 "data_size": 63488 00:16:59.738 }, 00:16:59.738 { 00:16:59.738 "name": "BaseBdev2", 00:16:59.738 "uuid": "fc9ae4b5-f649-48f2-9294-664550abab30", 00:16:59.738 "is_configured": true, 00:16:59.738 "data_offset": 2048, 00:16:59.738 "data_size": 63488 00:16:59.738 }, 00:16:59.738 { 00:16:59.738 "name": "BaseBdev3", 00:16:59.738 "uuid": "3cf5b056-46d3-4adc-9977-c7fe6cfbb6ac", 00:16:59.738 "is_configured": true, 00:16:59.738 "data_offset": 2048, 00:16:59.738 "data_size": 63488 00:16:59.738 } 00:16:59.738 ] 00:16:59.738 }' 00:16:59.738 00:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:59.738 00:11:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:00.305 00:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:00.305 00:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:00.565 00:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:00.565 00:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:00.565 00:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:00.565 00:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 73dd70dc-2ad5-471e-96e6-c9508fc6934d 00:17:00.824 [2024-07-16 00:11:47.746938] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:00.824 [2024-07-16 00:11:47.747124] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10081b0 00:17:00.824 [2024-07-16 00:11:47.747138] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:00.824 [2024-07-16 00:11:47.747329] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11c44f0 00:17:00.824 [2024-07-16 00:11:47.747467] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10081b0 00:17:00.824 [2024-07-16 00:11:47.747478] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x10081b0 00:17:00.824 [2024-07-16 00:11:47.747587] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:00.824 NewBaseBdev 00:17:00.824 00:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:00.825 00:11:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:00.825 00:11:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:00.825 00:11:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:00.825 00:11:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:00.825 
00:11:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:00.825 00:11:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:01.084 00:11:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:01.343 [ 00:17:01.343 { 00:17:01.343 "name": "NewBaseBdev", 00:17:01.343 "aliases": [ 00:17:01.343 "73dd70dc-2ad5-471e-96e6-c9508fc6934d" 00:17:01.343 ], 00:17:01.343 "product_name": "Malloc disk", 00:17:01.343 "block_size": 512, 00:17:01.343 "num_blocks": 65536, 00:17:01.343 "uuid": "73dd70dc-2ad5-471e-96e6-c9508fc6934d", 00:17:01.343 "assigned_rate_limits": { 00:17:01.343 "rw_ios_per_sec": 0, 00:17:01.343 "rw_mbytes_per_sec": 0, 00:17:01.343 "r_mbytes_per_sec": 0, 00:17:01.343 "w_mbytes_per_sec": 0 00:17:01.343 }, 00:17:01.343 "claimed": true, 00:17:01.343 "claim_type": "exclusive_write", 00:17:01.343 "zoned": false, 00:17:01.343 "supported_io_types": { 00:17:01.343 "read": true, 00:17:01.343 "write": true, 00:17:01.343 "unmap": true, 00:17:01.343 "flush": true, 00:17:01.343 "reset": true, 00:17:01.343 "nvme_admin": false, 00:17:01.344 "nvme_io": false, 00:17:01.344 "nvme_io_md": false, 00:17:01.344 "write_zeroes": true, 00:17:01.344 "zcopy": true, 00:17:01.344 "get_zone_info": false, 00:17:01.344 "zone_management": false, 00:17:01.344 "zone_append": false, 00:17:01.344 "compare": false, 00:17:01.344 "compare_and_write": false, 00:17:01.344 "abort": true, 00:17:01.344 "seek_hole": false, 00:17:01.344 "seek_data": false, 00:17:01.344 "copy": true, 00:17:01.344 "nvme_iov_md": false 00:17:01.344 }, 00:17:01.344 "memory_domains": [ 00:17:01.344 { 00:17:01.344 "dma_device_id": "system", 00:17:01.344 "dma_device_type": 1 00:17:01.344 
}, 00:17:01.344 { 00:17:01.344 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.344 "dma_device_type": 2 00:17:01.344 } 00:17:01.344 ], 00:17:01.344 "driver_specific": {} 00:17:01.344 } 00:17:01.344 ] 00:17:01.344 00:11:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:01.344 00:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:01.344 00:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:01.344 00:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:01.344 00:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:01.344 00:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:01.344 00:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:01.344 00:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:01.344 00:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:01.344 00:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:01.344 00:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:01.344 00:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:01.344 00:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:01.603 00:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:01.603 "name": "Existed_Raid", 00:17:01.603 "uuid": 
"8e17f6e3-630f-4173-a0bc-3f3e01e9fc8d", 00:17:01.603 "strip_size_kb": 0, 00:17:01.603 "state": "online", 00:17:01.603 "raid_level": "raid1", 00:17:01.603 "superblock": true, 00:17:01.603 "num_base_bdevs": 3, 00:17:01.603 "num_base_bdevs_discovered": 3, 00:17:01.603 "num_base_bdevs_operational": 3, 00:17:01.603 "base_bdevs_list": [ 00:17:01.603 { 00:17:01.603 "name": "NewBaseBdev", 00:17:01.603 "uuid": "73dd70dc-2ad5-471e-96e6-c9508fc6934d", 00:17:01.603 "is_configured": true, 00:17:01.603 "data_offset": 2048, 00:17:01.603 "data_size": 63488 00:17:01.603 }, 00:17:01.603 { 00:17:01.603 "name": "BaseBdev2", 00:17:01.603 "uuid": "fc9ae4b5-f649-48f2-9294-664550abab30", 00:17:01.603 "is_configured": true, 00:17:01.603 "data_offset": 2048, 00:17:01.603 "data_size": 63488 00:17:01.603 }, 00:17:01.603 { 00:17:01.603 "name": "BaseBdev3", 00:17:01.603 "uuid": "3cf5b056-46d3-4adc-9977-c7fe6cfbb6ac", 00:17:01.603 "is_configured": true, 00:17:01.603 "data_offset": 2048, 00:17:01.603 "data_size": 63488 00:17:01.603 } 00:17:01.603 ] 00:17:01.603 }' 00:17:01.603 00:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:01.603 00:11:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:02.170 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:02.170 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:02.170 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:02.170 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:02.170 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:02.170 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:02.171 00:11:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:02.171 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:02.430 [2024-07-16 00:11:49.355518] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:02.690 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:02.690 "name": "Existed_Raid", 00:17:02.690 "aliases": [ 00:17:02.690 "8e17f6e3-630f-4173-a0bc-3f3e01e9fc8d" 00:17:02.690 ], 00:17:02.690 "product_name": "Raid Volume", 00:17:02.690 "block_size": 512, 00:17:02.690 "num_blocks": 63488, 00:17:02.690 "uuid": "8e17f6e3-630f-4173-a0bc-3f3e01e9fc8d", 00:17:02.690 "assigned_rate_limits": { 00:17:02.690 "rw_ios_per_sec": 0, 00:17:02.690 "rw_mbytes_per_sec": 0, 00:17:02.690 "r_mbytes_per_sec": 0, 00:17:02.690 "w_mbytes_per_sec": 0 00:17:02.690 }, 00:17:02.690 "claimed": false, 00:17:02.690 "zoned": false, 00:17:02.690 "supported_io_types": { 00:17:02.690 "read": true, 00:17:02.690 "write": true, 00:17:02.690 "unmap": false, 00:17:02.690 "flush": false, 00:17:02.690 "reset": true, 00:17:02.690 "nvme_admin": false, 00:17:02.690 "nvme_io": false, 00:17:02.690 "nvme_io_md": false, 00:17:02.690 "write_zeroes": true, 00:17:02.690 "zcopy": false, 00:17:02.690 "get_zone_info": false, 00:17:02.690 "zone_management": false, 00:17:02.690 "zone_append": false, 00:17:02.690 "compare": false, 00:17:02.690 "compare_and_write": false, 00:17:02.690 "abort": false, 00:17:02.690 "seek_hole": false, 00:17:02.690 "seek_data": false, 00:17:02.690 "copy": false, 00:17:02.690 "nvme_iov_md": false 00:17:02.690 }, 00:17:02.690 "memory_domains": [ 00:17:02.690 { 00:17:02.690 "dma_device_id": "system", 00:17:02.690 "dma_device_type": 1 00:17:02.690 }, 00:17:02.690 { 00:17:02.690 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.690 
"dma_device_type": 2 00:17:02.690 }, 00:17:02.690 { 00:17:02.690 "dma_device_id": "system", 00:17:02.690 "dma_device_type": 1 00:17:02.690 }, 00:17:02.690 { 00:17:02.690 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.690 "dma_device_type": 2 00:17:02.690 }, 00:17:02.690 { 00:17:02.690 "dma_device_id": "system", 00:17:02.690 "dma_device_type": 1 00:17:02.690 }, 00:17:02.690 { 00:17:02.690 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.690 "dma_device_type": 2 00:17:02.690 } 00:17:02.690 ], 00:17:02.690 "driver_specific": { 00:17:02.690 "raid": { 00:17:02.690 "uuid": "8e17f6e3-630f-4173-a0bc-3f3e01e9fc8d", 00:17:02.690 "strip_size_kb": 0, 00:17:02.690 "state": "online", 00:17:02.690 "raid_level": "raid1", 00:17:02.690 "superblock": true, 00:17:02.690 "num_base_bdevs": 3, 00:17:02.690 "num_base_bdevs_discovered": 3, 00:17:02.690 "num_base_bdevs_operational": 3, 00:17:02.690 "base_bdevs_list": [ 00:17:02.690 { 00:17:02.690 "name": "NewBaseBdev", 00:17:02.690 "uuid": "73dd70dc-2ad5-471e-96e6-c9508fc6934d", 00:17:02.690 "is_configured": true, 00:17:02.690 "data_offset": 2048, 00:17:02.690 "data_size": 63488 00:17:02.690 }, 00:17:02.690 { 00:17:02.690 "name": "BaseBdev2", 00:17:02.690 "uuid": "fc9ae4b5-f649-48f2-9294-664550abab30", 00:17:02.690 "is_configured": true, 00:17:02.690 "data_offset": 2048, 00:17:02.690 "data_size": 63488 00:17:02.690 }, 00:17:02.690 { 00:17:02.690 "name": "BaseBdev3", 00:17:02.690 "uuid": "3cf5b056-46d3-4adc-9977-c7fe6cfbb6ac", 00:17:02.690 "is_configured": true, 00:17:02.690 "data_offset": 2048, 00:17:02.690 "data_size": 63488 00:17:02.690 } 00:17:02.690 ] 00:17:02.690 } 00:17:02.690 } 00:17:02.690 }' 00:17:02.690 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:02.690 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:02.690 BaseBdev2 00:17:02.690 
BaseBdev3' 00:17:02.690 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:02.690 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:02.690 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:02.950 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:02.950 "name": "NewBaseBdev", 00:17:02.950 "aliases": [ 00:17:02.950 "73dd70dc-2ad5-471e-96e6-c9508fc6934d" 00:17:02.950 ], 00:17:02.950 "product_name": "Malloc disk", 00:17:02.950 "block_size": 512, 00:17:02.950 "num_blocks": 65536, 00:17:02.950 "uuid": "73dd70dc-2ad5-471e-96e6-c9508fc6934d", 00:17:02.950 "assigned_rate_limits": { 00:17:02.950 "rw_ios_per_sec": 0, 00:17:02.950 "rw_mbytes_per_sec": 0, 00:17:02.950 "r_mbytes_per_sec": 0, 00:17:02.950 "w_mbytes_per_sec": 0 00:17:02.950 }, 00:17:02.950 "claimed": true, 00:17:02.950 "claim_type": "exclusive_write", 00:17:02.950 "zoned": false, 00:17:02.950 "supported_io_types": { 00:17:02.950 "read": true, 00:17:02.950 "write": true, 00:17:02.950 "unmap": true, 00:17:02.950 "flush": true, 00:17:02.950 "reset": true, 00:17:02.950 "nvme_admin": false, 00:17:02.950 "nvme_io": false, 00:17:02.950 "nvme_io_md": false, 00:17:02.950 "write_zeroes": true, 00:17:02.950 "zcopy": true, 00:17:02.950 "get_zone_info": false, 00:17:02.950 "zone_management": false, 00:17:02.950 "zone_append": false, 00:17:02.950 "compare": false, 00:17:02.950 "compare_and_write": false, 00:17:02.950 "abort": true, 00:17:02.950 "seek_hole": false, 00:17:02.950 "seek_data": false, 00:17:02.950 "copy": true, 00:17:02.950 "nvme_iov_md": false 00:17:02.950 }, 00:17:02.950 "memory_domains": [ 00:17:02.950 { 00:17:02.950 "dma_device_id": "system", 00:17:02.950 "dma_device_type": 1 00:17:02.950 }, 00:17:02.950 { 
00:17:02.950 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.950 "dma_device_type": 2 00:17:02.950 } 00:17:02.950 ], 00:17:02.950 "driver_specific": {} 00:17:02.950 }' 00:17:02.950 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.950 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.950 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:02.950 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.950 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.950 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:02.950 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.950 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.210 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:03.210 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.210 00:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.210 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:03.210 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:03.210 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:03.210 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:03.470 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:03.470 "name": 
"BaseBdev2", 00:17:03.470 "aliases": [ 00:17:03.470 "fc9ae4b5-f649-48f2-9294-664550abab30" 00:17:03.470 ], 00:17:03.470 "product_name": "Malloc disk", 00:17:03.470 "block_size": 512, 00:17:03.470 "num_blocks": 65536, 00:17:03.470 "uuid": "fc9ae4b5-f649-48f2-9294-664550abab30", 00:17:03.470 "assigned_rate_limits": { 00:17:03.470 "rw_ios_per_sec": 0, 00:17:03.470 "rw_mbytes_per_sec": 0, 00:17:03.470 "r_mbytes_per_sec": 0, 00:17:03.470 "w_mbytes_per_sec": 0 00:17:03.470 }, 00:17:03.470 "claimed": true, 00:17:03.470 "claim_type": "exclusive_write", 00:17:03.470 "zoned": false, 00:17:03.470 "supported_io_types": { 00:17:03.470 "read": true, 00:17:03.470 "write": true, 00:17:03.470 "unmap": true, 00:17:03.470 "flush": true, 00:17:03.470 "reset": true, 00:17:03.470 "nvme_admin": false, 00:17:03.470 "nvme_io": false, 00:17:03.470 "nvme_io_md": false, 00:17:03.470 "write_zeroes": true, 00:17:03.470 "zcopy": true, 00:17:03.470 "get_zone_info": false, 00:17:03.470 "zone_management": false, 00:17:03.470 "zone_append": false, 00:17:03.470 "compare": false, 00:17:03.470 "compare_and_write": false, 00:17:03.470 "abort": true, 00:17:03.470 "seek_hole": false, 00:17:03.470 "seek_data": false, 00:17:03.470 "copy": true, 00:17:03.470 "nvme_iov_md": false 00:17:03.470 }, 00:17:03.470 "memory_domains": [ 00:17:03.470 { 00:17:03.470 "dma_device_id": "system", 00:17:03.470 "dma_device_type": 1 00:17:03.470 }, 00:17:03.470 { 00:17:03.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.470 "dma_device_type": 2 00:17:03.470 } 00:17:03.470 ], 00:17:03.470 "driver_specific": {} 00:17:03.470 }' 00:17:03.470 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.470 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.470 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:03.470 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:17:03.730 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.730 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:03.730 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.730 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.730 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:03.730 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.730 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.730 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:03.730 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:03.730 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:03.730 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:03.989 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:03.989 "name": "BaseBdev3", 00:17:03.989 "aliases": [ 00:17:03.989 "3cf5b056-46d3-4adc-9977-c7fe6cfbb6ac" 00:17:03.989 ], 00:17:03.989 "product_name": "Malloc disk", 00:17:03.989 "block_size": 512, 00:17:03.989 "num_blocks": 65536, 00:17:03.989 "uuid": "3cf5b056-46d3-4adc-9977-c7fe6cfbb6ac", 00:17:03.989 "assigned_rate_limits": { 00:17:03.989 "rw_ios_per_sec": 0, 00:17:03.989 "rw_mbytes_per_sec": 0, 00:17:03.989 "r_mbytes_per_sec": 0, 00:17:03.989 "w_mbytes_per_sec": 0 00:17:03.989 }, 00:17:03.989 "claimed": true, 00:17:03.989 "claim_type": "exclusive_write", 00:17:03.989 "zoned": 
false, 00:17:03.989 "supported_io_types": { 00:17:03.989 "read": true, 00:17:03.989 "write": true, 00:17:03.989 "unmap": true, 00:17:03.989 "flush": true, 00:17:03.989 "reset": true, 00:17:03.989 "nvme_admin": false, 00:17:03.989 "nvme_io": false, 00:17:03.989 "nvme_io_md": false, 00:17:03.989 "write_zeroes": true, 00:17:03.989 "zcopy": true, 00:17:03.989 "get_zone_info": false, 00:17:03.989 "zone_management": false, 00:17:03.989 "zone_append": false, 00:17:03.989 "compare": false, 00:17:03.989 "compare_and_write": false, 00:17:03.989 "abort": true, 00:17:03.989 "seek_hole": false, 00:17:03.989 "seek_data": false, 00:17:03.989 "copy": true, 00:17:03.989 "nvme_iov_md": false 00:17:03.989 }, 00:17:03.989 "memory_domains": [ 00:17:03.989 { 00:17:03.989 "dma_device_id": "system", 00:17:03.989 "dma_device_type": 1 00:17:03.989 }, 00:17:03.989 { 00:17:03.989 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.990 "dma_device_type": 2 00:17:03.990 } 00:17:03.990 ], 00:17:03.990 "driver_specific": {} 00:17:03.990 }' 00:17:03.990 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:04.249 00:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:04.249 00:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:04.249 00:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:04.249 00:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:04.249 00:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:04.249 00:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:04.249 00:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:04.509 00:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:04.509 00:11:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:04.509 00:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:04.509 00:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:04.509 00:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:05.077 [2024-07-16 00:11:51.797724] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:05.077 [2024-07-16 00:11:51.797759] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:05.077 [2024-07-16 00:11:51.797827] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:05.077 [2024-07-16 00:11:51.798140] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:05.077 [2024-07-16 00:11:51.798154] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10081b0 name Existed_Raid, state offline 00:17:05.077 00:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 3539908 00:17:05.077 00:11:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3539908 ']' 00:17:05.077 00:11:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 3539908 00:17:05.077 00:11:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:17:05.077 00:11:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:05.077 00:11:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3539908 00:17:05.077 00:11:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 
00:17:05.077 00:11:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:05.077 00:11:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3539908' 00:17:05.077 killing process with pid 3539908 00:17:05.077 00:11:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 3539908 00:17:05.077 [2024-07-16 00:11:51.876763] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:05.077 00:11:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 3539908 00:17:05.077 [2024-07-16 00:11:51.938905] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:05.336 00:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:17:05.336 00:17:05.336 real 0m30.851s 00:17:05.336 user 0m56.461s 00:17:05.336 sys 0m5.461s 00:17:05.336 00:11:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:05.336 00:11:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:05.336 ************************************ 00:17:05.336 END TEST raid_state_function_test_sb 00:17:05.336 ************************************ 00:17:05.596 00:11:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:05.596 00:11:52 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:17:05.596 00:11:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:17:05.596 00:11:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:05.596 00:11:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:05.596 ************************************ 00:17:05.596 START TEST raid_superblock_test 00:17:05.596 ************************************ 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # 
raid_superblock_test raid1 3 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=3544479 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 3544479 /var/tmp/spdk-raid.sock 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 3544479 ']' 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:05.596 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:05.596 00:11:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:05.596 [2024-07-16 00:11:52.426438] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:17:05.596 [2024-07-16 00:11:52.426510] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3544479 ] 00:17:05.856 [2024-07-16 00:11:52.555132] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:05.856 [2024-07-16 00:11:52.657632] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:05.856 [2024-07-16 00:11:52.728533] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:05.856 [2024-07-16 00:11:52.728571] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:06.433 00:11:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:06.433 00:11:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:17:06.433 00:11:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:17:06.433 00:11:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:06.433 00:11:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:17:06.433 00:11:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:17:06.433 00:11:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:17:06.433 00:11:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:06.433 00:11:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:06.433 00:11:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:06.433 00:11:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:17:06.720 malloc1 00:17:06.720 00:11:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:06.984 [2024-07-16 00:11:53.831604] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:06.984 [2024-07-16 00:11:53.831651] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:06.984 [2024-07-16 00:11:53.831671] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb77570 00:17:06.984 [2024-07-16 00:11:53.831684] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:06.984 [2024-07-16 00:11:53.833327] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:06.984 [2024-07-16 00:11:53.833356] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:06.984 pt1 00:17:06.984 00:11:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:06.984 00:11:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:06.984 00:11:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:17:06.984 00:11:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:17:06.984 00:11:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:17:06.984 00:11:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:06.984 00:11:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:06.984 00:11:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:06.984 00:11:53 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:17:07.553 malloc2 00:17:07.553 00:11:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:08.121 [2024-07-16 00:11:54.859095] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:08.121 [2024-07-16 00:11:54.859157] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:08.121 [2024-07-16 00:11:54.859175] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb78970 00:17:08.121 [2024-07-16 00:11:54.859188] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:08.121 [2024-07-16 00:11:54.860916] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:08.121 [2024-07-16 00:11:54.860952] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:08.121 pt2 00:17:08.121 00:11:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:08.121 00:11:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:08.121 00:11:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:17:08.121 00:11:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:17:08.121 00:11:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:17:08.121 00:11:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:08.121 00:11:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:08.121 00:11:54 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:08.121 00:11:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:17:08.689 malloc3 00:17:08.689 00:11:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:08.948 [2024-07-16 00:11:55.895770] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:08.948 [2024-07-16 00:11:55.895852] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:08.948 [2024-07-16 00:11:55.895877] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd0f340 00:17:08.948 [2024-07-16 00:11:55.895890] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:08.948 [2024-07-16 00:11:55.898015] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:08.948 [2024-07-16 00:11:55.898055] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:09.207 pt3 00:17:09.207 00:11:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:09.207 00:11:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:09.207 00:11:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:17:09.466 [2024-07-16 00:11:56.413121] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:09.466 [2024-07-16 00:11:56.414563] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:17:09.466 [2024-07-16 00:11:56.414625] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:09.466 [2024-07-16 00:11:56.414795] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb6fea0 00:17:09.466 [2024-07-16 00:11:56.414807] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:09.466 [2024-07-16 00:11:56.415030] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb77240 00:17:09.466 [2024-07-16 00:11:56.415199] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb6fea0 00:17:09.466 [2024-07-16 00:11:56.415210] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb6fea0 00:17:09.466 [2024-07-16 00:11:56.415325] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:09.725 00:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:09.725 00:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:09.725 00:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:09.725 00:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:09.725 00:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:09.725 00:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:09.725 00:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:09.725 00:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:09.725 00:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:09.725 00:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:09.725 00:11:56 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.725 00:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:09.985 00:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:09.985 "name": "raid_bdev1", 00:17:09.985 "uuid": "dc0e6dce-cd97-4679-9f85-8cd9511fb301", 00:17:09.985 "strip_size_kb": 0, 00:17:09.985 "state": "online", 00:17:09.985 "raid_level": "raid1", 00:17:09.985 "superblock": true, 00:17:09.985 "num_base_bdevs": 3, 00:17:09.985 "num_base_bdevs_discovered": 3, 00:17:09.985 "num_base_bdevs_operational": 3, 00:17:09.985 "base_bdevs_list": [ 00:17:09.985 { 00:17:09.985 "name": "pt1", 00:17:09.985 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:09.985 "is_configured": true, 00:17:09.985 "data_offset": 2048, 00:17:09.985 "data_size": 63488 00:17:09.985 }, 00:17:09.985 { 00:17:09.985 "name": "pt2", 00:17:09.985 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:09.985 "is_configured": true, 00:17:09.985 "data_offset": 2048, 00:17:09.985 "data_size": 63488 00:17:09.985 }, 00:17:09.985 { 00:17:09.985 "name": "pt3", 00:17:09.985 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:09.985 "is_configured": true, 00:17:09.985 "data_offset": 2048, 00:17:09.985 "data_size": 63488 00:17:09.985 } 00:17:09.985 ] 00:17:09.985 }' 00:17:09.985 00:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:09.985 00:11:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:10.554 00:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:17:10.554 00:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:10.554 00:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:17:10.554 00:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:10.554 00:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:10.554 00:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:10.554 00:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:10.554 00:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:10.813 [2024-07-16 00:11:57.540336] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:10.813 00:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:10.813 "name": "raid_bdev1", 00:17:10.813 "aliases": [ 00:17:10.813 "dc0e6dce-cd97-4679-9f85-8cd9511fb301" 00:17:10.813 ], 00:17:10.813 "product_name": "Raid Volume", 00:17:10.813 "block_size": 512, 00:17:10.813 "num_blocks": 63488, 00:17:10.813 "uuid": "dc0e6dce-cd97-4679-9f85-8cd9511fb301", 00:17:10.813 "assigned_rate_limits": { 00:17:10.813 "rw_ios_per_sec": 0, 00:17:10.813 "rw_mbytes_per_sec": 0, 00:17:10.813 "r_mbytes_per_sec": 0, 00:17:10.813 "w_mbytes_per_sec": 0 00:17:10.813 }, 00:17:10.813 "claimed": false, 00:17:10.813 "zoned": false, 00:17:10.813 "supported_io_types": { 00:17:10.813 "read": true, 00:17:10.813 "write": true, 00:17:10.813 "unmap": false, 00:17:10.813 "flush": false, 00:17:10.813 "reset": true, 00:17:10.813 "nvme_admin": false, 00:17:10.813 "nvme_io": false, 00:17:10.813 "nvme_io_md": false, 00:17:10.813 "write_zeroes": true, 00:17:10.813 "zcopy": false, 00:17:10.813 "get_zone_info": false, 00:17:10.813 "zone_management": false, 00:17:10.813 "zone_append": false, 00:17:10.813 "compare": false, 00:17:10.813 "compare_and_write": false, 00:17:10.813 "abort": false, 00:17:10.813 "seek_hole": false, 
00:17:10.813 "seek_data": false, 00:17:10.813 "copy": false, 00:17:10.813 "nvme_iov_md": false 00:17:10.813 }, 00:17:10.813 "memory_domains": [ 00:17:10.813 { 00:17:10.813 "dma_device_id": "system", 00:17:10.813 "dma_device_type": 1 00:17:10.813 }, 00:17:10.813 { 00:17:10.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:10.813 "dma_device_type": 2 00:17:10.813 }, 00:17:10.813 { 00:17:10.813 "dma_device_id": "system", 00:17:10.813 "dma_device_type": 1 00:17:10.813 }, 00:17:10.813 { 00:17:10.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:10.813 "dma_device_type": 2 00:17:10.813 }, 00:17:10.813 { 00:17:10.813 "dma_device_id": "system", 00:17:10.813 "dma_device_type": 1 00:17:10.813 }, 00:17:10.813 { 00:17:10.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:10.813 "dma_device_type": 2 00:17:10.813 } 00:17:10.813 ], 00:17:10.813 "driver_specific": { 00:17:10.813 "raid": { 00:17:10.813 "uuid": "dc0e6dce-cd97-4679-9f85-8cd9511fb301", 00:17:10.813 "strip_size_kb": 0, 00:17:10.813 "state": "online", 00:17:10.813 "raid_level": "raid1", 00:17:10.813 "superblock": true, 00:17:10.813 "num_base_bdevs": 3, 00:17:10.813 "num_base_bdevs_discovered": 3, 00:17:10.813 "num_base_bdevs_operational": 3, 00:17:10.813 "base_bdevs_list": [ 00:17:10.813 { 00:17:10.813 "name": "pt1", 00:17:10.813 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:10.813 "is_configured": true, 00:17:10.813 "data_offset": 2048, 00:17:10.813 "data_size": 63488 00:17:10.813 }, 00:17:10.813 { 00:17:10.813 "name": "pt2", 00:17:10.813 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:10.813 "is_configured": true, 00:17:10.813 "data_offset": 2048, 00:17:10.813 "data_size": 63488 00:17:10.813 }, 00:17:10.813 { 00:17:10.813 "name": "pt3", 00:17:10.813 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:10.813 "is_configured": true, 00:17:10.813 "data_offset": 2048, 00:17:10.813 "data_size": 63488 00:17:10.813 } 00:17:10.813 ] 00:17:10.813 } 00:17:10.813 } 00:17:10.813 }' 00:17:10.813 00:11:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:10.813 00:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:10.813 pt2 00:17:10.813 pt3' 00:17:10.813 00:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:10.814 00:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:10.814 00:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:11.073 00:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:11.073 "name": "pt1", 00:17:11.073 "aliases": [ 00:17:11.073 "00000000-0000-0000-0000-000000000001" 00:17:11.073 ], 00:17:11.073 "product_name": "passthru", 00:17:11.073 "block_size": 512, 00:17:11.073 "num_blocks": 65536, 00:17:11.073 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:11.073 "assigned_rate_limits": { 00:17:11.073 "rw_ios_per_sec": 0, 00:17:11.073 "rw_mbytes_per_sec": 0, 00:17:11.073 "r_mbytes_per_sec": 0, 00:17:11.073 "w_mbytes_per_sec": 0 00:17:11.073 }, 00:17:11.073 "claimed": true, 00:17:11.073 "claim_type": "exclusive_write", 00:17:11.073 "zoned": false, 00:17:11.073 "supported_io_types": { 00:17:11.073 "read": true, 00:17:11.073 "write": true, 00:17:11.073 "unmap": true, 00:17:11.073 "flush": true, 00:17:11.073 "reset": true, 00:17:11.073 "nvme_admin": false, 00:17:11.073 "nvme_io": false, 00:17:11.073 "nvme_io_md": false, 00:17:11.073 "write_zeroes": true, 00:17:11.073 "zcopy": true, 00:17:11.073 "get_zone_info": false, 00:17:11.073 "zone_management": false, 00:17:11.073 "zone_append": false, 00:17:11.073 "compare": false, 00:17:11.073 "compare_and_write": false, 00:17:11.073 "abort": true, 00:17:11.073 "seek_hole": false, 00:17:11.073 "seek_data": false, 
00:17:11.073 "copy": true, 00:17:11.073 "nvme_iov_md": false 00:17:11.073 }, 00:17:11.073 "memory_domains": [ 00:17:11.073 { 00:17:11.073 "dma_device_id": "system", 00:17:11.073 "dma_device_type": 1 00:17:11.073 }, 00:17:11.073 { 00:17:11.073 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:11.073 "dma_device_type": 2 00:17:11.073 } 00:17:11.073 ], 00:17:11.073 "driver_specific": { 00:17:11.073 "passthru": { 00:17:11.073 "name": "pt1", 00:17:11.073 "base_bdev_name": "malloc1" 00:17:11.073 } 00:17:11.073 } 00:17:11.073 }' 00:17:11.073 00:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:11.073 00:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:11.073 00:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:11.073 00:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:11.073 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:11.332 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:11.332 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:11.332 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:11.332 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:11.332 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:11.332 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:11.590 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:11.590 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:11.590 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt2 00:17:11.590 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:11.849 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:11.849 "name": "pt2", 00:17:11.849 "aliases": [ 00:17:11.849 "00000000-0000-0000-0000-000000000002" 00:17:11.849 ], 00:17:11.849 "product_name": "passthru", 00:17:11.849 "block_size": 512, 00:17:11.849 "num_blocks": 65536, 00:17:11.849 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:11.849 "assigned_rate_limits": { 00:17:11.849 "rw_ios_per_sec": 0, 00:17:11.849 "rw_mbytes_per_sec": 0, 00:17:11.849 "r_mbytes_per_sec": 0, 00:17:11.849 "w_mbytes_per_sec": 0 00:17:11.849 }, 00:17:11.849 "claimed": true, 00:17:11.849 "claim_type": "exclusive_write", 00:17:11.849 "zoned": false, 00:17:11.849 "supported_io_types": { 00:17:11.849 "read": true, 00:17:11.849 "write": true, 00:17:11.849 "unmap": true, 00:17:11.849 "flush": true, 00:17:11.849 "reset": true, 00:17:11.849 "nvme_admin": false, 00:17:11.849 "nvme_io": false, 00:17:11.849 "nvme_io_md": false, 00:17:11.849 "write_zeroes": true, 00:17:11.849 "zcopy": true, 00:17:11.849 "get_zone_info": false, 00:17:11.849 "zone_management": false, 00:17:11.849 "zone_append": false, 00:17:11.849 "compare": false, 00:17:11.849 "compare_and_write": false, 00:17:11.849 "abort": true, 00:17:11.849 "seek_hole": false, 00:17:11.849 "seek_data": false, 00:17:11.849 "copy": true, 00:17:11.849 "nvme_iov_md": false 00:17:11.849 }, 00:17:11.849 "memory_domains": [ 00:17:11.849 { 00:17:11.849 "dma_device_id": "system", 00:17:11.849 "dma_device_type": 1 00:17:11.849 }, 00:17:11.849 { 00:17:11.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:11.849 "dma_device_type": 2 00:17:11.849 } 00:17:11.849 ], 00:17:11.849 "driver_specific": { 00:17:11.849 "passthru": { 00:17:11.849 "name": "pt2", 00:17:11.849 "base_bdev_name": "malloc2" 00:17:11.849 } 00:17:11.849 } 00:17:11.849 }' 00:17:11.849 00:11:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:11.849 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:11.849 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:11.849 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:11.849 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:11.849 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:11.849 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:12.108 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:12.108 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:12.108 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:12.108 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:12.108 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:12.108 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:12.108 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:12.108 00:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:12.368 00:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:12.368 "name": "pt3", 00:17:12.368 "aliases": [ 00:17:12.368 "00000000-0000-0000-0000-000000000003" 00:17:12.368 ], 00:17:12.368 "product_name": "passthru", 00:17:12.368 "block_size": 512, 00:17:12.368 "num_blocks": 65536, 00:17:12.368 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:12.368 "assigned_rate_limits": { 
00:17:12.368 "rw_ios_per_sec": 0, 00:17:12.368 "rw_mbytes_per_sec": 0, 00:17:12.368 "r_mbytes_per_sec": 0, 00:17:12.368 "w_mbytes_per_sec": 0 00:17:12.368 }, 00:17:12.368 "claimed": true, 00:17:12.368 "claim_type": "exclusive_write", 00:17:12.368 "zoned": false, 00:17:12.368 "supported_io_types": { 00:17:12.368 "read": true, 00:17:12.368 "write": true, 00:17:12.368 "unmap": true, 00:17:12.368 "flush": true, 00:17:12.368 "reset": true, 00:17:12.368 "nvme_admin": false, 00:17:12.368 "nvme_io": false, 00:17:12.368 "nvme_io_md": false, 00:17:12.368 "write_zeroes": true, 00:17:12.368 "zcopy": true, 00:17:12.368 "get_zone_info": false, 00:17:12.368 "zone_management": false, 00:17:12.368 "zone_append": false, 00:17:12.368 "compare": false, 00:17:12.368 "compare_and_write": false, 00:17:12.368 "abort": true, 00:17:12.368 "seek_hole": false, 00:17:12.368 "seek_data": false, 00:17:12.368 "copy": true, 00:17:12.368 "nvme_iov_md": false 00:17:12.368 }, 00:17:12.368 "memory_domains": [ 00:17:12.368 { 00:17:12.368 "dma_device_id": "system", 00:17:12.368 "dma_device_type": 1 00:17:12.368 }, 00:17:12.368 { 00:17:12.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.368 "dma_device_type": 2 00:17:12.368 } 00:17:12.368 ], 00:17:12.368 "driver_specific": { 00:17:12.368 "passthru": { 00:17:12.368 "name": "pt3", 00:17:12.368 "base_bdev_name": "malloc3" 00:17:12.368 } 00:17:12.368 } 00:17:12.368 }' 00:17:12.368 00:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:12.368 00:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:12.368 00:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:12.368 00:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:12.627 00:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:12.627 00:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:17:12.627 00:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:12.628 00:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:12.628 00:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:12.628 00:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:12.628 00:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:12.628 00:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:12.628 00:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:12.628 00:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:17:12.887 [2024-07-16 00:11:59.754192] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:12.887 00:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=dc0e6dce-cd97-4679-9f85-8cd9511fb301 00:17:12.887 00:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z dc0e6dce-cd97-4679-9f85-8cd9511fb301 ']' 00:17:12.887 00:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:13.145 [2024-07-16 00:11:59.998556] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:13.145 [2024-07-16 00:11:59.998580] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:13.145 [2024-07-16 00:11:59.998638] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:13.145 [2024-07-16 00:11:59.998720] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:17:13.145 [2024-07-16 00:11:59.998733] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb6fea0 name raid_bdev1, state offline 00:17:13.145 00:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:13.145 00:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:17:13.404 00:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:17:13.404 00:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:17:13.404 00:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:13.404 00:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:13.663 00:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:13.663 00:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:13.663 00:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:13.663 00:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:13.923 00:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:13.923 00:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:14.182 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:17:14.182 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:14.182 00:12:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:17:14.182 00:12:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:14.182 00:12:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:14.182 00:12:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:14.182 00:12:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:14.182 00:12:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:14.182 00:12:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:14.182 00:12:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:14.182 00:12:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:14.182 00:12:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:14.182 00:12:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r 
raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:14.441 [2024-07-16 00:12:01.249816] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:14.441 [2024-07-16 00:12:01.251424] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:14.441 [2024-07-16 00:12:01.251474] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:14.441 [2024-07-16 00:12:01.251526] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:14.441 [2024-07-16 00:12:01.251570] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:14.441 [2024-07-16 00:12:01.251595] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:14.441 [2024-07-16 00:12:01.251614] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:14.441 [2024-07-16 00:12:01.251625] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd1aff0 name raid_bdev1, state configuring 00:17:14.441 request: 00:17:14.441 { 00:17:14.441 "name": "raid_bdev1", 00:17:14.441 "raid_level": "raid1", 00:17:14.441 "base_bdevs": [ 00:17:14.441 "malloc1", 00:17:14.441 "malloc2", 00:17:14.441 "malloc3" 00:17:14.441 ], 00:17:14.441 "superblock": false, 00:17:14.441 "method": "bdev_raid_create", 00:17:14.441 "req_id": 1 00:17:14.441 } 00:17:14.441 Got JSON-RPC error response 00:17:14.441 response: 00:17:14.441 { 00:17:14.441 "code": -17, 00:17:14.441 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:14.441 } 00:17:14.441 00:12:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:17:14.441 00:12:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:17:14.441 00:12:01 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@670 -- # [[ -n '' ]] 00:17:14.441 00:12:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:17:14.442 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.442 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:17:14.700 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:17:14.700 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:17:14.700 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:14.958 [2024-07-16 00:12:01.666868] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:14.958 [2024-07-16 00:12:01.666920] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:14.958 [2024-07-16 00:12:01.666951] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb777a0 00:17:14.958 [2024-07-16 00:12:01.666964] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:14.958 [2024-07-16 00:12:01.668920] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:14.958 [2024-07-16 00:12:01.668958] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:14.958 [2024-07-16 00:12:01.669035] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:14.958 [2024-07-16 00:12:01.669064] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:14.958 pt1 00:17:14.958 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring 
raid1 0 3 00:17:14.958 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:14.958 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:14.958 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:14.958 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:14.958 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:14.958 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:14.958 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:14.958 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:14.958 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:14.958 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.958 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:15.217 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:15.217 "name": "raid_bdev1", 00:17:15.217 "uuid": "dc0e6dce-cd97-4679-9f85-8cd9511fb301", 00:17:15.217 "strip_size_kb": 0, 00:17:15.217 "state": "configuring", 00:17:15.217 "raid_level": "raid1", 00:17:15.217 "superblock": true, 00:17:15.217 "num_base_bdevs": 3, 00:17:15.217 "num_base_bdevs_discovered": 1, 00:17:15.217 "num_base_bdevs_operational": 3, 00:17:15.217 "base_bdevs_list": [ 00:17:15.217 { 00:17:15.217 "name": "pt1", 00:17:15.217 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:15.217 "is_configured": true, 00:17:15.217 "data_offset": 2048, 00:17:15.217 
"data_size": 63488 00:17:15.217 }, 00:17:15.217 { 00:17:15.217 "name": null, 00:17:15.217 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:15.217 "is_configured": false, 00:17:15.217 "data_offset": 2048, 00:17:15.217 "data_size": 63488 00:17:15.217 }, 00:17:15.217 { 00:17:15.217 "name": null, 00:17:15.217 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:15.217 "is_configured": false, 00:17:15.217 "data_offset": 2048, 00:17:15.217 "data_size": 63488 00:17:15.217 } 00:17:15.217 ] 00:17:15.217 }' 00:17:15.217 00:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:15.217 00:12:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:15.784 00:12:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:17:15.784 00:12:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:16.043 [2024-07-16 00:12:02.837992] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:16.043 [2024-07-16 00:12:02.838045] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:16.043 [2024-07-16 00:12:02.838064] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb6ea10 00:17:16.043 [2024-07-16 00:12:02.838076] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:16.043 [2024-07-16 00:12:02.838447] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:16.043 [2024-07-16 00:12:02.838467] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:16.043 [2024-07-16 00:12:02.838536] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:16.043 [2024-07-16 00:12:02.838557] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt2 is claimed 00:17:16.043 pt2 00:17:16.043 00:12:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:16.301 [2024-07-16 00:12:03.094678] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:16.301 00:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:16.301 00:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:16.301 00:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:16.301 00:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:16.301 00:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:16.301 00:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:16.301 00:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:16.301 00:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:16.301 00:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:16.301 00:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:16.301 00:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.301 00:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:16.558 00:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:16.558 "name": "raid_bdev1", 00:17:16.558 "uuid": "dc0e6dce-cd97-4679-9f85-8cd9511fb301", 00:17:16.558 "strip_size_kb": 
0, 00:17:16.558 "state": "configuring", 00:17:16.558 "raid_level": "raid1", 00:17:16.558 "superblock": true, 00:17:16.558 "num_base_bdevs": 3, 00:17:16.558 "num_base_bdevs_discovered": 1, 00:17:16.558 "num_base_bdevs_operational": 3, 00:17:16.558 "base_bdevs_list": [ 00:17:16.558 { 00:17:16.558 "name": "pt1", 00:17:16.558 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:16.558 "is_configured": true, 00:17:16.558 "data_offset": 2048, 00:17:16.558 "data_size": 63488 00:17:16.558 }, 00:17:16.558 { 00:17:16.558 "name": null, 00:17:16.558 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:16.558 "is_configured": false, 00:17:16.558 "data_offset": 2048, 00:17:16.558 "data_size": 63488 00:17:16.558 }, 00:17:16.558 { 00:17:16.558 "name": null, 00:17:16.558 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:16.558 "is_configured": false, 00:17:16.558 "data_offset": 2048, 00:17:16.558 "data_size": 63488 00:17:16.558 } 00:17:16.558 ] 00:17:16.558 }' 00:17:16.558 00:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:16.558 00:12:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:17.124 00:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:17.124 00:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:17.124 00:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:17.382 [2024-07-16 00:12:04.113379] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:17.382 [2024-07-16 00:12:04.113432] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:17.382 [2024-07-16 00:12:04.113454] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb77a10 00:17:17.382 
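The trace above re-enters the `bdev_raid.sh@477` loop, recreating one passthru bdev per remaining base malloc bdev (pt2 over malloc2, pt3 over malloc3) so the raid bdev can reassemble from the on-disk superblocks. A minimal sketch of that loop shape, with the RPC invocations echoed rather than sent to a live `rpc.py` socket (the echoed command strings are stand-ins, not live calls):

```shell
#!/usr/bin/env bash
# Sketch of the bdev_raid.sh@477 loop traced above: one passthru bdev per
# base malloc bdev, with the deterministic zero-padded UUIDs seen in the log.
num_base_bdevs=3
for (( i = 1; i < num_base_bdevs; i++ )); do
    n=$((i + 1))
    # UUIDs in the trace are 00000000-0000-0000-0000-<12-digit index>
    uuid=$(printf '00000000-0000-0000-0000-%012d' "$n")
    echo "bdev_passthru_create -b malloc$n -p pt$n -u $uuid"
done
```

Run as-is this prints the two create commands for pt2 and pt3, matching the `bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002` call visible in the xtrace.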
[2024-07-16 00:12:04.113467] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:17.382 [2024-07-16 00:12:04.113834] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:17.382 [2024-07-16 00:12:04.113854] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:17.382 [2024-07-16 00:12:04.113922] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:17.382 [2024-07-16 00:12:04.113951] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:17.382 pt2 00:17:17.382 00:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:17.382 00:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:17.383 00:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:17.641 [2024-07-16 00:12:04.370063] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:17.641 [2024-07-16 00:12:04.370095] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:17.641 [2024-07-16 00:12:04.370111] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb6e6c0 00:17:17.641 [2024-07-16 00:12:04.370123] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:17.641 [2024-07-16 00:12:04.370433] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:17.641 [2024-07-16 00:12:04.370451] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:17.641 [2024-07-16 00:12:04.370506] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:17.641 [2024-07-16 00:12:04.370524] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt3 is claimed 00:17:17.641 [2024-07-16 00:12:04.370636] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd11c00 00:17:17.641 [2024-07-16 00:12:04.370647] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:17.641 [2024-07-16 00:12:04.370817] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb71610 00:17:17.641 [2024-07-16 00:12:04.370964] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd11c00 00:17:17.641 [2024-07-16 00:12:04.370975] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd11c00 00:17:17.641 [2024-07-16 00:12:04.371083] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:17.641 pt3 00:17:17.641 00:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:17.641 00:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:17.641 00:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:17.641 00:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:17.641 00:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:17.641 00:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:17.641 00:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:17.641 00:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:17.641 00:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:17.641 00:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:17.641 00:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:17:17.641 00:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:17.641 00:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.641 00:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:17.898 00:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:17.898 "name": "raid_bdev1", 00:17:17.898 "uuid": "dc0e6dce-cd97-4679-9f85-8cd9511fb301", 00:17:17.898 "strip_size_kb": 0, 00:17:17.898 "state": "online", 00:17:17.898 "raid_level": "raid1", 00:17:17.898 "superblock": true, 00:17:17.898 "num_base_bdevs": 3, 00:17:17.898 "num_base_bdevs_discovered": 3, 00:17:17.898 "num_base_bdevs_operational": 3, 00:17:17.898 "base_bdevs_list": [ 00:17:17.898 { 00:17:17.898 "name": "pt1", 00:17:17.898 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:17.898 "is_configured": true, 00:17:17.898 "data_offset": 2048, 00:17:17.898 "data_size": 63488 00:17:17.898 }, 00:17:17.898 { 00:17:17.898 "name": "pt2", 00:17:17.898 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:17.898 "is_configured": true, 00:17:17.898 "data_offset": 2048, 00:17:17.898 "data_size": 63488 00:17:17.898 }, 00:17:17.898 { 00:17:17.898 "name": "pt3", 00:17:17.898 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:17.898 "is_configured": true, 00:17:17.898 "data_offset": 2048, 00:17:17.898 "data_size": 63488 00:17:17.898 } 00:17:17.898 ] 00:17:17.898 }' 00:17:17.898 00:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:17.898 00:12:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:18.463 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:17:18.463 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 
-- # local raid_bdev_name=raid_bdev1 00:17:18.463 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:18.463 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:18.463 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:18.463 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:18.463 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:18.463 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:18.720 [2024-07-16 00:12:05.525418] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:18.720 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:18.720 "name": "raid_bdev1", 00:17:18.720 "aliases": [ 00:17:18.720 "dc0e6dce-cd97-4679-9f85-8cd9511fb301" 00:17:18.720 ], 00:17:18.720 "product_name": "Raid Volume", 00:17:18.720 "block_size": 512, 00:17:18.720 "num_blocks": 63488, 00:17:18.720 "uuid": "dc0e6dce-cd97-4679-9f85-8cd9511fb301", 00:17:18.720 "assigned_rate_limits": { 00:17:18.720 "rw_ios_per_sec": 0, 00:17:18.720 "rw_mbytes_per_sec": 0, 00:17:18.720 "r_mbytes_per_sec": 0, 00:17:18.720 "w_mbytes_per_sec": 0 00:17:18.720 }, 00:17:18.720 "claimed": false, 00:17:18.720 "zoned": false, 00:17:18.720 "supported_io_types": { 00:17:18.720 "read": true, 00:17:18.720 "write": true, 00:17:18.720 "unmap": false, 00:17:18.720 "flush": false, 00:17:18.720 "reset": true, 00:17:18.720 "nvme_admin": false, 00:17:18.720 "nvme_io": false, 00:17:18.720 "nvme_io_md": false, 00:17:18.720 "write_zeroes": true, 00:17:18.720 "zcopy": false, 00:17:18.720 "get_zone_info": false, 00:17:18.720 "zone_management": false, 00:17:18.720 "zone_append": false, 00:17:18.720 "compare": false, 
00:17:18.720 "compare_and_write": false, 00:17:18.720 "abort": false, 00:17:18.720 "seek_hole": false, 00:17:18.720 "seek_data": false, 00:17:18.720 "copy": false, 00:17:18.720 "nvme_iov_md": false 00:17:18.721 }, 00:17:18.721 "memory_domains": [ 00:17:18.721 { 00:17:18.721 "dma_device_id": "system", 00:17:18.721 "dma_device_type": 1 00:17:18.721 }, 00:17:18.721 { 00:17:18.721 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.721 "dma_device_type": 2 00:17:18.721 }, 00:17:18.721 { 00:17:18.721 "dma_device_id": "system", 00:17:18.721 "dma_device_type": 1 00:17:18.721 }, 00:17:18.721 { 00:17:18.721 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.721 "dma_device_type": 2 00:17:18.721 }, 00:17:18.721 { 00:17:18.721 "dma_device_id": "system", 00:17:18.721 "dma_device_type": 1 00:17:18.721 }, 00:17:18.721 { 00:17:18.721 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.721 "dma_device_type": 2 00:17:18.721 } 00:17:18.721 ], 00:17:18.721 "driver_specific": { 00:17:18.721 "raid": { 00:17:18.721 "uuid": "dc0e6dce-cd97-4679-9f85-8cd9511fb301", 00:17:18.721 "strip_size_kb": 0, 00:17:18.721 "state": "online", 00:17:18.721 "raid_level": "raid1", 00:17:18.721 "superblock": true, 00:17:18.721 "num_base_bdevs": 3, 00:17:18.721 "num_base_bdevs_discovered": 3, 00:17:18.721 "num_base_bdevs_operational": 3, 00:17:18.721 "base_bdevs_list": [ 00:17:18.721 { 00:17:18.721 "name": "pt1", 00:17:18.721 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:18.721 "is_configured": true, 00:17:18.721 "data_offset": 2048, 00:17:18.721 "data_size": 63488 00:17:18.721 }, 00:17:18.721 { 00:17:18.721 "name": "pt2", 00:17:18.721 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:18.721 "is_configured": true, 00:17:18.721 "data_offset": 2048, 00:17:18.721 "data_size": 63488 00:17:18.721 }, 00:17:18.721 { 00:17:18.721 "name": "pt3", 00:17:18.721 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:18.721 "is_configured": true, 00:17:18.721 "data_offset": 2048, 00:17:18.721 "data_size": 63488 
00:17:18.721 } 00:17:18.721 ] 00:17:18.721 } 00:17:18.721 } 00:17:18.721 }' 00:17:18.721 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:18.721 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:18.721 pt2 00:17:18.721 pt3' 00:17:18.721 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:18.721 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:18.721 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:18.978 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:18.978 "name": "pt1", 00:17:18.978 "aliases": [ 00:17:18.978 "00000000-0000-0000-0000-000000000001" 00:17:18.978 ], 00:17:18.978 "product_name": "passthru", 00:17:18.978 "block_size": 512, 00:17:18.978 "num_blocks": 65536, 00:17:18.978 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:18.978 "assigned_rate_limits": { 00:17:18.978 "rw_ios_per_sec": 0, 00:17:18.978 "rw_mbytes_per_sec": 0, 00:17:18.978 "r_mbytes_per_sec": 0, 00:17:18.978 "w_mbytes_per_sec": 0 00:17:18.978 }, 00:17:18.978 "claimed": true, 00:17:18.978 "claim_type": "exclusive_write", 00:17:18.978 "zoned": false, 00:17:18.978 "supported_io_types": { 00:17:18.978 "read": true, 00:17:18.978 "write": true, 00:17:18.978 "unmap": true, 00:17:18.978 "flush": true, 00:17:18.978 "reset": true, 00:17:18.978 "nvme_admin": false, 00:17:18.978 "nvme_io": false, 00:17:18.978 "nvme_io_md": false, 00:17:18.978 "write_zeroes": true, 00:17:18.978 "zcopy": true, 00:17:18.978 "get_zone_info": false, 00:17:18.978 "zone_management": false, 00:17:18.978 "zone_append": false, 00:17:18.978 "compare": false, 00:17:18.978 "compare_and_write": false, 
00:17:18.978 "abort": true, 00:17:18.978 "seek_hole": false, 00:17:18.978 "seek_data": false, 00:17:18.978 "copy": true, 00:17:18.978 "nvme_iov_md": false 00:17:18.978 }, 00:17:18.978 "memory_domains": [ 00:17:18.978 { 00:17:18.978 "dma_device_id": "system", 00:17:18.978 "dma_device_type": 1 00:17:18.978 }, 00:17:18.978 { 00:17:18.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.978 "dma_device_type": 2 00:17:18.978 } 00:17:18.978 ], 00:17:18.978 "driver_specific": { 00:17:18.978 "passthru": { 00:17:18.978 "name": "pt1", 00:17:18.978 "base_bdev_name": "malloc1" 00:17:18.978 } 00:17:18.978 } 00:17:18.978 }' 00:17:18.978 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:18.978 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:19.235 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:19.235 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:19.235 00:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:19.235 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:19.235 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:19.235 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:19.235 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:19.235 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:19.494 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:19.494 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:19.494 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:19.494 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:19.494 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:19.752 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:19.752 "name": "pt2", 00:17:19.752 "aliases": [ 00:17:19.752 "00000000-0000-0000-0000-000000000002" 00:17:19.752 ], 00:17:19.752 "product_name": "passthru", 00:17:19.752 "block_size": 512, 00:17:19.752 "num_blocks": 65536, 00:17:19.752 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:19.752 "assigned_rate_limits": { 00:17:19.752 "rw_ios_per_sec": 0, 00:17:19.752 "rw_mbytes_per_sec": 0, 00:17:19.752 "r_mbytes_per_sec": 0, 00:17:19.752 "w_mbytes_per_sec": 0 00:17:19.752 }, 00:17:19.752 "claimed": true, 00:17:19.752 "claim_type": "exclusive_write", 00:17:19.752 "zoned": false, 00:17:19.752 "supported_io_types": { 00:17:19.752 "read": true, 00:17:19.752 "write": true, 00:17:19.752 "unmap": true, 00:17:19.752 "flush": true, 00:17:19.752 "reset": true, 00:17:19.752 "nvme_admin": false, 00:17:19.752 "nvme_io": false, 00:17:19.752 "nvme_io_md": false, 00:17:19.752 "write_zeroes": true, 00:17:19.752 "zcopy": true, 00:17:19.752 "get_zone_info": false, 00:17:19.752 "zone_management": false, 00:17:19.752 "zone_append": false, 00:17:19.752 "compare": false, 00:17:19.752 "compare_and_write": false, 00:17:19.752 "abort": true, 00:17:19.752 "seek_hole": false, 00:17:19.752 "seek_data": false, 00:17:19.752 "copy": true, 00:17:19.752 "nvme_iov_md": false 00:17:19.752 }, 00:17:19.752 "memory_domains": [ 00:17:19.752 { 00:17:19.752 "dma_device_id": "system", 00:17:19.752 "dma_device_type": 1 00:17:19.752 }, 00:17:19.752 { 00:17:19.752 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:19.752 "dma_device_type": 2 00:17:19.752 } 00:17:19.752 ], 00:17:19.752 "driver_specific": { 00:17:19.752 "passthru": { 00:17:19.752 "name": "pt2", 00:17:19.752 "base_bdev_name": "malloc2" 
00:17:19.752 } 00:17:19.752 } 00:17:19.752 }' 00:17:19.752 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:19.752 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:19.752 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:19.752 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:19.752 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:19.753 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:19.753 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:19.753 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:20.010 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:20.010 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:20.010 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:20.011 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:20.011 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:20.011 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:20.011 00:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:20.269 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:20.269 "name": "pt3", 00:17:20.269 "aliases": [ 00:17:20.269 "00000000-0000-0000-0000-000000000003" 00:17:20.269 ], 00:17:20.269 "product_name": "passthru", 00:17:20.269 "block_size": 512, 00:17:20.269 "num_blocks": 65536, 00:17:20.269 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:17:20.269 "assigned_rate_limits": { 00:17:20.269 "rw_ios_per_sec": 0, 00:17:20.269 "rw_mbytes_per_sec": 0, 00:17:20.269 "r_mbytes_per_sec": 0, 00:17:20.269 "w_mbytes_per_sec": 0 00:17:20.269 }, 00:17:20.269 "claimed": true, 00:17:20.269 "claim_type": "exclusive_write", 00:17:20.269 "zoned": false, 00:17:20.269 "supported_io_types": { 00:17:20.269 "read": true, 00:17:20.269 "write": true, 00:17:20.269 "unmap": true, 00:17:20.269 "flush": true, 00:17:20.269 "reset": true, 00:17:20.269 "nvme_admin": false, 00:17:20.269 "nvme_io": false, 00:17:20.269 "nvme_io_md": false, 00:17:20.269 "write_zeroes": true, 00:17:20.269 "zcopy": true, 00:17:20.269 "get_zone_info": false, 00:17:20.269 "zone_management": false, 00:17:20.269 "zone_append": false, 00:17:20.269 "compare": false, 00:17:20.269 "compare_and_write": false, 00:17:20.269 "abort": true, 00:17:20.269 "seek_hole": false, 00:17:20.269 "seek_data": false, 00:17:20.269 "copy": true, 00:17:20.269 "nvme_iov_md": false 00:17:20.269 }, 00:17:20.269 "memory_domains": [ 00:17:20.269 { 00:17:20.269 "dma_device_id": "system", 00:17:20.269 "dma_device_type": 1 00:17:20.269 }, 00:17:20.269 { 00:17:20.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.269 "dma_device_type": 2 00:17:20.269 } 00:17:20.269 ], 00:17:20.269 "driver_specific": { 00:17:20.269 "passthru": { 00:17:20.269 "name": "pt3", 00:17:20.269 "base_bdev_name": "malloc3" 00:17:20.269 } 00:17:20.269 } 00:17:20.269 }' 00:17:20.269 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:20.269 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:20.269 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:20.269 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:20.269 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:20.528 00:12:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:20.528 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:20.528 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:20.528 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:20.528 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:20.528 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:20.528 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:20.528 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:20.528 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:20.786 [2024-07-16 00:12:07.667104] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:20.786 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' dc0e6dce-cd97-4679-9f85-8cd9511fb301 '!=' dc0e6dce-cd97-4679-9f85-8cd9511fb301 ']' 00:17:20.786 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:17:20.786 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:20.786 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:20.786 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:21.047 [2024-07-16 00:12:07.899460] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:17:21.047 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 
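After `bdev_passthru_delete pt1`, `verify_raid_bdev_state raid_bdev1 online raid1 0 2` checks that the array stays online in degraded mode with two discovered base bdevs. A sketch of that style of check, using the same `jq` select filter as the helper but against a hypothetical inline payload instead of the live `/var/tmp/spdk-raid.sock` socket:

```shell
#!/usr/bin/env bash
# Sketch of a verify_raid_bdev_state-style check. The payload is a hypothetical
# stand-in for `rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all`.
payload='[{"name":"raid_bdev1","state":"online","raid_level":"raid1",
           "num_base_bdevs_discovered":2,"num_base_bdevs_operational":2}]'

# Same filter shape as bdev_raid.sh@126 in the trace.
info=$(jq -r '.[] | select(.name == "raid_bdev1")' <<<"$payload")
state=$(jq -r .state <<<"$info")
discovered=$(jq -r .num_base_bdevs_discovered <<<"$info")

[ "$state" = online ] || { echo "unexpected state: $state"; exit 1; }
[ "$discovered" -eq 2 ] || { echo "unexpected discovered count"; exit 1; }
echo "raid_bdev1 verified"
```

The real helper additionally compares `raid_level`, `strip_size_kb`, and `num_base_bdevs_operational`; this sketch only shows the fetch-filter-compare pattern.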
00:17:21.047 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:21.047 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:21.047 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:21.047 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:21.047 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:21.047 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:21.047 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:21.047 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:21.047 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:21.047 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:21.047 00:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:21.358 00:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:21.358 "name": "raid_bdev1", 00:17:21.358 "uuid": "dc0e6dce-cd97-4679-9f85-8cd9511fb301", 00:17:21.358 "strip_size_kb": 0, 00:17:21.358 "state": "online", 00:17:21.358 "raid_level": "raid1", 00:17:21.358 "superblock": true, 00:17:21.358 "num_base_bdevs": 3, 00:17:21.358 "num_base_bdevs_discovered": 2, 00:17:21.358 "num_base_bdevs_operational": 2, 00:17:21.358 "base_bdevs_list": [ 00:17:21.358 { 00:17:21.358 "name": null, 00:17:21.358 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:21.358 "is_configured": false, 00:17:21.358 "data_offset": 2048, 00:17:21.358 "data_size": 63488 
00:17:21.358 }, 00:17:21.358 { 00:17:21.358 "name": "pt2", 00:17:21.358 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:21.358 "is_configured": true, 00:17:21.358 "data_offset": 2048, 00:17:21.358 "data_size": 63488 00:17:21.358 }, 00:17:21.358 { 00:17:21.358 "name": "pt3", 00:17:21.358 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:21.358 "is_configured": true, 00:17:21.358 "data_offset": 2048, 00:17:21.358 "data_size": 63488 00:17:21.358 } 00:17:21.358 ] 00:17:21.358 }' 00:17:21.358 00:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:21.358 00:12:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:21.926 00:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:22.184 [2024-07-16 00:12:08.974276] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:22.184 [2024-07-16 00:12:08.974306] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:22.184 [2024-07-16 00:12:08.974374] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:22.184 [2024-07-16 00:12:08.974437] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:22.184 [2024-07-16 00:12:08.974449] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd11c00 name raid_bdev1, state offline 00:17:22.184 00:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:22.184 00:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:17:22.443 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:17:22.443 00:12:09 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:17:22.443 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:17:22.443 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:22.443 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:22.701 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:22.701 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:22.701 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:22.959 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:22.959 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:22.959 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:17:22.959 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:22.959 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:23.218 [2024-07-16 00:12:09.952821] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:23.218 [2024-07-16 00:12:09.952866] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:23.218 [2024-07-16 00:12:09.952884] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb6f310 00:17:23.218 [2024-07-16 00:12:09.952897] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:23.218 
[2024-07-16 00:12:09.954808] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:23.218 [2024-07-16 00:12:09.954841] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:23.218 [2024-07-16 00:12:09.954914] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:23.218 [2024-07-16 00:12:09.954954] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:23.218 pt2 00:17:23.218 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:17:23.218 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:23.218 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:23.218 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:23.218 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:23.218 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:23.218 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:23.218 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:23.218 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:23.218 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:23.218 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.218 00:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:23.514 00:12:10 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:23.514 "name": "raid_bdev1", 00:17:23.514 "uuid": "dc0e6dce-cd97-4679-9f85-8cd9511fb301", 00:17:23.514 "strip_size_kb": 0, 00:17:23.514 "state": "configuring", 00:17:23.514 "raid_level": "raid1", 00:17:23.514 "superblock": true, 00:17:23.514 "num_base_bdevs": 3, 00:17:23.514 "num_base_bdevs_discovered": 1, 00:17:23.514 "num_base_bdevs_operational": 2, 00:17:23.514 "base_bdevs_list": [ 00:17:23.514 { 00:17:23.514 "name": null, 00:17:23.514 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.514 "is_configured": false, 00:17:23.514 "data_offset": 2048, 00:17:23.514 "data_size": 63488 00:17:23.514 }, 00:17:23.514 { 00:17:23.514 "name": "pt2", 00:17:23.514 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:23.514 "is_configured": true, 00:17:23.514 "data_offset": 2048, 00:17:23.514 "data_size": 63488 00:17:23.514 }, 00:17:23.514 { 00:17:23.514 "name": null, 00:17:23.514 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:23.514 "is_configured": false, 00:17:23.514 "data_offset": 2048, 00:17:23.514 "data_size": 63488 00:17:23.514 } 00:17:23.514 ] 00:17:23.514 }' 00:17:23.514 00:12:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:23.514 00:12:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:24.082 00:12:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:17:24.082 00:12:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:24.082 00:12:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:17:24.082 00:12:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:24.649 [2024-07-16 00:12:11.368596] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 
00:17:24.649 [2024-07-16 00:12:11.368646] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:24.649 [2024-07-16 00:12:11.368667] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb6dec0 00:17:24.649 [2024-07-16 00:12:11.368680] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:24.649 [2024-07-16 00:12:11.369078] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:24.649 [2024-07-16 00:12:11.369099] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:24.649 [2024-07-16 00:12:11.369171] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:24.649 [2024-07-16 00:12:11.369193] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:24.649 [2024-07-16 00:12:11.369309] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd0fcc0 00:17:24.649 [2024-07-16 00:12:11.369320] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:24.649 [2024-07-16 00:12:11.369486] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd106d0 00:17:24.649 [2024-07-16 00:12:11.369626] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd0fcc0 00:17:24.649 [2024-07-16 00:12:11.369636] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd0fcc0 00:17:24.649 [2024-07-16 00:12:11.369744] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:24.649 pt3 00:17:24.649 00:12:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:24.649 00:12:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:24.649 00:12:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:24.649 00:12:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:24.649 00:12:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:24.649 00:12:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:24.649 00:12:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:24.649 00:12:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:24.650 00:12:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:24.650 00:12:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:24.650 00:12:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.650 00:12:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:25.217 00:12:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:25.217 "name": "raid_bdev1", 00:17:25.217 "uuid": "dc0e6dce-cd97-4679-9f85-8cd9511fb301", 00:17:25.217 "strip_size_kb": 0, 00:17:25.217 "state": "online", 00:17:25.217 "raid_level": "raid1", 00:17:25.217 "superblock": true, 00:17:25.217 "num_base_bdevs": 3, 00:17:25.217 "num_base_bdevs_discovered": 2, 00:17:25.217 "num_base_bdevs_operational": 2, 00:17:25.217 "base_bdevs_list": [ 00:17:25.217 { 00:17:25.217 "name": null, 00:17:25.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:25.217 "is_configured": false, 00:17:25.217 "data_offset": 2048, 00:17:25.217 "data_size": 63488 00:17:25.217 }, 00:17:25.217 { 00:17:25.217 "name": "pt2", 00:17:25.217 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:25.217 "is_configured": true, 00:17:25.217 "data_offset": 2048, 00:17:25.217 "data_size": 63488 00:17:25.217 }, 00:17:25.217 { 
00:17:25.217 "name": "pt3", 00:17:25.217 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:25.217 "is_configured": true, 00:17:25.217 "data_offset": 2048, 00:17:25.217 "data_size": 63488 00:17:25.217 } 00:17:25.217 ] 00:17:25.217 }' 00:17:25.217 00:12:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:25.217 00:12:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:26.153 00:12:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:26.411 [2024-07-16 00:12:13.269641] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:26.411 [2024-07-16 00:12:13.269665] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:26.411 [2024-07-16 00:12:13.269726] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:26.411 [2024-07-16 00:12:13.269785] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:26.411 [2024-07-16 00:12:13.269797] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd0fcc0 name raid_bdev1, state offline 00:17:26.411 00:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.411 00:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:17:26.978 00:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:17:26.978 00:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:17:26.978 00:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 00:17:26.978 00:12:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:17:26.978 00:12:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:27.547 00:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:28.115 [2024-07-16 00:12:14.817643] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:28.115 [2024-07-16 00:12:14.817690] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:28.115 [2024-07-16 00:12:14.817708] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb6dec0 00:17:28.115 [2024-07-16 00:12:14.817727] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:28.115 [2024-07-16 00:12:14.819813] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:28.115 [2024-07-16 00:12:14.819843] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:28.115 [2024-07-16 00:12:14.819917] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:28.115 [2024-07-16 00:12:14.819953] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:28.115 [2024-07-16 00:12:14.820064] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:17:28.115 [2024-07-16 00:12:14.820078] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:28.115 [2024-07-16 00:12:14.820092] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd0ff40 name raid_bdev1, state configuring 00:17:28.115 [2024-07-16 00:12:14.820119] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:28.115 pt1 00:17:28.115 00:12:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:17:28.115 00:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:17:28.115 00:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:28.115 00:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:28.115 00:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:28.115 00:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:28.115 00:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:28.115 00:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:28.115 00:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:28.115 00:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:28.115 00:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:28.115 00:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.115 00:12:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:28.682 00:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:28.682 "name": "raid_bdev1", 00:17:28.682 "uuid": "dc0e6dce-cd97-4679-9f85-8cd9511fb301", 00:17:28.682 "strip_size_kb": 0, 00:17:28.682 "state": "configuring", 00:17:28.682 "raid_level": "raid1", 00:17:28.682 "superblock": true, 00:17:28.682 "num_base_bdevs": 3, 00:17:28.682 "num_base_bdevs_discovered": 1, 00:17:28.682 "num_base_bdevs_operational": 2, 00:17:28.682 
"base_bdevs_list": [ 00:17:28.682 { 00:17:28.682 "name": null, 00:17:28.682 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:28.682 "is_configured": false, 00:17:28.682 "data_offset": 2048, 00:17:28.682 "data_size": 63488 00:17:28.682 }, 00:17:28.682 { 00:17:28.682 "name": "pt2", 00:17:28.682 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:28.682 "is_configured": true, 00:17:28.682 "data_offset": 2048, 00:17:28.682 "data_size": 63488 00:17:28.682 }, 00:17:28.682 { 00:17:28.682 "name": null, 00:17:28.682 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:28.682 "is_configured": false, 00:17:28.682 "data_offset": 2048, 00:17:28.682 "data_size": 63488 00:17:28.682 } 00:17:28.682 ] 00:17:28.682 }' 00:17:28.682 00:12:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:28.682 00:12:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:29.619 00:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:17:29.619 00:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:29.877 00:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:17:29.878 00:12:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:30.136 [2024-07-16 00:12:17.015483] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:30.136 [2024-07-16 00:12:17.015539] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:30.136 [2024-07-16 00:12:17.015558] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb710c0 00:17:30.136 [2024-07-16 
00:12:17.015572] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:30.136 [2024-07-16 00:12:17.015965] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:30.136 [2024-07-16 00:12:17.015986] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:30.136 [2024-07-16 00:12:17.016057] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:30.136 [2024-07-16 00:12:17.016078] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:30.136 [2024-07-16 00:12:17.016195] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb71a40 00:17:30.136 [2024-07-16 00:12:17.016206] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:30.136 [2024-07-16 00:12:17.016381] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd106c0 00:17:30.136 [2024-07-16 00:12:17.016521] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb71a40 00:17:30.136 [2024-07-16 00:12:17.016532] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb71a40 00:17:30.136 [2024-07-16 00:12:17.016640] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:30.136 pt3 00:17:30.136 00:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:30.136 00:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:30.136 00:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:30.136 00:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:30.136 00:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:30.136 00:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:17:30.136 00:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:30.136 00:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:30.136 00:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:30.136 00:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:30.136 00:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.136 00:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:30.395 00:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:30.396 "name": "raid_bdev1", 00:17:30.396 "uuid": "dc0e6dce-cd97-4679-9f85-8cd9511fb301", 00:17:30.396 "strip_size_kb": 0, 00:17:30.396 "state": "online", 00:17:30.396 "raid_level": "raid1", 00:17:30.396 "superblock": true, 00:17:30.396 "num_base_bdevs": 3, 00:17:30.396 "num_base_bdevs_discovered": 2, 00:17:30.396 "num_base_bdevs_operational": 2, 00:17:30.396 "base_bdevs_list": [ 00:17:30.396 { 00:17:30.396 "name": null, 00:17:30.396 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:30.396 "is_configured": false, 00:17:30.396 "data_offset": 2048, 00:17:30.396 "data_size": 63488 00:17:30.396 }, 00:17:30.396 { 00:17:30.396 "name": "pt2", 00:17:30.396 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:30.396 "is_configured": true, 00:17:30.396 "data_offset": 2048, 00:17:30.396 "data_size": 63488 00:17:30.396 }, 00:17:30.396 { 00:17:30.396 "name": "pt3", 00:17:30.396 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:30.396 "is_configured": true, 00:17:30.396 "data_offset": 2048, 00:17:30.396 "data_size": 63488 00:17:30.396 } 00:17:30.396 ] 00:17:30.396 }' 00:17:30.396 00:12:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:30.396 00:12:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:30.963 00:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:17:30.963 00:12:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:31.221 00:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:17:31.222 00:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:31.222 00:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:17:31.789 [2024-07-16 00:12:18.611987] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:31.789 00:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' dc0e6dce-cd97-4679-9f85-8cd9511fb301 '!=' dc0e6dce-cd97-4679-9f85-8cd9511fb301 ']' 00:17:31.789 00:12:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 3544479 00:17:31.789 00:12:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 3544479 ']' 00:17:31.789 00:12:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 3544479 00:17:31.789 00:12:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:17:31.789 00:12:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:31.789 00:12:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3544479 00:17:31.789 00:12:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:31.789 00:12:18 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:31.789 00:12:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3544479' 00:17:31.789 killing process with pid 3544479 00:17:31.790 00:12:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 3544479 00:17:31.790 [2024-07-16 00:12:18.696975] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:31.790 [2024-07-16 00:12:18.697032] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:31.790 [2024-07-16 00:12:18.697092] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:31.790 [2024-07-16 00:12:18.697104] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb71a40 name raid_bdev1, state offline 00:17:31.790 00:12:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 3544479 00:17:32.048 [2024-07-16 00:12:18.746760] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:32.307 00:12:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:17:32.307 00:17:32.307 real 0m26.713s 00:17:32.307 user 0m48.965s 00:17:32.307 sys 0m4.574s 00:17:32.307 00:12:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:32.307 00:12:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:32.307 ************************************ 00:17:32.307 END TEST raid_superblock_test 00:17:32.307 ************************************ 00:17:32.307 00:12:19 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:32.307 00:12:19 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:17:32.307 00:12:19 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:32.307 00:12:19 bdev_raid -- common/autotest_common.sh@1105 -- # 
xtrace_disable 00:17:32.307 00:12:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:32.307 ************************************ 00:17:32.307 START TEST raid_read_error_test 00:17:32.307 ************************************ 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # 
local base_bdevs 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.7oBZEgReRV 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3548939 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3548939 /var/tmp/spdk-raid.sock 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 3548939 ']' 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:17:32.307 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:32.307 00:12:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:32.307 [2024-07-16 00:12:19.241491] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:17:32.307 [2024-07-16 00:12:19.241562] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3548939 ] 00:17:32.567 [2024-07-16 00:12:19.375537] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:32.567 [2024-07-16 00:12:19.474292] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:32.826 [2024-07-16 00:12:19.542965] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:32.826 [2024-07-16 00:12:19.543004] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:33.393 00:12:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:33.393 00:12:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:33.393 00:12:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:33.393 00:12:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:33.652 BaseBdev1_malloc 00:17:33.652 00:12:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:33.910 true 00:17:33.910 00:12:20 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:34.168 [2024-07-16 00:12:20.897766] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:34.168 [2024-07-16 00:12:20.897808] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:34.168 [2024-07-16 00:12:20.897828] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x165c0d0 00:17:34.168 [2024-07-16 00:12:20.897840] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:34.168 [2024-07-16 00:12:20.899533] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:34.168 [2024-07-16 00:12:20.899561] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:34.168 BaseBdev1 00:17:34.168 00:12:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:34.168 00:12:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:34.427 BaseBdev2_malloc 00:17:34.427 00:12:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:34.685 true 00:17:34.685 00:12:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:34.685 [2024-07-16 00:12:21.632550] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:34.685 [2024-07-16 00:12:21.632595] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:34.685 [2024-07-16 00:12:21.632614] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1660910 00:17:34.685 [2024-07-16 00:12:21.632626] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:34.685 [2024-07-16 00:12:21.633999] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:34.685 [2024-07-16 00:12:21.634026] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:34.944 BaseBdev2 00:17:34.944 00:12:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:34.944 00:12:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:34.944 BaseBdev3_malloc 00:17:35.239 00:12:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:35.239 true 00:17:35.239 00:12:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:35.518 [2024-07-16 00:12:22.375072] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:35.518 [2024-07-16 00:12:22.375122] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:35.518 [2024-07-16 00:12:22.375143] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1662bd0 00:17:35.518 [2024-07-16 00:12:22.375156] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:35.518 [2024-07-16 00:12:22.376707] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:17:35.518 [2024-07-16 00:12:22.376736] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:35.518 BaseBdev3 00:17:35.518 00:12:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:35.776 [2024-07-16 00:12:22.615726] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:35.776 [2024-07-16 00:12:22.616996] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:35.776 [2024-07-16 00:12:22.617066] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:35.776 [2024-07-16 00:12:22.617283] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1664280 00:17:35.776 [2024-07-16 00:12:22.617295] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:35.776 [2024-07-16 00:12:22.617491] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1663e20 00:17:35.776 [2024-07-16 00:12:22.617641] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1664280 00:17:35.776 [2024-07-16 00:12:22.617652] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1664280 00:17:35.776 [2024-07-16 00:12:22.617754] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:35.776 00:12:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:35.776 00:12:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:35.776 00:12:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:35.776 00:12:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:17:35.776 00:12:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:35.776 00:12:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:35.776 00:12:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:35.776 00:12:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:35.776 00:12:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:35.776 00:12:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:35.776 00:12:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:35.776 00:12:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:36.035 00:12:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:36.035 "name": "raid_bdev1", 00:17:36.035 "uuid": "f7b4d2d4-3716-46bd-a8bc-cc7cb13c1dce", 00:17:36.035 "strip_size_kb": 0, 00:17:36.035 "state": "online", 00:17:36.035 "raid_level": "raid1", 00:17:36.035 "superblock": true, 00:17:36.035 "num_base_bdevs": 3, 00:17:36.035 "num_base_bdevs_discovered": 3, 00:17:36.035 "num_base_bdevs_operational": 3, 00:17:36.035 "base_bdevs_list": [ 00:17:36.035 { 00:17:36.035 "name": "BaseBdev1", 00:17:36.035 "uuid": "fee1a67f-56c5-5a3a-befd-16b5697bf95a", 00:17:36.035 "is_configured": true, 00:17:36.035 "data_offset": 2048, 00:17:36.035 "data_size": 63488 00:17:36.035 }, 00:17:36.035 { 00:17:36.035 "name": "BaseBdev2", 00:17:36.035 "uuid": "80abedd3-eab7-57cf-bac2-3bc432636759", 00:17:36.035 "is_configured": true, 00:17:36.035 "data_offset": 2048, 00:17:36.035 "data_size": 63488 00:17:36.035 }, 00:17:36.035 { 00:17:36.035 "name": "BaseBdev3", 00:17:36.035 "uuid": 
"c6fd0b8f-c263-5706-94b4-b8fd29456310", 00:17:36.035 "is_configured": true, 00:17:36.035 "data_offset": 2048, 00:17:36.035 "data_size": 63488 00:17:36.035 } 00:17:36.035 ] 00:17:36.035 }' 00:17:36.035 00:12:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:36.035 00:12:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:36.968 00:12:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:36.968 00:12:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:36.968 [2024-07-16 00:12:23.859310] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14b1e00 00:17:37.926 00:12:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:17:38.183 00:12:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:38.183 00:12:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:38.183 00:12:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:17:38.183 00:12:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:17:38.183 00:12:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:38.183 00:12:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:38.183 00:12:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:38.183 00:12:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:38.183 00:12:25 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:38.183 00:12:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:38.183 00:12:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:38.183 00:12:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:38.183 00:12:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:38.183 00:12:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:38.183 00:12:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.183 00:12:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:38.441 00:12:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:38.441 "name": "raid_bdev1", 00:17:38.441 "uuid": "f7b4d2d4-3716-46bd-a8bc-cc7cb13c1dce", 00:17:38.441 "strip_size_kb": 0, 00:17:38.441 "state": "online", 00:17:38.441 "raid_level": "raid1", 00:17:38.441 "superblock": true, 00:17:38.441 "num_base_bdevs": 3, 00:17:38.441 "num_base_bdevs_discovered": 3, 00:17:38.441 "num_base_bdevs_operational": 3, 00:17:38.441 "base_bdevs_list": [ 00:17:38.441 { 00:17:38.441 "name": "BaseBdev1", 00:17:38.441 "uuid": "fee1a67f-56c5-5a3a-befd-16b5697bf95a", 00:17:38.441 "is_configured": true, 00:17:38.441 "data_offset": 2048, 00:17:38.441 "data_size": 63488 00:17:38.441 }, 00:17:38.441 { 00:17:38.441 "name": "BaseBdev2", 00:17:38.441 "uuid": "80abedd3-eab7-57cf-bac2-3bc432636759", 00:17:38.441 "is_configured": true, 00:17:38.441 "data_offset": 2048, 00:17:38.441 "data_size": 63488 00:17:38.441 }, 00:17:38.441 { 00:17:38.441 "name": "BaseBdev3", 00:17:38.441 "uuid": "c6fd0b8f-c263-5706-94b4-b8fd29456310", 00:17:38.441 "is_configured": true, 
00:17:38.441 "data_offset": 2048, 00:17:38.441 "data_size": 63488 00:17:38.441 } 00:17:38.441 ] 00:17:38.441 }' 00:17:38.441 00:12:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:38.441 00:12:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:39.007 00:12:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:39.264 [2024-07-16 00:12:26.119761] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:39.264 [2024-07-16 00:12:26.119801] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:39.264 [2024-07-16 00:12:26.122949] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:39.265 [2024-07-16 00:12:26.122985] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:39.265 [2024-07-16 00:12:26.123081] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:39.265 [2024-07-16 00:12:26.123093] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1664280 name raid_bdev1, state offline 00:17:39.265 0 00:17:39.265 00:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3548939 00:17:39.265 00:12:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 3548939 ']' 00:17:39.265 00:12:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 3548939 00:17:39.265 00:12:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:17:39.265 00:12:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:39.265 00:12:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3548939 00:17:39.265 00:12:26 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:39.265 00:12:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:39.265 00:12:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3548939' 00:17:39.265 killing process with pid 3548939 00:17:39.265 00:12:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 3548939 00:17:39.265 [2024-07-16 00:12:26.191469] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:39.265 00:12:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 3548939 00:17:39.265 [2024-07-16 00:12:26.212168] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:39.522 00:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.7oBZEgReRV 00:17:39.522 00:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:39.522 00:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:39.522 00:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:39.522 00:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:39.522 00:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:39.522 00:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:39.522 00:12:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:39.522 00:17:39.522 real 0m7.275s 00:17:39.522 user 0m11.546s 00:17:39.522 sys 0m1.337s 00:17:39.522 00:12:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:39.522 00:12:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:39.522 ************************************ 00:17:39.522 END TEST 
raid_read_error_test 00:17:39.522 ************************************ 00:17:39.781 00:12:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:39.781 00:12:26 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:17:39.781 00:12:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:39.781 00:12:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:39.781 00:12:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:39.781 ************************************ 00:17:39.781 START TEST raid_write_error_test 00:17:39.781 ************************************ 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.IgNfkqJ24D 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3549993 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3549993 /var/tmp/spdk-raid.sock 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@829 -- # '[' -z 3549993 ']' 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:39.781 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:39.781 00:12:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:39.781 [2024-07-16 00:12:26.598406] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:17:39.781 [2024-07-16 00:12:26.598461] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3549993 ] 00:17:39.781 [2024-07-16 00:12:26.711952] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:40.039 [2024-07-16 00:12:26.813648] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:40.039 [2024-07-16 00:12:26.876922] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:40.039 [2024-07-16 00:12:26.876965] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:40.605 00:12:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:40.605 00:12:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:40.605 00:12:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:40.605 
00:12:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:40.863 BaseBdev1_malloc 00:17:40.863 00:12:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:41.121 true 00:17:41.121 00:12:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:41.380 [2024-07-16 00:12:28.214584] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:41.380 [2024-07-16 00:12:28.214634] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:41.380 [2024-07-16 00:12:28.214655] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe340d0 00:17:41.380 [2024-07-16 00:12:28.214673] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:41.380 [2024-07-16 00:12:28.216477] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:41.380 [2024-07-16 00:12:28.216523] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:41.380 BaseBdev1 00:17:41.380 00:12:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:41.380 00:12:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:41.638 BaseBdev2_malloc 00:17:41.638 00:12:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_error_create BaseBdev2_malloc 00:17:41.638 true 00:17:41.896 00:12:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:41.896 [2024-07-16 00:12:28.816814] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:41.896 [2024-07-16 00:12:28.816860] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:41.896 [2024-07-16 00:12:28.816878] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe38910 00:17:41.896 [2024-07-16 00:12:28.816890] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:41.896 [2024-07-16 00:12:28.818253] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:41.897 [2024-07-16 00:12:28.818279] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:41.897 BaseBdev2 00:17:41.897 00:12:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:41.897 00:12:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:42.154 BaseBdev3_malloc 00:17:42.154 00:12:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:42.413 true 00:17:42.413 00:12:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:42.671 [2024-07-16 00:12:29.567319] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 
00:17:42.671 [2024-07-16 00:12:29.567360] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:42.671 [2024-07-16 00:12:29.567379] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe3abd0 00:17:42.671 [2024-07-16 00:12:29.567392] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:42.671 [2024-07-16 00:12:29.568815] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:42.671 [2024-07-16 00:12:29.568845] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:42.671 BaseBdev3 00:17:42.671 00:12:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:42.930 [2024-07-16 00:12:29.868149] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:42.930 [2024-07-16 00:12:29.869496] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:42.930 [2024-07-16 00:12:29.869565] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:42.930 [2024-07-16 00:12:29.869780] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe3c280 00:17:42.930 [2024-07-16 00:12:29.869792] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:42.930 [2024-07-16 00:12:29.870003] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe3be20 00:17:42.930 [2024-07-16 00:12:29.870163] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe3c280 00:17:42.930 [2024-07-16 00:12:29.870178] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe3c280 00:17:42.930 [2024-07-16 00:12:29.870287] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:17:43.188 00:12:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:43.188 00:12:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:43.188 00:12:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:43.188 00:12:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:43.188 00:12:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:43.188 00:12:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:43.188 00:12:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:43.188 00:12:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:43.188 00:12:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:43.188 00:12:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:43.188 00:12:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:43.188 00:12:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.188 00:12:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:43.188 "name": "raid_bdev1", 00:17:43.188 "uuid": "d32a536c-cc68-4b70-affe-79d4fa758adf", 00:17:43.188 "strip_size_kb": 0, 00:17:43.188 "state": "online", 00:17:43.188 "raid_level": "raid1", 00:17:43.188 "superblock": true, 00:17:43.188 "num_base_bdevs": 3, 00:17:43.188 "num_base_bdevs_discovered": 3, 00:17:43.188 "num_base_bdevs_operational": 3, 00:17:43.188 "base_bdevs_list": [ 00:17:43.188 { 00:17:43.188 "name": "BaseBdev1", 
00:17:43.188 "uuid": "18775cda-a403-5148-9233-d53daa486cd0", 00:17:43.188 "is_configured": true, 00:17:43.188 "data_offset": 2048, 00:17:43.188 "data_size": 63488 00:17:43.188 }, 00:17:43.188 { 00:17:43.188 "name": "BaseBdev2", 00:17:43.188 "uuid": "991217e9-3b07-5a53-bac2-4e98b8fed7d0", 00:17:43.188 "is_configured": true, 00:17:43.188 "data_offset": 2048, 00:17:43.188 "data_size": 63488 00:17:43.188 }, 00:17:43.188 { 00:17:43.188 "name": "BaseBdev3", 00:17:43.188 "uuid": "ca128c41-cdc2-54bb-9852-94cc47ec76af", 00:17:43.188 "is_configured": true, 00:17:43.188 "data_offset": 2048, 00:17:43.188 "data_size": 63488 00:17:43.188 } 00:17:43.188 ] 00:17:43.188 }' 00:17:43.188 00:12:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:43.188 00:12:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:43.755 00:12:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:43.755 00:12:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:44.013 [2024-07-16 00:12:30.778849] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc89e00 00:17:44.949 00:12:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:17:45.208 [2024-07-16 00:12:31.909809] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:17:45.208 [2024-07-16 00:12:31.909863] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:45.208 [2024-07-16 00:12:31.910063] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xc89e00 00:17:45.208 00:12:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # 
local expected_num_base_bdevs 00:17:45.208 00:12:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:45.208 00:12:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:17:45.208 00:12:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:17:45.208 00:12:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:45.208 00:12:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:45.208 00:12:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:45.208 00:12:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:45.208 00:12:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:45.208 00:12:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:45.208 00:12:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:45.208 00:12:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:45.208 00:12:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:45.208 00:12:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:45.208 00:12:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.208 00:12:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:45.208 00:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:45.208 "name": "raid_bdev1", 00:17:45.208 "uuid": "d32a536c-cc68-4b70-affe-79d4fa758adf", 
00:17:45.208 "strip_size_kb": 0, 00:17:45.208 "state": "online", 00:17:45.208 "raid_level": "raid1", 00:17:45.208 "superblock": true, 00:17:45.208 "num_base_bdevs": 3, 00:17:45.208 "num_base_bdevs_discovered": 2, 00:17:45.208 "num_base_bdevs_operational": 2, 00:17:45.208 "base_bdevs_list": [ 00:17:45.208 { 00:17:45.208 "name": null, 00:17:45.208 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:45.208 "is_configured": false, 00:17:45.208 "data_offset": 2048, 00:17:45.208 "data_size": 63488 00:17:45.208 }, 00:17:45.208 { 00:17:45.208 "name": "BaseBdev2", 00:17:45.208 "uuid": "991217e9-3b07-5a53-bac2-4e98b8fed7d0", 00:17:45.208 "is_configured": true, 00:17:45.208 "data_offset": 2048, 00:17:45.208 "data_size": 63488 00:17:45.208 }, 00:17:45.208 { 00:17:45.208 "name": "BaseBdev3", 00:17:45.208 "uuid": "ca128c41-cdc2-54bb-9852-94cc47ec76af", 00:17:45.208 "is_configured": true, 00:17:45.208 "data_offset": 2048, 00:17:45.208 "data_size": 63488 00:17:45.208 } 00:17:45.208 ] 00:17:45.208 }' 00:17:45.208 00:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:45.208 00:12:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:45.777 00:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:46.035 [2024-07-16 00:12:32.945122] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:46.036 [2024-07-16 00:12:32.945160] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:46.036 [2024-07-16 00:12:32.948293] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:46.036 [2024-07-16 00:12:32.948325] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:46.036 [2024-07-16 00:12:32.948396] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 
0, going to free all in destruct 00:17:46.036 [2024-07-16 00:12:32.948408] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe3c280 name raid_bdev1, state offline 00:17:46.036 0 00:17:46.036 00:12:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3549993 00:17:46.036 00:12:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 3549993 ']' 00:17:46.036 00:12:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 3549993 00:17:46.295 00:12:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:17:46.295 00:12:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:46.295 00:12:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3549993 00:17:46.295 00:12:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:46.295 00:12:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:46.295 00:12:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3549993' 00:17:46.295 killing process with pid 3549993 00:17:46.295 00:12:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 3549993 00:17:46.295 [2024-07-16 00:12:33.026580] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:46.295 00:12:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 3549993 00:17:46.295 [2024-07-16 00:12:33.047297] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:46.554 00:12:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.IgNfkqJ24D 00:17:46.554 00:12:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:46.554 00:12:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk 
'{print $6}' 00:17:46.554 00:12:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:46.554 00:12:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:46.554 00:12:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:46.554 00:12:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:46.554 00:12:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:46.554 00:17:46.554 real 0m6.744s 00:17:46.554 user 0m10.601s 00:17:46.554 sys 0m1.205s 00:17:46.554 00:12:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:46.554 00:12:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:46.554 ************************************ 00:17:46.554 END TEST raid_write_error_test 00:17:46.554 ************************************ 00:17:46.554 00:12:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:46.554 00:12:33 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:17:46.554 00:12:33 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:17:46.554 00:12:33 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:17:46.554 00:12:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:46.554 00:12:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:46.554 00:12:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:46.554 ************************************ 00:17:46.554 START TEST raid_state_function_test 00:17:46.554 ************************************ 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:17:46.554 
00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 
-- # local base_bdevs 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=3551013 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3551013' 00:17:46.554 Process raid pid: 3551013 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 3551013 /var/tmp/spdk-raid.sock 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 3551013 ']' 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 
00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:46.554 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:46.554 00:12:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:46.554 [2024-07-16 00:12:33.471710] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:17:46.554 [2024-07-16 00:12:33.471845] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:46.812 [2024-07-16 00:12:33.668049] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:47.069 [2024-07-16 00:12:33.767464] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:47.069 [2024-07-16 00:12:33.828587] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:47.069 [2024-07-16 00:12:33.828616] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:47.636 00:12:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:47.636 00:12:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:17:47.636 00:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:47.636 [2024-07-16 00:12:34.582718] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:47.636 [2024-07-16 
00:12:34.582761] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:47.636 [2024-07-16 00:12:34.582772] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:47.636 [2024-07-16 00:12:34.582784] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:47.636 [2024-07-16 00:12:34.582793] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:47.636 [2024-07-16 00:12:34.582809] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:47.636 [2024-07-16 00:12:34.582818] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:47.636 [2024-07-16 00:12:34.582829] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:47.895 00:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:47.895 00:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:47.895 00:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:47.895 00:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:47.895 00:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:47.895 00:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:47.895 00:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:47.895 00:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:47.895 00:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:47.895 00:12:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:47.895 00:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:47.895 00:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:48.154 00:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:48.154 "name": "Existed_Raid", 00:17:48.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:48.154 "strip_size_kb": 64, 00:17:48.154 "state": "configuring", 00:17:48.154 "raid_level": "raid0", 00:17:48.154 "superblock": false, 00:17:48.154 "num_base_bdevs": 4, 00:17:48.154 "num_base_bdevs_discovered": 0, 00:17:48.154 "num_base_bdevs_operational": 4, 00:17:48.154 "base_bdevs_list": [ 00:17:48.154 { 00:17:48.154 "name": "BaseBdev1", 00:17:48.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:48.154 "is_configured": false, 00:17:48.154 "data_offset": 0, 00:17:48.154 "data_size": 0 00:17:48.154 }, 00:17:48.154 { 00:17:48.154 "name": "BaseBdev2", 00:17:48.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:48.154 "is_configured": false, 00:17:48.154 "data_offset": 0, 00:17:48.154 "data_size": 0 00:17:48.154 }, 00:17:48.154 { 00:17:48.154 "name": "BaseBdev3", 00:17:48.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:48.154 "is_configured": false, 00:17:48.154 "data_offset": 0, 00:17:48.154 "data_size": 0 00:17:48.154 }, 00:17:48.154 { 00:17:48.154 "name": "BaseBdev4", 00:17:48.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:48.154 "is_configured": false, 00:17:48.154 "data_offset": 0, 00:17:48.154 "data_size": 0 00:17:48.154 } 00:17:48.154 ] 00:17:48.154 }' 00:17:48.154 00:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:48.154 00:12:34 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:17:48.721 00:12:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:48.721 [2024-07-16 00:12:35.669451] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:48.721 [2024-07-16 00:12:35.669481] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x168daa0 name Existed_Raid, state configuring 00:17:48.979 00:12:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:48.979 [2024-07-16 00:12:35.918129] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:48.979 [2024-07-16 00:12:35.918156] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:48.979 [2024-07-16 00:12:35.918165] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:48.979 [2024-07-16 00:12:35.918177] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:48.979 [2024-07-16 00:12:35.918193] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:48.979 [2024-07-16 00:12:35.918204] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:48.979 [2024-07-16 00:12:35.918213] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:48.979 [2024-07-16 00:12:35.918224] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:49.238 00:12:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:49.238 [2024-07-16 00:12:36.176657] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:49.238 BaseBdev1 00:17:49.496 00:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:49.496 00:12:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:49.496 00:12:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:49.496 00:12:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:49.496 00:12:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:49.496 00:12:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:49.496 00:12:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:49.757 00:12:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:49.757 [ 00:17:49.757 { 00:17:49.757 "name": "BaseBdev1", 00:17:49.757 "aliases": [ 00:17:49.757 "6fe07169-2920-4983-ba7a-e1db0f5216d9" 00:17:49.757 ], 00:17:49.757 "product_name": "Malloc disk", 00:17:49.757 "block_size": 512, 00:17:49.757 "num_blocks": 65536, 00:17:49.757 "uuid": "6fe07169-2920-4983-ba7a-e1db0f5216d9", 00:17:49.757 "assigned_rate_limits": { 00:17:49.757 "rw_ios_per_sec": 0, 00:17:49.757 "rw_mbytes_per_sec": 0, 00:17:49.757 "r_mbytes_per_sec": 0, 00:17:49.757 "w_mbytes_per_sec": 0 00:17:49.757 }, 00:17:49.757 "claimed": true, 00:17:49.757 "claim_type": "exclusive_write", 00:17:49.757 "zoned": false, 00:17:49.757 "supported_io_types": { 00:17:49.757 "read": 
true, 00:17:49.757 "write": true, 00:17:49.757 "unmap": true, 00:17:49.757 "flush": true, 00:17:49.757 "reset": true, 00:17:49.757 "nvme_admin": false, 00:17:49.757 "nvme_io": false, 00:17:49.757 "nvme_io_md": false, 00:17:49.757 "write_zeroes": true, 00:17:49.757 "zcopy": true, 00:17:49.757 "get_zone_info": false, 00:17:49.757 "zone_management": false, 00:17:49.757 "zone_append": false, 00:17:49.757 "compare": false, 00:17:49.757 "compare_and_write": false, 00:17:49.757 "abort": true, 00:17:49.757 "seek_hole": false, 00:17:49.757 "seek_data": false, 00:17:49.757 "copy": true, 00:17:49.757 "nvme_iov_md": false 00:17:49.757 }, 00:17:49.757 "memory_domains": [ 00:17:49.757 { 00:17:49.757 "dma_device_id": "system", 00:17:49.757 "dma_device_type": 1 00:17:49.757 }, 00:17:49.757 { 00:17:49.757 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:49.757 "dma_device_type": 2 00:17:49.757 } 00:17:49.757 ], 00:17:49.757 "driver_specific": {} 00:17:49.757 } 00:17:49.757 ] 00:17:49.757 00:12:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:49.757 00:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:49.757 00:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:49.757 00:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:49.757 00:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:49.757 00:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:49.757 00:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:49.757 00:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:49.757 00:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:17:49.757 00:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:49.757 00:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:49.757 00:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.757 00:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:50.049 00:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:50.049 "name": "Existed_Raid", 00:17:50.049 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.049 "strip_size_kb": 64, 00:17:50.049 "state": "configuring", 00:17:50.049 "raid_level": "raid0", 00:17:50.049 "superblock": false, 00:17:50.049 "num_base_bdevs": 4, 00:17:50.049 "num_base_bdevs_discovered": 1, 00:17:50.049 "num_base_bdevs_operational": 4, 00:17:50.049 "base_bdevs_list": [ 00:17:50.049 { 00:17:50.049 "name": "BaseBdev1", 00:17:50.049 "uuid": "6fe07169-2920-4983-ba7a-e1db0f5216d9", 00:17:50.049 "is_configured": true, 00:17:50.049 "data_offset": 0, 00:17:50.049 "data_size": 65536 00:17:50.049 }, 00:17:50.049 { 00:17:50.049 "name": "BaseBdev2", 00:17:50.049 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.049 "is_configured": false, 00:17:50.049 "data_offset": 0, 00:17:50.049 "data_size": 0 00:17:50.049 }, 00:17:50.049 { 00:17:50.049 "name": "BaseBdev3", 00:17:50.049 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.049 "is_configured": false, 00:17:50.049 "data_offset": 0, 00:17:50.049 "data_size": 0 00:17:50.049 }, 00:17:50.049 { 00:17:50.049 "name": "BaseBdev4", 00:17:50.049 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.049 "is_configured": false, 00:17:50.049 "data_offset": 0, 00:17:50.049 "data_size": 0 00:17:50.049 } 00:17:50.049 ] 00:17:50.049 }' 
00:17:50.049 00:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:50.049 00:12:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:50.987 00:12:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:50.987 [2024-07-16 00:12:37.768901] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:50.987 [2024-07-16 00:12:37.768943] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x168d310 name Existed_Raid, state configuring 00:17:50.987 00:12:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:51.247 [2024-07-16 00:12:38.017591] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:51.247 [2024-07-16 00:12:38.019025] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:51.247 [2024-07-16 00:12:38.019055] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:51.247 [2024-07-16 00:12:38.019065] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:51.247 [2024-07-16 00:12:38.019077] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:51.247 [2024-07-16 00:12:38.019086] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:51.247 [2024-07-16 00:12:38.019098] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:51.247 00:12:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:51.247 00:12:38 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:51.247 00:12:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:51.247 00:12:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:51.247 00:12:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:51.247 00:12:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:51.247 00:12:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:51.247 00:12:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:51.247 00:12:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:51.247 00:12:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:51.247 00:12:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:51.247 00:12:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:51.247 00:12:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:51.247 00:12:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:51.506 00:12:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:51.506 "name": "Existed_Raid", 00:17:51.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.506 "strip_size_kb": 64, 00:17:51.506 "state": "configuring", 00:17:51.506 "raid_level": "raid0", 00:17:51.506 "superblock": false, 00:17:51.506 "num_base_bdevs": 4, 00:17:51.506 
"num_base_bdevs_discovered": 1, 00:17:51.506 "num_base_bdevs_operational": 4, 00:17:51.506 "base_bdevs_list": [ 00:17:51.506 { 00:17:51.506 "name": "BaseBdev1", 00:17:51.506 "uuid": "6fe07169-2920-4983-ba7a-e1db0f5216d9", 00:17:51.506 "is_configured": true, 00:17:51.506 "data_offset": 0, 00:17:51.506 "data_size": 65536 00:17:51.506 }, 00:17:51.506 { 00:17:51.506 "name": "BaseBdev2", 00:17:51.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.506 "is_configured": false, 00:17:51.506 "data_offset": 0, 00:17:51.506 "data_size": 0 00:17:51.506 }, 00:17:51.506 { 00:17:51.506 "name": "BaseBdev3", 00:17:51.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.506 "is_configured": false, 00:17:51.506 "data_offset": 0, 00:17:51.506 "data_size": 0 00:17:51.506 }, 00:17:51.506 { 00:17:51.506 "name": "BaseBdev4", 00:17:51.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.506 "is_configured": false, 00:17:51.506 "data_offset": 0, 00:17:51.506 "data_size": 0 00:17:51.506 } 00:17:51.506 ] 00:17:51.506 }' 00:17:51.506 00:12:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:51.506 00:12:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:52.073 00:12:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:52.332 [2024-07-16 00:12:39.140654] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:52.332 BaseBdev2 00:17:52.332 00:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:52.332 00:12:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:52.332 00:12:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:52.332 00:12:39 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:52.332 00:12:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:52.332 00:12:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:52.332 00:12:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:52.591 00:12:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:52.855 [ 00:17:52.855 { 00:17:52.855 "name": "BaseBdev2", 00:17:52.856 "aliases": [ 00:17:52.856 "e2120f1f-69f2-4c17-a395-507f52556c68" 00:17:52.856 ], 00:17:52.856 "product_name": "Malloc disk", 00:17:52.856 "block_size": 512, 00:17:52.856 "num_blocks": 65536, 00:17:52.856 "uuid": "e2120f1f-69f2-4c17-a395-507f52556c68", 00:17:52.856 "assigned_rate_limits": { 00:17:52.856 "rw_ios_per_sec": 0, 00:17:52.856 "rw_mbytes_per_sec": 0, 00:17:52.856 "r_mbytes_per_sec": 0, 00:17:52.856 "w_mbytes_per_sec": 0 00:17:52.856 }, 00:17:52.856 "claimed": true, 00:17:52.856 "claim_type": "exclusive_write", 00:17:52.856 "zoned": false, 00:17:52.856 "supported_io_types": { 00:17:52.856 "read": true, 00:17:52.856 "write": true, 00:17:52.856 "unmap": true, 00:17:52.856 "flush": true, 00:17:52.856 "reset": true, 00:17:52.856 "nvme_admin": false, 00:17:52.856 "nvme_io": false, 00:17:52.856 "nvme_io_md": false, 00:17:52.856 "write_zeroes": true, 00:17:52.856 "zcopy": true, 00:17:52.856 "get_zone_info": false, 00:17:52.856 "zone_management": false, 00:17:52.856 "zone_append": false, 00:17:52.856 "compare": false, 00:17:52.856 "compare_and_write": false, 00:17:52.856 "abort": true, 00:17:52.856 "seek_hole": false, 00:17:52.856 "seek_data": false, 00:17:52.856 "copy": 
true, 00:17:52.856 "nvme_iov_md": false 00:17:52.856 }, 00:17:52.856 "memory_domains": [ 00:17:52.856 { 00:17:52.856 "dma_device_id": "system", 00:17:52.856 "dma_device_type": 1 00:17:52.856 }, 00:17:52.856 { 00:17:52.856 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.856 "dma_device_type": 2 00:17:52.856 } 00:17:52.856 ], 00:17:52.856 "driver_specific": {} 00:17:52.856 } 00:17:52.856 ] 00:17:52.856 00:12:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:52.856 00:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:52.856 00:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:52.856 00:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:52.856 00:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:52.856 00:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:52.856 00:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:52.856 00:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:52.856 00:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:52.856 00:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:52.856 00:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:52.856 00:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:52.856 00:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:52.856 00:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.856 00:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:53.113 00:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:53.113 "name": "Existed_Raid", 00:17:53.113 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.113 "strip_size_kb": 64, 00:17:53.113 "state": "configuring", 00:17:53.113 "raid_level": "raid0", 00:17:53.113 "superblock": false, 00:17:53.113 "num_base_bdevs": 4, 00:17:53.113 "num_base_bdevs_discovered": 2, 00:17:53.113 "num_base_bdevs_operational": 4, 00:17:53.113 "base_bdevs_list": [ 00:17:53.113 { 00:17:53.113 "name": "BaseBdev1", 00:17:53.113 "uuid": "6fe07169-2920-4983-ba7a-e1db0f5216d9", 00:17:53.113 "is_configured": true, 00:17:53.113 "data_offset": 0, 00:17:53.113 "data_size": 65536 00:17:53.113 }, 00:17:53.113 { 00:17:53.113 "name": "BaseBdev2", 00:17:53.113 "uuid": "e2120f1f-69f2-4c17-a395-507f52556c68", 00:17:53.113 "is_configured": true, 00:17:53.113 "data_offset": 0, 00:17:53.113 "data_size": 65536 00:17:53.113 }, 00:17:53.113 { 00:17:53.113 "name": "BaseBdev3", 00:17:53.113 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.113 "is_configured": false, 00:17:53.113 "data_offset": 0, 00:17:53.114 "data_size": 0 00:17:53.114 }, 00:17:53.114 { 00:17:53.114 "name": "BaseBdev4", 00:17:53.114 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.114 "is_configured": false, 00:17:53.114 "data_offset": 0, 00:17:53.114 "data_size": 0 00:17:53.114 } 00:17:53.114 ] 00:17:53.114 }' 00:17:53.114 00:12:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:53.114 00:12:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:53.680 00:12:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:53.938 [2024-07-16 00:12:40.780488] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:53.938 BaseBdev3 00:17:53.938 00:12:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:53.938 00:12:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:53.938 00:12:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:53.938 00:12:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:53.938 00:12:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:53.938 00:12:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:53.938 00:12:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:54.196 00:12:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:54.454 [ 00:17:54.454 { 00:17:54.454 "name": "BaseBdev3", 00:17:54.454 "aliases": [ 00:17:54.454 "eb9dc7ce-8a9f-4487-ba6b-0cf92f1f8ee9" 00:17:54.454 ], 00:17:54.454 "product_name": "Malloc disk", 00:17:54.454 "block_size": 512, 00:17:54.454 "num_blocks": 65536, 00:17:54.454 "uuid": "eb9dc7ce-8a9f-4487-ba6b-0cf92f1f8ee9", 00:17:54.454 "assigned_rate_limits": { 00:17:54.454 "rw_ios_per_sec": 0, 00:17:54.454 "rw_mbytes_per_sec": 0, 00:17:54.454 "r_mbytes_per_sec": 0, 00:17:54.454 "w_mbytes_per_sec": 0 00:17:54.454 }, 00:17:54.454 "claimed": true, 00:17:54.454 "claim_type": "exclusive_write", 00:17:54.454 "zoned": 
false, 00:17:54.454 "supported_io_types": { 00:17:54.454 "read": true, 00:17:54.454 "write": true, 00:17:54.454 "unmap": true, 00:17:54.454 "flush": true, 00:17:54.454 "reset": true, 00:17:54.454 "nvme_admin": false, 00:17:54.454 "nvme_io": false, 00:17:54.454 "nvme_io_md": false, 00:17:54.454 "write_zeroes": true, 00:17:54.454 "zcopy": true, 00:17:54.454 "get_zone_info": false, 00:17:54.454 "zone_management": false, 00:17:54.454 "zone_append": false, 00:17:54.454 "compare": false, 00:17:54.454 "compare_and_write": false, 00:17:54.454 "abort": true, 00:17:54.454 "seek_hole": false, 00:17:54.454 "seek_data": false, 00:17:54.454 "copy": true, 00:17:54.454 "nvme_iov_md": false 00:17:54.454 }, 00:17:54.454 "memory_domains": [ 00:17:54.454 { 00:17:54.454 "dma_device_id": "system", 00:17:54.454 "dma_device_type": 1 00:17:54.454 }, 00:17:54.454 { 00:17:54.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:54.454 "dma_device_type": 2 00:17:54.454 } 00:17:54.454 ], 00:17:54.454 "driver_specific": {} 00:17:54.454 } 00:17:54.454 ] 00:17:54.454 00:12:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:54.454 00:12:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:54.454 00:12:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:54.454 00:12:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:54.454 00:12:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:54.454 00:12:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:54.454 00:12:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:54.454 00:12:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:54.454 00:12:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:54.454 00:12:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:54.454 00:12:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:54.454 00:12:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:54.454 00:12:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:54.454 00:12:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.454 00:12:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:54.713 00:12:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:54.713 "name": "Existed_Raid", 00:17:54.713 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.713 "strip_size_kb": 64, 00:17:54.713 "state": "configuring", 00:17:54.713 "raid_level": "raid0", 00:17:54.713 "superblock": false, 00:17:54.713 "num_base_bdevs": 4, 00:17:54.713 "num_base_bdevs_discovered": 3, 00:17:54.713 "num_base_bdevs_operational": 4, 00:17:54.713 "base_bdevs_list": [ 00:17:54.713 { 00:17:54.713 "name": "BaseBdev1", 00:17:54.713 "uuid": "6fe07169-2920-4983-ba7a-e1db0f5216d9", 00:17:54.713 "is_configured": true, 00:17:54.713 "data_offset": 0, 00:17:54.713 "data_size": 65536 00:17:54.713 }, 00:17:54.713 { 00:17:54.713 "name": "BaseBdev2", 00:17:54.713 "uuid": "e2120f1f-69f2-4c17-a395-507f52556c68", 00:17:54.713 "is_configured": true, 00:17:54.713 "data_offset": 0, 00:17:54.713 "data_size": 65536 00:17:54.713 }, 00:17:54.713 { 00:17:54.713 "name": "BaseBdev3", 00:17:54.713 "uuid": "eb9dc7ce-8a9f-4487-ba6b-0cf92f1f8ee9", 00:17:54.713 "is_configured": true, 00:17:54.713 "data_offset": 0, 
00:17:54.713 "data_size": 65536 00:17:54.713 }, 00:17:54.713 { 00:17:54.713 "name": "BaseBdev4", 00:17:54.713 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.713 "is_configured": false, 00:17:54.713 "data_offset": 0, 00:17:54.713 "data_size": 0 00:17:54.713 } 00:17:54.713 ] 00:17:54.713 }' 00:17:54.713 00:12:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:54.713 00:12:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:55.280 00:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:55.537 [2024-07-16 00:12:42.420186] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:55.537 [2024-07-16 00:12:42.420230] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x168e350 00:17:55.537 [2024-07-16 00:12:42.420239] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:55.537 [2024-07-16 00:12:42.420486] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x168e020 00:17:55.537 [2024-07-16 00:12:42.420608] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x168e350 00:17:55.537 [2024-07-16 00:12:42.420617] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x168e350 00:17:55.537 [2024-07-16 00:12:42.420780] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:55.537 BaseBdev4 00:17:55.537 00:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:55.537 00:12:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:55.537 00:12:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:55.537 00:12:42 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:55.537 00:12:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:55.537 00:12:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:55.537 00:12:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:55.795 00:12:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:56.054 [ 00:17:56.054 { 00:17:56.054 "name": "BaseBdev4", 00:17:56.054 "aliases": [ 00:17:56.054 "50181997-44ad-48e6-a9af-35a6eb0c7a0f" 00:17:56.054 ], 00:17:56.054 "product_name": "Malloc disk", 00:17:56.054 "block_size": 512, 00:17:56.054 "num_blocks": 65536, 00:17:56.054 "uuid": "50181997-44ad-48e6-a9af-35a6eb0c7a0f", 00:17:56.054 "assigned_rate_limits": { 00:17:56.054 "rw_ios_per_sec": 0, 00:17:56.054 "rw_mbytes_per_sec": 0, 00:17:56.054 "r_mbytes_per_sec": 0, 00:17:56.054 "w_mbytes_per_sec": 0 00:17:56.054 }, 00:17:56.054 "claimed": true, 00:17:56.054 "claim_type": "exclusive_write", 00:17:56.054 "zoned": false, 00:17:56.054 "supported_io_types": { 00:17:56.054 "read": true, 00:17:56.054 "write": true, 00:17:56.054 "unmap": true, 00:17:56.054 "flush": true, 00:17:56.054 "reset": true, 00:17:56.054 "nvme_admin": false, 00:17:56.054 "nvme_io": false, 00:17:56.054 "nvme_io_md": false, 00:17:56.054 "write_zeroes": true, 00:17:56.054 "zcopy": true, 00:17:56.054 "get_zone_info": false, 00:17:56.054 "zone_management": false, 00:17:56.054 "zone_append": false, 00:17:56.054 "compare": false, 00:17:56.054 "compare_and_write": false, 00:17:56.054 "abort": true, 00:17:56.054 "seek_hole": false, 00:17:56.054 "seek_data": false, 00:17:56.054 "copy": 
true, 00:17:56.054 "nvme_iov_md": false 00:17:56.054 }, 00:17:56.054 "memory_domains": [ 00:17:56.054 { 00:17:56.054 "dma_device_id": "system", 00:17:56.054 "dma_device_type": 1 00:17:56.054 }, 00:17:56.054 { 00:17:56.054 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:56.054 "dma_device_type": 2 00:17:56.054 } 00:17:56.054 ], 00:17:56.054 "driver_specific": {} 00:17:56.054 } 00:17:56.054 ] 00:17:56.054 00:12:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:56.054 00:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:56.054 00:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:56.054 00:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:17:56.054 00:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:56.054 00:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:56.054 00:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:56.054 00:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:56.054 00:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:56.054 00:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:56.054 00:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:56.054 00:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:56.054 00:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:56.054 00:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.054 00:12:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:56.336 00:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:56.336 "name": "Existed_Raid", 00:17:56.336 "uuid": "59a57470-c7c2-47ee-b22c-fb10dc97fc0d", 00:17:56.336 "strip_size_kb": 64, 00:17:56.336 "state": "online", 00:17:56.336 "raid_level": "raid0", 00:17:56.336 "superblock": false, 00:17:56.336 "num_base_bdevs": 4, 00:17:56.336 "num_base_bdevs_discovered": 4, 00:17:56.336 "num_base_bdevs_operational": 4, 00:17:56.336 "base_bdevs_list": [ 00:17:56.336 { 00:17:56.336 "name": "BaseBdev1", 00:17:56.336 "uuid": "6fe07169-2920-4983-ba7a-e1db0f5216d9", 00:17:56.336 "is_configured": true, 00:17:56.336 "data_offset": 0, 00:17:56.336 "data_size": 65536 00:17:56.336 }, 00:17:56.336 { 00:17:56.336 "name": "BaseBdev2", 00:17:56.336 "uuid": "e2120f1f-69f2-4c17-a395-507f52556c68", 00:17:56.336 "is_configured": true, 00:17:56.336 "data_offset": 0, 00:17:56.336 "data_size": 65536 00:17:56.336 }, 00:17:56.336 { 00:17:56.336 "name": "BaseBdev3", 00:17:56.336 "uuid": "eb9dc7ce-8a9f-4487-ba6b-0cf92f1f8ee9", 00:17:56.336 "is_configured": true, 00:17:56.336 "data_offset": 0, 00:17:56.336 "data_size": 65536 00:17:56.336 }, 00:17:56.336 { 00:17:56.336 "name": "BaseBdev4", 00:17:56.336 "uuid": "50181997-44ad-48e6-a9af-35a6eb0c7a0f", 00:17:56.336 "is_configured": true, 00:17:56.336 "data_offset": 0, 00:17:56.336 "data_size": 65536 00:17:56.336 } 00:17:56.336 ] 00:17:56.336 }' 00:17:56.336 00:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:56.336 00:12:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:56.902 00:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties 
Existed_Raid 00:17:56.902 00:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:56.902 00:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:56.902 00:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:56.902 00:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:56.902 00:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:56.902 00:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:56.902 00:12:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:57.162 [2024-07-16 00:12:43.984700] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:57.162 00:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:57.162 "name": "Existed_Raid", 00:17:57.162 "aliases": [ 00:17:57.162 "59a57470-c7c2-47ee-b22c-fb10dc97fc0d" 00:17:57.162 ], 00:17:57.162 "product_name": "Raid Volume", 00:17:57.162 "block_size": 512, 00:17:57.162 "num_blocks": 262144, 00:17:57.162 "uuid": "59a57470-c7c2-47ee-b22c-fb10dc97fc0d", 00:17:57.162 "assigned_rate_limits": { 00:17:57.162 "rw_ios_per_sec": 0, 00:17:57.162 "rw_mbytes_per_sec": 0, 00:17:57.162 "r_mbytes_per_sec": 0, 00:17:57.162 "w_mbytes_per_sec": 0 00:17:57.162 }, 00:17:57.162 "claimed": false, 00:17:57.162 "zoned": false, 00:17:57.162 "supported_io_types": { 00:17:57.162 "read": true, 00:17:57.162 "write": true, 00:17:57.162 "unmap": true, 00:17:57.162 "flush": true, 00:17:57.162 "reset": true, 00:17:57.162 "nvme_admin": false, 00:17:57.162 "nvme_io": false, 00:17:57.162 "nvme_io_md": false, 00:17:57.162 "write_zeroes": true, 00:17:57.162 "zcopy": false, 00:17:57.162 
"get_zone_info": false, 00:17:57.162 "zone_management": false, 00:17:57.162 "zone_append": false, 00:17:57.162 "compare": false, 00:17:57.162 "compare_and_write": false, 00:17:57.162 "abort": false, 00:17:57.162 "seek_hole": false, 00:17:57.162 "seek_data": false, 00:17:57.162 "copy": false, 00:17:57.162 "nvme_iov_md": false 00:17:57.162 }, 00:17:57.162 "memory_domains": [ 00:17:57.162 { 00:17:57.162 "dma_device_id": "system", 00:17:57.162 "dma_device_type": 1 00:17:57.162 }, 00:17:57.162 { 00:17:57.162 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.162 "dma_device_type": 2 00:17:57.162 }, 00:17:57.162 { 00:17:57.162 "dma_device_id": "system", 00:17:57.162 "dma_device_type": 1 00:17:57.162 }, 00:17:57.162 { 00:17:57.162 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.162 "dma_device_type": 2 00:17:57.162 }, 00:17:57.162 { 00:17:57.162 "dma_device_id": "system", 00:17:57.162 "dma_device_type": 1 00:17:57.162 }, 00:17:57.162 { 00:17:57.162 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.162 "dma_device_type": 2 00:17:57.162 }, 00:17:57.162 { 00:17:57.162 "dma_device_id": "system", 00:17:57.162 "dma_device_type": 1 00:17:57.162 }, 00:17:57.162 { 00:17:57.162 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.162 "dma_device_type": 2 00:17:57.162 } 00:17:57.162 ], 00:17:57.162 "driver_specific": { 00:17:57.162 "raid": { 00:17:57.162 "uuid": "59a57470-c7c2-47ee-b22c-fb10dc97fc0d", 00:17:57.162 "strip_size_kb": 64, 00:17:57.162 "state": "online", 00:17:57.162 "raid_level": "raid0", 00:17:57.162 "superblock": false, 00:17:57.162 "num_base_bdevs": 4, 00:17:57.162 "num_base_bdevs_discovered": 4, 00:17:57.162 "num_base_bdevs_operational": 4, 00:17:57.162 "base_bdevs_list": [ 00:17:57.162 { 00:17:57.162 "name": "BaseBdev1", 00:17:57.162 "uuid": "6fe07169-2920-4983-ba7a-e1db0f5216d9", 00:17:57.162 "is_configured": true, 00:17:57.162 "data_offset": 0, 00:17:57.162 "data_size": 65536 00:17:57.162 }, 00:17:57.162 { 00:17:57.162 "name": "BaseBdev2", 00:17:57.162 "uuid": 
"e2120f1f-69f2-4c17-a395-507f52556c68", 00:17:57.162 "is_configured": true, 00:17:57.162 "data_offset": 0, 00:17:57.162 "data_size": 65536 00:17:57.162 }, 00:17:57.162 { 00:17:57.162 "name": "BaseBdev3", 00:17:57.162 "uuid": "eb9dc7ce-8a9f-4487-ba6b-0cf92f1f8ee9", 00:17:57.162 "is_configured": true, 00:17:57.162 "data_offset": 0, 00:17:57.162 "data_size": 65536 00:17:57.162 }, 00:17:57.162 { 00:17:57.162 "name": "BaseBdev4", 00:17:57.162 "uuid": "50181997-44ad-48e6-a9af-35a6eb0c7a0f", 00:17:57.162 "is_configured": true, 00:17:57.162 "data_offset": 0, 00:17:57.162 "data_size": 65536 00:17:57.162 } 00:17:57.162 ] 00:17:57.162 } 00:17:57.162 } 00:17:57.162 }' 00:17:57.162 00:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:57.162 00:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:57.162 BaseBdev2 00:17:57.162 BaseBdev3 00:17:57.162 BaseBdev4' 00:17:57.162 00:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:57.162 00:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:57.162 00:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:57.732 00:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:57.732 "name": "BaseBdev1", 00:17:57.732 "aliases": [ 00:17:57.732 "6fe07169-2920-4983-ba7a-e1db0f5216d9" 00:17:57.732 ], 00:17:57.732 "product_name": "Malloc disk", 00:17:57.732 "block_size": 512, 00:17:57.732 "num_blocks": 65536, 00:17:57.732 "uuid": "6fe07169-2920-4983-ba7a-e1db0f5216d9", 00:17:57.732 "assigned_rate_limits": { 00:17:57.732 "rw_ios_per_sec": 0, 00:17:57.732 "rw_mbytes_per_sec": 0, 00:17:57.732 "r_mbytes_per_sec": 0, 
00:17:57.732 "w_mbytes_per_sec": 0 00:17:57.732 }, 00:17:57.732 "claimed": true, 00:17:57.732 "claim_type": "exclusive_write", 00:17:57.732 "zoned": false, 00:17:57.732 "supported_io_types": { 00:17:57.732 "read": true, 00:17:57.732 "write": true, 00:17:57.732 "unmap": true, 00:17:57.732 "flush": true, 00:17:57.732 "reset": true, 00:17:57.732 "nvme_admin": false, 00:17:57.732 "nvme_io": false, 00:17:57.732 "nvme_io_md": false, 00:17:57.732 "write_zeroes": true, 00:17:57.732 "zcopy": true, 00:17:57.732 "get_zone_info": false, 00:17:57.732 "zone_management": false, 00:17:57.732 "zone_append": false, 00:17:57.732 "compare": false, 00:17:57.732 "compare_and_write": false, 00:17:57.732 "abort": true, 00:17:57.732 "seek_hole": false, 00:17:57.732 "seek_data": false, 00:17:57.732 "copy": true, 00:17:57.732 "nvme_iov_md": false 00:17:57.732 }, 00:17:57.732 "memory_domains": [ 00:17:57.732 { 00:17:57.732 "dma_device_id": "system", 00:17:57.732 "dma_device_type": 1 00:17:57.732 }, 00:17:57.732 { 00:17:57.732 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.732 "dma_device_type": 2 00:17:57.732 } 00:17:57.732 ], 00:17:57.732 "driver_specific": {} 00:17:57.732 }' 00:17:57.732 00:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:57.732 00:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:57.992 00:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:57.992 00:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:57.992 00:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:57.992 00:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:57.992 00:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:57.992 00:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:17:57.992 00:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:57.992 00:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:58.251 00:12:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:58.251 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:58.251 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:58.251 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:58.251 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:58.820 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:58.820 "name": "BaseBdev2", 00:17:58.820 "aliases": [ 00:17:58.820 "e2120f1f-69f2-4c17-a395-507f52556c68" 00:17:58.820 ], 00:17:58.820 "product_name": "Malloc disk", 00:17:58.820 "block_size": 512, 00:17:58.820 "num_blocks": 65536, 00:17:58.820 "uuid": "e2120f1f-69f2-4c17-a395-507f52556c68", 00:17:58.820 "assigned_rate_limits": { 00:17:58.820 "rw_ios_per_sec": 0, 00:17:58.820 "rw_mbytes_per_sec": 0, 00:17:58.820 "r_mbytes_per_sec": 0, 00:17:58.820 "w_mbytes_per_sec": 0 00:17:58.820 }, 00:17:58.820 "claimed": true, 00:17:58.820 "claim_type": "exclusive_write", 00:17:58.820 "zoned": false, 00:17:58.820 "supported_io_types": { 00:17:58.820 "read": true, 00:17:58.820 "write": true, 00:17:58.820 "unmap": true, 00:17:58.820 "flush": true, 00:17:58.820 "reset": true, 00:17:58.820 "nvme_admin": false, 00:17:58.820 "nvme_io": false, 00:17:58.820 "nvme_io_md": false, 00:17:58.820 "write_zeroes": true, 00:17:58.820 "zcopy": true, 00:17:58.820 "get_zone_info": false, 00:17:58.820 "zone_management": false, 00:17:58.820 "zone_append": false, 00:17:58.820 
"compare": false, 00:17:58.820 "compare_and_write": false, 00:17:58.820 "abort": true, 00:17:58.820 "seek_hole": false, 00:17:58.820 "seek_data": false, 00:17:58.820 "copy": true, 00:17:58.820 "nvme_iov_md": false 00:17:58.820 }, 00:17:58.820 "memory_domains": [ 00:17:58.820 { 00:17:58.820 "dma_device_id": "system", 00:17:58.820 "dma_device_type": 1 00:17:58.820 }, 00:17:58.820 { 00:17:58.820 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.820 "dma_device_type": 2 00:17:58.820 } 00:17:58.820 ], 00:17:58.820 "driver_specific": {} 00:17:58.820 }' 00:17:58.820 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:58.820 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:58.820 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:58.820 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:58.820 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:58.820 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:58.820 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:58.820 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:59.078 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:59.078 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:59.078 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:59.079 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:59.079 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:59.079 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3
00:17:59.079 00:12:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:17:59.337 00:12:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:17:59.337 "name": "BaseBdev3",
00:17:59.337 "aliases": [
00:17:59.337 "eb9dc7ce-8a9f-4487-ba6b-0cf92f1f8ee9"
00:17:59.337 ],
00:17:59.337 "product_name": "Malloc disk",
00:17:59.337 "block_size": 512,
00:17:59.337 "num_blocks": 65536,
00:17:59.337 "uuid": "eb9dc7ce-8a9f-4487-ba6b-0cf92f1f8ee9",
00:17:59.338 "assigned_rate_limits": {
00:17:59.338 "rw_ios_per_sec": 0,
00:17:59.338 "rw_mbytes_per_sec": 0,
00:17:59.338 "r_mbytes_per_sec": 0,
00:17:59.338 "w_mbytes_per_sec": 0
00:17:59.338 },
00:17:59.338 "claimed": true,
00:17:59.338 "claim_type": "exclusive_write",
00:17:59.338 "zoned": false,
00:17:59.338 "supported_io_types": {
00:17:59.338 "read": true,
00:17:59.338 "write": true,
00:17:59.338 "unmap": true,
00:17:59.338 "flush": true,
00:17:59.338 "reset": true,
00:17:59.338 "nvme_admin": false,
00:17:59.338 "nvme_io": false,
00:17:59.338 "nvme_io_md": false,
00:17:59.338 "write_zeroes": true,
00:17:59.338 "zcopy": true,
00:17:59.338 "get_zone_info": false,
00:17:59.338 "zone_management": false,
00:17:59.338 "zone_append": false,
00:17:59.338 "compare": false,
00:17:59.338 "compare_and_write": false,
00:17:59.338 "abort": true,
00:17:59.338 "seek_hole": false,
00:17:59.338 "seek_data": false,
00:17:59.338 "copy": true,
00:17:59.338 "nvme_iov_md": false
00:17:59.338 },
00:17:59.338 "memory_domains": [
00:17:59.338 {
00:17:59.338 "dma_device_id": "system",
00:17:59.338 "dma_device_type": 1
00:17:59.338 },
00:17:59.338 {
00:17:59.338 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:17:59.338 "dma_device_type": 2
00:17:59.338 }
00:17:59.338 ],
00:17:59.338 "driver_specific": {}
00:17:59.338 }'
00:17:59.338 00:12:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:17:59.338 00:12:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:17:59.338 00:12:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:17:59.338 00:12:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:17:59.338 00:12:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:17:59.597 00:12:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:17:59.597 00:12:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:17:59.597 00:12:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:17:59.597 00:12:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:17:59.597 00:12:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:17:59.597 00:12:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:17:59.597 00:12:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:17:59.597 00:12:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:17:59.597 00:12:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4
00:17:59.597 00:12:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:18:00.165 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:18:00.165 "name": "BaseBdev4",
00:18:00.165 "aliases": [
00:18:00.165 "50181997-44ad-48e6-a9af-35a6eb0c7a0f"
00:18:00.165 ],
00:18:00.165 "product_name": "Malloc disk",
00:18:00.165 "block_size": 512,
00:18:00.165 "num_blocks": 65536,
00:18:00.165 "uuid": "50181997-44ad-48e6-a9af-35a6eb0c7a0f",
00:18:00.165 "assigned_rate_limits": {
00:18:00.165 "rw_ios_per_sec": 0,
00:18:00.165 "rw_mbytes_per_sec": 0,
00:18:00.165 "r_mbytes_per_sec": 0,
00:18:00.165 "w_mbytes_per_sec": 0
00:18:00.165 },
00:18:00.165 "claimed": true,
00:18:00.166 "claim_type": "exclusive_write",
00:18:00.166 "zoned": false,
00:18:00.166 "supported_io_types": {
00:18:00.166 "read": true,
00:18:00.166 "write": true,
00:18:00.166 "unmap": true,
00:18:00.166 "flush": true,
00:18:00.166 "reset": true,
00:18:00.166 "nvme_admin": false,
00:18:00.166 "nvme_io": false,
00:18:00.166 "nvme_io_md": false,
00:18:00.166 "write_zeroes": true,
00:18:00.166 "zcopy": true,
00:18:00.166 "get_zone_info": false,
00:18:00.166 "zone_management": false,
00:18:00.166 "zone_append": false,
00:18:00.166 "compare": false,
00:18:00.166 "compare_and_write": false,
00:18:00.166 "abort": true,
00:18:00.166 "seek_hole": false,
00:18:00.166 "seek_data": false,
00:18:00.166 "copy": true,
00:18:00.166 "nvme_iov_md": false
00:18:00.166 },
00:18:00.166 "memory_domains": [
00:18:00.166 {
00:18:00.166 "dma_device_id": "system",
00:18:00.166 "dma_device_type": 1
00:18:00.166 },
00:18:00.166 {
00:18:00.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:18:00.166 "dma_device_type": 2
00:18:00.166 }
00:18:00.166 ],
00:18:00.166 "driver_specific": {}
00:18:00.166 }'
00:18:00.166 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:18:00.166 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:18:00.425 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:18:00.425 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:18:00.425 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:18:00.425 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:18:00.425 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:18:00.425 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:18:00.425 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:18:00.425 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:18:00.425 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:18:00.684 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:18:00.684 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
00:18:00.684 [2024-07-16 00:12:47.606037] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:18:00.684 [2024-07-16 00:12:47.606065] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:18:00.684 [2024-07-16 00:12:47.606108] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:18:00.685 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state
00:18:00.685 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0
00:18:00.685 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:18:00.685 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1
00:18:00.685 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline
00:18:00.685 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3
00:18:00.685 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:18:00.685 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline
00:18:00.685 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:18:00.685 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:18:00.685 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:18:00.685 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:18:00.685 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:18:00.685 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:18:00.685 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:18:00.944 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:00.944 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:18:00.944 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:18:00.944 "name": "Existed_Raid",
00:18:00.944 "uuid": "59a57470-c7c2-47ee-b22c-fb10dc97fc0d",
00:18:00.944 "strip_size_kb": 64,
00:18:00.944 "state": "offline",
00:18:00.944 "raid_level": "raid0",
00:18:00.944 "superblock": false,
00:18:00.944 "num_base_bdevs": 4,
00:18:00.944 "num_base_bdevs_discovered": 3,
00:18:00.944 "num_base_bdevs_operational": 3,
00:18:00.944 "base_bdevs_list": [
00:18:00.944 {
00:18:00.944 "name": null,
00:18:00.944 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:00.944 "is_configured": false,
00:18:00.944 "data_offset": 0,
00:18:00.944 "data_size": 65536
00:18:00.944 },
00:18:00.944 {
00:18:00.944 "name": "BaseBdev2",
00:18:00.944 "uuid": "e2120f1f-69f2-4c17-a395-507f52556c68",
00:18:00.944 "is_configured": true,
00:18:00.944 "data_offset": 0,
00:18:00.944 "data_size": 65536
00:18:00.944 },
00:18:00.944 {
00:18:00.944 "name": "BaseBdev3",
00:18:00.944 "uuid": "eb9dc7ce-8a9f-4487-ba6b-0cf92f1f8ee9",
00:18:00.944 "is_configured": true,
00:18:00.944 "data_offset": 0,
00:18:00.944 "data_size": 65536
00:18:00.944 },
00:18:00.944 {
00:18:00.944 "name": "BaseBdev4",
00:18:00.944 "uuid": "50181997-44ad-48e6-a9af-35a6eb0c7a0f",
00:18:00.944 "is_configured": true,
00:18:00.944 "data_offset": 0,
00:18:00.944 "data_size": 65536
00:18:00.944 }
00:18:00.944 ]
00:18:00.944 }'
00:18:00.944 00:12:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:18:00.944 00:12:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:18:01.882 00:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 ))
00:18:01.882 00:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:18:01.882 00:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:01.882 00:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:18:01.882 00:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:18:01.882 00:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:18:01.882 00:12:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2
00:18:02.451 [2024-07-16 00:12:49.143136] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:18:02.451 00:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:18:02.451 00:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:18:02.451 00:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:02.451 00:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:18:02.711 00:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:18:02.711 00:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:18:02.711 00:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3
00:18:02.970 [2024-07-16 00:12:49.913334] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:18:03.229 00:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:18:03.229 00:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:18:03.229 00:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:03.229 00:12:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:18:03.489 00:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:18:03.489 00:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:18:03.489 00:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4
00:18:03.489 [2024-07-16 00:12:50.421404] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4
00:18:03.489 [2024-07-16 00:12:50.421445] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x168e350 name Existed_Raid, state offline
00:18:03.749 00:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:18:03.749 00:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:18:03.749 00:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:03.749 00:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)'
00:18:03.749 00:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev=
00:18:04.009 00:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']'
00:18:04.009 00:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']'
00:18:04.009 00:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 ))
00:18:04.009 00:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:18:04.009 00:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:18:04.009 BaseBdev2
00:18:04.270 00:12:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2
00:18:04.270 00:12:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2
00:18:04.270 00:12:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:18:04.270 00:12:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:18:04.270 00:12:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:18:04.270 00:12:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:18:04.270 00:12:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:18:04.270 00:12:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:18:04.595 [
00:18:04.595 {
00:18:04.595 "name": "BaseBdev2",
00:18:04.595 "aliases": [
00:18:04.595 "fa0ac1b8-97fe-4b15-a5df-f395c89c083c"
00:18:04.595 ],
00:18:04.595 "product_name": "Malloc disk",
00:18:04.595 "block_size": 512,
00:18:04.595 "num_blocks": 65536,
00:18:04.595 "uuid": "fa0ac1b8-97fe-4b15-a5df-f395c89c083c",
00:18:04.595 "assigned_rate_limits": {
00:18:04.595 "rw_ios_per_sec": 0,
00:18:04.595 "rw_mbytes_per_sec": 0,
00:18:04.595 "r_mbytes_per_sec": 0,
00:18:04.595 "w_mbytes_per_sec": 0
00:18:04.595 },
00:18:04.595 "claimed": false,
00:18:04.595 "zoned": false,
00:18:04.595 "supported_io_types": {
00:18:04.595 "read": true,
00:18:04.595 "write": true,
00:18:04.595 "unmap": true,
00:18:04.595 "flush": true,
00:18:04.595 "reset": true,
00:18:04.595 "nvme_admin": false,
00:18:04.595 "nvme_io": false,
00:18:04.595 "nvme_io_md": false,
00:18:04.595 "write_zeroes": true,
00:18:04.595 "zcopy": true,
00:18:04.595 "get_zone_info": false,
00:18:04.595 "zone_management": false,
00:18:04.595 "zone_append": false,
00:18:04.595 "compare": false,
00:18:04.595 "compare_and_write": false,
00:18:04.595 "abort": true,
00:18:04.595 "seek_hole": false,
00:18:04.595 "seek_data": false,
00:18:04.595 "copy": true,
00:18:04.595 "nvme_iov_md": false
00:18:04.595 },
00:18:04.595 "memory_domains": [
00:18:04.595 {
00:18:04.595 "dma_device_id": "system",
00:18:04.595 "dma_device_type": 1
00:18:04.595 },
00:18:04.595 {
00:18:04.595 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:18:04.595 "dma_device_type": 2
00:18:04.595 }
00:18:04.595 ],
00:18:04.595 "driver_specific": {}
00:18:04.595 }
00:18:04.595 ]
00:18:04.595 00:12:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:18:04.595 00:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:18:04.595 00:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:18:04.595 00:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
00:18:04.855 BaseBdev3
00:18:04.855 00:12:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3
00:18:04.855 00:12:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3
00:18:04.855 00:12:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:18:04.855 00:12:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:18:04.855 00:12:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:18:04.855 00:12:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:18:04.855 00:12:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:18:05.115 00:12:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000
00:18:05.374 [
00:18:05.374 {
00:18:05.375 "name": "BaseBdev3",
00:18:05.375 "aliases": [
00:18:05.375 "60608e22-306d-424e-94c9-bf38811e95c0"
00:18:05.375 ],
00:18:05.375 "product_name": "Malloc disk",
00:18:05.375 "block_size": 512,
00:18:05.375 "num_blocks": 65536,
00:18:05.375 "uuid": "60608e22-306d-424e-94c9-bf38811e95c0",
00:18:05.375 "assigned_rate_limits": {
00:18:05.375 "rw_ios_per_sec": 0,
00:18:05.375 "rw_mbytes_per_sec": 0,
00:18:05.375 "r_mbytes_per_sec": 0,
00:18:05.375 "w_mbytes_per_sec": 0
00:18:05.375 },
00:18:05.375 "claimed": false,
00:18:05.375 "zoned": false,
00:18:05.375 "supported_io_types": {
00:18:05.375 "read": true,
00:18:05.375 "write": true,
00:18:05.375 "unmap": true,
00:18:05.375 "flush": true,
00:18:05.375 "reset": true,
00:18:05.375 "nvme_admin": false,
00:18:05.375 "nvme_io": false,
00:18:05.375 "nvme_io_md": false,
00:18:05.375 "write_zeroes": true,
00:18:05.375 "zcopy": true,
00:18:05.375 "get_zone_info": false,
00:18:05.375 "zone_management": false,
00:18:05.375 "zone_append": false,
00:18:05.375 "compare": false,
00:18:05.375 "compare_and_write": false,
00:18:05.375 "abort": true,
00:18:05.375 "seek_hole": false,
00:18:05.375 "seek_data": false,
00:18:05.375 "copy": true,
00:18:05.375 "nvme_iov_md": false
00:18:05.375 },
00:18:05.375 "memory_domains": [
00:18:05.375 {
00:18:05.375 "dma_device_id": "system",
00:18:05.375 "dma_device_type": 1
00:18:05.375 },
00:18:05.375 {
00:18:05.375 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:18:05.375 "dma_device_type": 2
00:18:05.375 }
00:18:05.375 ],
00:18:05.375 "driver_specific": {}
00:18:05.375 }
00:18:05.375 ]
00:18:05.375 00:12:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:18:05.375 00:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:18:05.375 00:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:18:05.375 00:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4
00:18:05.635 BaseBdev4
00:18:05.635 00:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4
00:18:05.635 00:12:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4
00:18:05.635 00:12:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:18:05.635 00:12:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:18:05.635 00:12:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:18:05.635 00:12:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:18:05.635 00:12:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:18:05.895 00:12:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000
00:18:06.154 [
00:18:06.154 {
00:18:06.154 "name": "BaseBdev4",
00:18:06.154 "aliases": [
00:18:06.154 "8283a67f-6a86-4325-9294-40807898d2d6"
00:18:06.154 ],
00:18:06.154 "product_name": "Malloc disk",
00:18:06.154 "block_size": 512,
00:18:06.154 "num_blocks": 65536,
00:18:06.154 "uuid": "8283a67f-6a86-4325-9294-40807898d2d6",
00:18:06.154 "assigned_rate_limits": {
00:18:06.154 "rw_ios_per_sec": 0,
00:18:06.154 "rw_mbytes_per_sec": 0,
00:18:06.154 "r_mbytes_per_sec": 0,
00:18:06.154 "w_mbytes_per_sec": 0
00:18:06.154 },
00:18:06.154 "claimed": false,
00:18:06.154 "zoned": false,
00:18:06.154 "supported_io_types": {
00:18:06.154 "read": true,
00:18:06.154 "write": true,
00:18:06.154 "unmap": true,
00:18:06.154 "flush": true,
00:18:06.154 "reset": true,
00:18:06.154 "nvme_admin": false,
00:18:06.154 "nvme_io": false,
00:18:06.154 "nvme_io_md": false,
00:18:06.154 "write_zeroes": true,
00:18:06.154 "zcopy": true,
00:18:06.154 "get_zone_info": false,
00:18:06.154 "zone_management": false,
00:18:06.154 "zone_append": false,
00:18:06.154 "compare": false,
00:18:06.154 "compare_and_write": false,
00:18:06.154 "abort": true,
00:18:06.154 "seek_hole": false,
00:18:06.154 "seek_data": false,
00:18:06.154 "copy": true,
00:18:06.154 "nvme_iov_md": false
00:18:06.154 },
00:18:06.154 "memory_domains": [
00:18:06.154 {
00:18:06.154 "dma_device_id": "system",
00:18:06.154 "dma_device_type": 1
00:18:06.154 },
00:18:06.154 {
00:18:06.154 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:18:06.154 "dma_device_type": 2
00:18:06.154 }
00:18:06.154 ],
00:18:06.154 "driver_specific": {}
00:18:06.154 }
00:18:06.154 ]
00:18:06.154 00:12:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:18:06.154 00:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:18:06.154 00:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:18:06.154 00:12:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:18:06.423 [2024-07-16 00:12:53.135402] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:18:06.423 [2024-07-16 00:12:53.135443] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:18:06.423 [2024-07-16 00:12:53.135461] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:18:06.423 [2024-07-16 00:12:53.136782] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:18:06.423 [2024-07-16 00:12:53.136821] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:18:06.423 00:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4
00:18:06.423 00:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:18:06.423 00:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:18:06.423 00:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:18:06.423 00:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:18:06.423 00:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:18:06.423 00:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:18:06.423 00:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:18:06.423 00:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:18:06.423 00:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:18:06.423 00:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:06.423 00:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:18:06.683 00:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:18:06.683 "name": "Existed_Raid",
00:18:06.683 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:06.683 "strip_size_kb": 64,
00:18:06.683 "state": "configuring",
00:18:06.683 "raid_level": "raid0",
00:18:06.683 "superblock": false,
00:18:06.683 "num_base_bdevs": 4,
00:18:06.683 "num_base_bdevs_discovered": 3,
00:18:06.683 "num_base_bdevs_operational": 4,
00:18:06.683 "base_bdevs_list": [
00:18:06.683 {
00:18:06.683 "name": "BaseBdev1",
00:18:06.683 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:06.683 "is_configured": false,
00:18:06.683 "data_offset": 0,
00:18:06.683 "data_size": 0
00:18:06.683 },
00:18:06.683 {
00:18:06.683 "name": "BaseBdev2",
00:18:06.683 "uuid": "fa0ac1b8-97fe-4b15-a5df-f395c89c083c",
00:18:06.683 "is_configured": true,
00:18:06.683 "data_offset": 0,
00:18:06.683 "data_size": 65536
00:18:06.683 },
00:18:06.683 {
00:18:06.683 "name": "BaseBdev3",
00:18:06.683 "uuid": "60608e22-306d-424e-94c9-bf38811e95c0",
00:18:06.683 "is_configured": true,
00:18:06.683 "data_offset": 0,
00:18:06.683 "data_size": 65536
00:18:06.683 },
00:18:06.683 {
00:18:06.683 "name": "BaseBdev4",
00:18:06.683 "uuid": "8283a67f-6a86-4325-9294-40807898d2d6",
00:18:06.683 "is_configured": true,
00:18:06.683 "data_offset": 0,
00:18:06.683 "data_size": 65536
00:18:06.683 }
00:18:06.683 ]
00:18:06.683 }'
00:18:06.683 00:12:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:18:06.683 00:12:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:18:07.250 00:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2
00:18:07.508 [2024-07-16 00:12:54.234306] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:18:07.508 00:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4
00:18:07.508 00:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:18:07.508 00:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:18:07.508 00:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:18:07.508 00:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:18:07.508 00:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:18:07.508 00:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:18:07.508 00:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:18:07.508 00:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:18:07.508 00:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:18:07.508 00:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:07.508 00:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:18:07.508 00:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:18:07.508 "name": "Existed_Raid",
00:18:07.508 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:07.508 "strip_size_kb": 64,
00:18:07.508 "state": "configuring",
00:18:07.508 "raid_level": "raid0",
00:18:07.508 "superblock": false,
00:18:07.508 "num_base_bdevs": 4,
00:18:07.508 "num_base_bdevs_discovered": 2,
00:18:07.508 "num_base_bdevs_operational": 4,
00:18:07.508 "base_bdevs_list": [
00:18:07.508 {
00:18:07.508 "name": "BaseBdev1",
00:18:07.508 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:07.508 "is_configured": false,
00:18:07.508 "data_offset": 0,
00:18:07.508 "data_size": 0
00:18:07.508 },
00:18:07.508 {
00:18:07.508 "name": null,
00:18:07.508 "uuid": "fa0ac1b8-97fe-4b15-a5df-f395c89c083c",
00:18:07.508 "is_configured": false,
00:18:07.508 "data_offset": 0,
00:18:07.508 "data_size": 65536
00:18:07.508 },
00:18:07.508 {
00:18:07.508 "name": "BaseBdev3",
00:18:07.508 "uuid": "60608e22-306d-424e-94c9-bf38811e95c0",
00:18:07.508 "is_configured": true,
00:18:07.508 "data_offset": 0,
00:18:07.508 "data_size": 65536
00:18:07.508 },
00:18:07.508 {
00:18:07.508 "name": "BaseBdev4",
00:18:07.508 "uuid": "8283a67f-6a86-4325-9294-40807898d2d6",
00:18:07.508 "is_configured": true,
00:18:07.508 "data_offset": 0,
00:18:07.508 "data_size": 65536
00:18:07.508 }
00:18:07.508 ]
00:18:07.508 }'
00:18:07.508 00:12:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:18:07.508 00:12:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:18:08.446 00:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:08.446 00:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured'
00:18:08.446 00:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]]
00:18:08.446 00:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:18:08.705 [2024-07-16 00:12:55.454077] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:18:08.705 BaseBdev1
00:18:08.705 00:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1
00:18:08.705 00:12:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1
00:18:08.705 00:12:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:18:08.705 00:12:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:18:08.705 00:12:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:18:08.705 00:12:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:18:08.705 00:12:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:18:08.964 00:12:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:18:09.223 [
00:18:09.223 {
00:18:09.223 "name": "BaseBdev1",
00:18:09.223 "aliases": [
00:18:09.223 "331ee72d-42bc-44a4-972b-5ee6e60292a3"
00:18:09.223 ],
00:18:09.223 "product_name": "Malloc disk",
00:18:09.223 "block_size": 512,
00:18:09.223 "num_blocks": 65536,
00:18:09.223 "uuid": "331ee72d-42bc-44a4-972b-5ee6e60292a3",
00:18:09.223 "assigned_rate_limits": {
00:18:09.223 "rw_ios_per_sec": 0,
00:18:09.223 "rw_mbytes_per_sec": 0,
00:18:09.223 "r_mbytes_per_sec": 0,
00:18:09.223 "w_mbytes_per_sec": 0
00:18:09.223 },
00:18:09.223 "claimed": true,
00:18:09.223 "claim_type": "exclusive_write",
00:18:09.223 "zoned": false,
00:18:09.223 "supported_io_types": {
00:18:09.223 "read": true,
00:18:09.223 "write": true,
00:18:09.223 "unmap": true,
00:18:09.223 "flush": true,
00:18:09.223 "reset": true,
00:18:09.223 "nvme_admin": false,
00:18:09.223 "nvme_io": false,
00:18:09.223 "nvme_io_md": false,
00:18:09.223 "write_zeroes": true,
00:18:09.223 "zcopy": true,
00:18:09.223 "get_zone_info": false,
00:18:09.223 "zone_management": false,
00:18:09.223 "zone_append": false,
00:18:09.223 "compare": false,
00:18:09.223 "compare_and_write": false,
00:18:09.223 "abort": true,
00:18:09.223 "seek_hole": false,
00:18:09.223 "seek_data": false,
00:18:09.223 "copy": true,
00:18:09.223 "nvme_iov_md": false
00:18:09.223 },
00:18:09.223 "memory_domains": [
00:18:09.223 {
00:18:09.223 "dma_device_id": "system",
00:18:09.223 "dma_device_type": 1
00:18:09.223 },
00:18:09.223 {
00:18:09.223 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:18:09.223 "dma_device_type": 2
00:18:09.223 }
00:18:09.223 ],
00:18:09.223 "driver_specific": {}
00:18:09.223 }
00:18:09.223 ]
00:18:09.223 00:12:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:18:09.223 00:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4
00:18:09.223 00:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:18:09.223 00:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:18:09.223 00:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:18:09.223 00:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:18:09.223 00:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:18:09.223 00:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:18:09.223 00:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:18:09.223 00:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:18:09.223 00:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:18:09.223 00:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:09.223 00:12:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:18:09.482 00:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:18:09.482 "name": "Existed_Raid",
00:18:09.482 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:09.482 "strip_size_kb": 64,
00:18:09.482 "state": "configuring",
00:18:09.482 "raid_level": "raid0",
00:18:09.482 "superblock": false,
00:18:09.482 "num_base_bdevs": 4,
00:18:09.482 "num_base_bdevs_discovered": 3,
00:18:09.482 "num_base_bdevs_operational": 4,
00:18:09.482 "base_bdevs_list": [
00:18:09.482 {
00:18:09.482 "name": "BaseBdev1",
00:18:09.482 "uuid": "331ee72d-42bc-44a4-972b-5ee6e60292a3",
00:18:09.483 "is_configured": true,
00:18:09.483 "data_offset": 0,
00:18:09.483 "data_size": 65536
00:18:09.483 },
00:18:09.483 {
00:18:09.483 "name": null,
00:18:09.483 "uuid": "fa0ac1b8-97fe-4b15-a5df-f395c89c083c",
00:18:09.483 "is_configured": false,
00:18:09.483 "data_offset": 0,
00:18:09.483 "data_size": 65536
00:18:09.483 },
00:18:09.483 {
00:18:09.483 "name": "BaseBdev3",
00:18:09.483 "uuid": "60608e22-306d-424e-94c9-bf38811e95c0",
00:18:09.483 "is_configured": true,
00:18:09.483 "data_offset": 0,
00:18:09.483 "data_size": 65536
00:18:09.483 },
00:18:09.483 {
00:18:09.483 "name": "BaseBdev4",
00:18:09.483 "uuid": "8283a67f-6a86-4325-9294-40807898d2d6",
00:18:09.483 "is_configured": true,
00:18:09.483 "data_offset": 0,
00:18:09.483 "data_size": 65536
00:18:09.483 }
00:18:09.483 ]
00:18:09.483 }'
00:18:09.483 00:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:18:09.483 00:12:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:18:10.051 00:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:10.051 00:12:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured'
00:18:10.310 00:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]]
00:18:10.310 00:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3
00:18:10.310 [2024-07-16 00:12:57.254878] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:18:10.569 00:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4
00:18:10.569 00:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:18:10.569 00:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:18:10.569 00:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:18:10.569 00:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:18:10.569 00:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:18:10.569 00:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:18:10.569 00:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:18:10.569 00:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:18:10.570 00:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:18:10.570 00:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:10.570 00:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:18:10.829 00:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:18:10.829 "name": "Existed_Raid",
00:18:10.829 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:10.829 "strip_size_kb": 64,
00:18:10.829 "state": "configuring",
00:18:10.829 "raid_level": "raid0",
00:18:10.829 "superblock": false,
00:18:10.829 "num_base_bdevs": 4,
00:18:10.829 "num_base_bdevs_discovered": 2,
00:18:10.829 "num_base_bdevs_operational": 4, 00:18:10.829 "base_bdevs_list": [ 00:18:10.829 { 00:18:10.829 "name": "BaseBdev1", 00:18:10.829 "uuid": "331ee72d-42bc-44a4-972b-5ee6e60292a3", 00:18:10.829 "is_configured": true, 00:18:10.829 "data_offset": 0, 00:18:10.829 "data_size": 65536 00:18:10.829 }, 00:18:10.829 { 00:18:10.829 "name": null, 00:18:10.829 "uuid": "fa0ac1b8-97fe-4b15-a5df-f395c89c083c", 00:18:10.829 "is_configured": false, 00:18:10.829 "data_offset": 0, 00:18:10.829 "data_size": 65536 00:18:10.829 }, 00:18:10.829 { 00:18:10.829 "name": null, 00:18:10.829 "uuid": "60608e22-306d-424e-94c9-bf38811e95c0", 00:18:10.829 "is_configured": false, 00:18:10.829 "data_offset": 0, 00:18:10.829 "data_size": 65536 00:18:10.829 }, 00:18:10.829 { 00:18:10.829 "name": "BaseBdev4", 00:18:10.829 "uuid": "8283a67f-6a86-4325-9294-40807898d2d6", 00:18:10.829 "is_configured": true, 00:18:10.829 "data_offset": 0, 00:18:10.829 "data_size": 65536 00:18:10.829 } 00:18:10.829 ] 00:18:10.829 }' 00:18:10.829 00:12:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:10.829 00:12:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:11.397 00:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:11.397 00:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:11.397 00:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:11.397 00:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:11.656 [2024-07-16 00:12:58.494191] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 
00:18:11.656 00:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:11.656 00:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:11.656 00:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:11.656 00:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:11.656 00:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:11.656 00:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:11.656 00:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:11.656 00:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:11.656 00:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:11.656 00:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:11.656 00:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:11.656 00:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:11.914 00:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:11.914 "name": "Existed_Raid", 00:18:11.914 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:11.914 "strip_size_kb": 64, 00:18:11.914 "state": "configuring", 00:18:11.914 "raid_level": "raid0", 00:18:11.914 "superblock": false, 00:18:11.914 "num_base_bdevs": 4, 00:18:11.914 "num_base_bdevs_discovered": 3, 00:18:11.914 "num_base_bdevs_operational": 4, 00:18:11.914 "base_bdevs_list": [ 
00:18:11.914 { 00:18:11.914 "name": "BaseBdev1", 00:18:11.914 "uuid": "331ee72d-42bc-44a4-972b-5ee6e60292a3", 00:18:11.914 "is_configured": true, 00:18:11.914 "data_offset": 0, 00:18:11.914 "data_size": 65536 00:18:11.914 }, 00:18:11.914 { 00:18:11.914 "name": null, 00:18:11.914 "uuid": "fa0ac1b8-97fe-4b15-a5df-f395c89c083c", 00:18:11.914 "is_configured": false, 00:18:11.914 "data_offset": 0, 00:18:11.914 "data_size": 65536 00:18:11.914 }, 00:18:11.914 { 00:18:11.914 "name": "BaseBdev3", 00:18:11.914 "uuid": "60608e22-306d-424e-94c9-bf38811e95c0", 00:18:11.914 "is_configured": true, 00:18:11.914 "data_offset": 0, 00:18:11.914 "data_size": 65536 00:18:11.914 }, 00:18:11.914 { 00:18:11.914 "name": "BaseBdev4", 00:18:11.914 "uuid": "8283a67f-6a86-4325-9294-40807898d2d6", 00:18:11.914 "is_configured": true, 00:18:11.914 "data_offset": 0, 00:18:11.914 "data_size": 65536 00:18:11.914 } 00:18:11.914 ] 00:18:11.914 }' 00:18:11.914 00:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:11.914 00:12:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:12.480 00:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.480 00:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:12.740 00:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:12.740 00:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:12.998 [2024-07-16 00:12:59.861833] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:12.998 00:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state 
Existed_Raid configuring raid0 64 4 00:18:12.998 00:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:12.999 00:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:12.999 00:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:12.999 00:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:12.999 00:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:12.999 00:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:12.999 00:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:12.999 00:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:12.999 00:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:12.999 00:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.999 00:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:13.256 00:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:13.256 "name": "Existed_Raid", 00:18:13.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:13.256 "strip_size_kb": 64, 00:18:13.256 "state": "configuring", 00:18:13.256 "raid_level": "raid0", 00:18:13.256 "superblock": false, 00:18:13.256 "num_base_bdevs": 4, 00:18:13.256 "num_base_bdevs_discovered": 2, 00:18:13.256 "num_base_bdevs_operational": 4, 00:18:13.256 "base_bdevs_list": [ 00:18:13.256 { 00:18:13.256 "name": null, 00:18:13.256 "uuid": "331ee72d-42bc-44a4-972b-5ee6e60292a3", 
00:18:13.256 "is_configured": false, 00:18:13.256 "data_offset": 0, 00:18:13.256 "data_size": 65536 00:18:13.256 }, 00:18:13.256 { 00:18:13.256 "name": null, 00:18:13.256 "uuid": "fa0ac1b8-97fe-4b15-a5df-f395c89c083c", 00:18:13.256 "is_configured": false, 00:18:13.256 "data_offset": 0, 00:18:13.256 "data_size": 65536 00:18:13.256 }, 00:18:13.256 { 00:18:13.256 "name": "BaseBdev3", 00:18:13.256 "uuid": "60608e22-306d-424e-94c9-bf38811e95c0", 00:18:13.256 "is_configured": true, 00:18:13.256 "data_offset": 0, 00:18:13.256 "data_size": 65536 00:18:13.256 }, 00:18:13.256 { 00:18:13.256 "name": "BaseBdev4", 00:18:13.256 "uuid": "8283a67f-6a86-4325-9294-40807898d2d6", 00:18:13.256 "is_configured": true, 00:18:13.256 "data_offset": 0, 00:18:13.256 "data_size": 65536 00:18:13.256 } 00:18:13.256 ] 00:18:13.256 }' 00:18:13.256 00:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:13.256 00:13:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:14.191 00:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.191 00:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:14.191 00:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:14.191 00:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:14.449 [2024-07-16 00:13:01.281695] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:14.449 00:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:14.449 00:13:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:14.449 00:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:14.449 00:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:14.449 00:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:14.449 00:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:14.449 00:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:14.449 00:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:14.449 00:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:14.449 00:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:14.449 00:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.449 00:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:14.760 00:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:14.760 "name": "Existed_Raid", 00:18:14.760 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:14.760 "strip_size_kb": 64, 00:18:14.760 "state": "configuring", 00:18:14.760 "raid_level": "raid0", 00:18:14.760 "superblock": false, 00:18:14.760 "num_base_bdevs": 4, 00:18:14.760 "num_base_bdevs_discovered": 3, 00:18:14.760 "num_base_bdevs_operational": 4, 00:18:14.760 "base_bdevs_list": [ 00:18:14.760 { 00:18:14.760 "name": null, 00:18:14.760 "uuid": "331ee72d-42bc-44a4-972b-5ee6e60292a3", 00:18:14.760 "is_configured": false, 00:18:14.760 "data_offset": 0, 
00:18:14.760 "data_size": 65536 00:18:14.760 }, 00:18:14.760 { 00:18:14.760 "name": "BaseBdev2", 00:18:14.760 "uuid": "fa0ac1b8-97fe-4b15-a5df-f395c89c083c", 00:18:14.760 "is_configured": true, 00:18:14.760 "data_offset": 0, 00:18:14.760 "data_size": 65536 00:18:14.760 }, 00:18:14.760 { 00:18:14.760 "name": "BaseBdev3", 00:18:14.760 "uuid": "60608e22-306d-424e-94c9-bf38811e95c0", 00:18:14.760 "is_configured": true, 00:18:14.760 "data_offset": 0, 00:18:14.760 "data_size": 65536 00:18:14.760 }, 00:18:14.760 { 00:18:14.760 "name": "BaseBdev4", 00:18:14.760 "uuid": "8283a67f-6a86-4325-9294-40807898d2d6", 00:18:14.760 "is_configured": true, 00:18:14.760 "data_offset": 0, 00:18:14.760 "data_size": 65536 00:18:14.760 } 00:18:14.760 ] 00:18:14.760 }' 00:18:14.760 00:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:14.760 00:13:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:15.326 00:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.326 00:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:15.584 00:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:15.585 00:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.585 00:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:15.842 00:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 331ee72d-42bc-44a4-972b-5ee6e60292a3 00:18:16.101 
[2024-07-16 00:13:02.878449] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:16.101 [2024-07-16 00:13:02.878484] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1692040 00:18:16.101 [2024-07-16 00:13:02.878492] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:16.101 [2024-07-16 00:13:02.878686] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x168da70 00:18:16.101 [2024-07-16 00:13:02.878799] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1692040 00:18:16.101 [2024-07-16 00:13:02.878809] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1692040 00:18:16.101 [2024-07-16 00:13:02.878978] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:16.101 NewBaseBdev 00:18:16.101 00:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:16.101 00:13:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:16.101 00:13:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:16.101 00:13:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:16.101 00:13:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:16.101 00:13:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:16.101 00:13:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:16.359 00:13:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 
00:18:16.618 [ 00:18:16.618 { 00:18:16.618 "name": "NewBaseBdev", 00:18:16.618 "aliases": [ 00:18:16.618 "331ee72d-42bc-44a4-972b-5ee6e60292a3" 00:18:16.618 ], 00:18:16.618 "product_name": "Malloc disk", 00:18:16.618 "block_size": 512, 00:18:16.618 "num_blocks": 65536, 00:18:16.618 "uuid": "331ee72d-42bc-44a4-972b-5ee6e60292a3", 00:18:16.618 "assigned_rate_limits": { 00:18:16.618 "rw_ios_per_sec": 0, 00:18:16.618 "rw_mbytes_per_sec": 0, 00:18:16.618 "r_mbytes_per_sec": 0, 00:18:16.618 "w_mbytes_per_sec": 0 00:18:16.618 }, 00:18:16.618 "claimed": true, 00:18:16.618 "claim_type": "exclusive_write", 00:18:16.618 "zoned": false, 00:18:16.618 "supported_io_types": { 00:18:16.618 "read": true, 00:18:16.618 "write": true, 00:18:16.618 "unmap": true, 00:18:16.618 "flush": true, 00:18:16.618 "reset": true, 00:18:16.618 "nvme_admin": false, 00:18:16.618 "nvme_io": false, 00:18:16.618 "nvme_io_md": false, 00:18:16.618 "write_zeroes": true, 00:18:16.618 "zcopy": true, 00:18:16.618 "get_zone_info": false, 00:18:16.618 "zone_management": false, 00:18:16.618 "zone_append": false, 00:18:16.618 "compare": false, 00:18:16.618 "compare_and_write": false, 00:18:16.618 "abort": true, 00:18:16.618 "seek_hole": false, 00:18:16.618 "seek_data": false, 00:18:16.618 "copy": true, 00:18:16.618 "nvme_iov_md": false 00:18:16.618 }, 00:18:16.618 "memory_domains": [ 00:18:16.618 { 00:18:16.618 "dma_device_id": "system", 00:18:16.618 "dma_device_type": 1 00:18:16.618 }, 00:18:16.618 { 00:18:16.618 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.618 "dma_device_type": 2 00:18:16.618 } 00:18:16.618 ], 00:18:16.618 "driver_specific": {} 00:18:16.618 } 00:18:16.618 ] 00:18:16.618 00:13:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:16.618 00:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:16.618 00:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # 
local raid_bdev_name=Existed_Raid 00:18:16.618 00:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:16.618 00:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:16.618 00:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:16.618 00:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:16.618 00:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:16.618 00:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:16.618 00:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:16.618 00:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:16.618 00:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.618 00:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:16.875 00:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:16.875 "name": "Existed_Raid", 00:18:16.875 "uuid": "0abe50bb-c629-4386-90d3-fe14fe8ff615", 00:18:16.875 "strip_size_kb": 64, 00:18:16.875 "state": "online", 00:18:16.875 "raid_level": "raid0", 00:18:16.875 "superblock": false, 00:18:16.875 "num_base_bdevs": 4, 00:18:16.875 "num_base_bdevs_discovered": 4, 00:18:16.875 "num_base_bdevs_operational": 4, 00:18:16.875 "base_bdevs_list": [ 00:18:16.875 { 00:18:16.875 "name": "NewBaseBdev", 00:18:16.875 "uuid": "331ee72d-42bc-44a4-972b-5ee6e60292a3", 00:18:16.875 "is_configured": true, 00:18:16.875 "data_offset": 0, 00:18:16.875 "data_size": 65536 00:18:16.875 }, 00:18:16.875 { 
00:18:16.875 "name": "BaseBdev2", 00:18:16.875 "uuid": "fa0ac1b8-97fe-4b15-a5df-f395c89c083c", 00:18:16.875 "is_configured": true, 00:18:16.875 "data_offset": 0, 00:18:16.875 "data_size": 65536 00:18:16.875 }, 00:18:16.875 { 00:18:16.875 "name": "BaseBdev3", 00:18:16.875 "uuid": "60608e22-306d-424e-94c9-bf38811e95c0", 00:18:16.875 "is_configured": true, 00:18:16.875 "data_offset": 0, 00:18:16.875 "data_size": 65536 00:18:16.875 }, 00:18:16.875 { 00:18:16.875 "name": "BaseBdev4", 00:18:16.875 "uuid": "8283a67f-6a86-4325-9294-40807898d2d6", 00:18:16.875 "is_configured": true, 00:18:16.875 "data_offset": 0, 00:18:16.875 "data_size": 65536 00:18:16.875 } 00:18:16.875 ] 00:18:16.875 }' 00:18:16.875 00:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:16.875 00:13:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:17.442 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:17.442 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:17.442 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:17.442 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:17.442 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:17.442 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:17.442 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:17.442 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:17.702 [2024-07-16 00:13:04.495206] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 
00:18:17.702 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:17.702 "name": "Existed_Raid", 00:18:17.702 "aliases": [ 00:18:17.702 "0abe50bb-c629-4386-90d3-fe14fe8ff615" 00:18:17.702 ], 00:18:17.702 "product_name": "Raid Volume", 00:18:17.702 "block_size": 512, 00:18:17.702 "num_blocks": 262144, 00:18:17.702 "uuid": "0abe50bb-c629-4386-90d3-fe14fe8ff615", 00:18:17.702 "assigned_rate_limits": { 00:18:17.702 "rw_ios_per_sec": 0, 00:18:17.702 "rw_mbytes_per_sec": 0, 00:18:17.702 "r_mbytes_per_sec": 0, 00:18:17.702 "w_mbytes_per_sec": 0 00:18:17.702 }, 00:18:17.702 "claimed": false, 00:18:17.702 "zoned": false, 00:18:17.702 "supported_io_types": { 00:18:17.702 "read": true, 00:18:17.702 "write": true, 00:18:17.702 "unmap": true, 00:18:17.702 "flush": true, 00:18:17.702 "reset": true, 00:18:17.702 "nvme_admin": false, 00:18:17.702 "nvme_io": false, 00:18:17.702 "nvme_io_md": false, 00:18:17.702 "write_zeroes": true, 00:18:17.702 "zcopy": false, 00:18:17.702 "get_zone_info": false, 00:18:17.702 "zone_management": false, 00:18:17.702 "zone_append": false, 00:18:17.702 "compare": false, 00:18:17.702 "compare_and_write": false, 00:18:17.702 "abort": false, 00:18:17.702 "seek_hole": false, 00:18:17.702 "seek_data": false, 00:18:17.702 "copy": false, 00:18:17.702 "nvme_iov_md": false 00:18:17.702 }, 00:18:17.702 "memory_domains": [ 00:18:17.702 { 00:18:17.702 "dma_device_id": "system", 00:18:17.702 "dma_device_type": 1 00:18:17.702 }, 00:18:17.702 { 00:18:17.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.702 "dma_device_type": 2 00:18:17.702 }, 00:18:17.702 { 00:18:17.702 "dma_device_id": "system", 00:18:17.702 "dma_device_type": 1 00:18:17.702 }, 00:18:17.702 { 00:18:17.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.702 "dma_device_type": 2 00:18:17.702 }, 00:18:17.702 { 00:18:17.702 "dma_device_id": "system", 00:18:17.702 "dma_device_type": 1 00:18:17.702 }, 00:18:17.702 { 00:18:17.702 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:18:17.702 "dma_device_type": 2 00:18:17.702 }, 00:18:17.702 { 00:18:17.702 "dma_device_id": "system", 00:18:17.702 "dma_device_type": 1 00:18:17.702 }, 00:18:17.702 { 00:18:17.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.702 "dma_device_type": 2 00:18:17.702 } 00:18:17.702 ], 00:18:17.702 "driver_specific": { 00:18:17.702 "raid": { 00:18:17.702 "uuid": "0abe50bb-c629-4386-90d3-fe14fe8ff615", 00:18:17.702 "strip_size_kb": 64, 00:18:17.702 "state": "online", 00:18:17.702 "raid_level": "raid0", 00:18:17.702 "superblock": false, 00:18:17.702 "num_base_bdevs": 4, 00:18:17.702 "num_base_bdevs_discovered": 4, 00:18:17.702 "num_base_bdevs_operational": 4, 00:18:17.702 "base_bdevs_list": [ 00:18:17.702 { 00:18:17.702 "name": "NewBaseBdev", 00:18:17.702 "uuid": "331ee72d-42bc-44a4-972b-5ee6e60292a3", 00:18:17.702 "is_configured": true, 00:18:17.702 "data_offset": 0, 00:18:17.702 "data_size": 65536 00:18:17.702 }, 00:18:17.702 { 00:18:17.702 "name": "BaseBdev2", 00:18:17.702 "uuid": "fa0ac1b8-97fe-4b15-a5df-f395c89c083c", 00:18:17.702 "is_configured": true, 00:18:17.702 "data_offset": 0, 00:18:17.702 "data_size": 65536 00:18:17.702 }, 00:18:17.702 { 00:18:17.702 "name": "BaseBdev3", 00:18:17.702 "uuid": "60608e22-306d-424e-94c9-bf38811e95c0", 00:18:17.702 "is_configured": true, 00:18:17.702 "data_offset": 0, 00:18:17.702 "data_size": 65536 00:18:17.702 }, 00:18:17.702 { 00:18:17.702 "name": "BaseBdev4", 00:18:17.702 "uuid": "8283a67f-6a86-4325-9294-40807898d2d6", 00:18:17.702 "is_configured": true, 00:18:17.702 "data_offset": 0, 00:18:17.702 "data_size": 65536 00:18:17.702 } 00:18:17.702 ] 00:18:17.702 } 00:18:17.702 } 00:18:17.702 }' 00:18:17.702 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:17.702 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:17.702 
BaseBdev2 00:18:17.702 BaseBdev3 00:18:17.702 BaseBdev4' 00:18:17.702 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:17.702 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:17.702 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:17.962 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:17.962 "name": "NewBaseBdev", 00:18:17.962 "aliases": [ 00:18:17.962 "331ee72d-42bc-44a4-972b-5ee6e60292a3" 00:18:17.962 ], 00:18:17.962 "product_name": "Malloc disk", 00:18:17.962 "block_size": 512, 00:18:17.962 "num_blocks": 65536, 00:18:17.962 "uuid": "331ee72d-42bc-44a4-972b-5ee6e60292a3", 00:18:17.962 "assigned_rate_limits": { 00:18:17.962 "rw_ios_per_sec": 0, 00:18:17.962 "rw_mbytes_per_sec": 0, 00:18:17.962 "r_mbytes_per_sec": 0, 00:18:17.962 "w_mbytes_per_sec": 0 00:18:17.962 }, 00:18:17.962 "claimed": true, 00:18:17.962 "claim_type": "exclusive_write", 00:18:17.962 "zoned": false, 00:18:17.962 "supported_io_types": { 00:18:17.962 "read": true, 00:18:17.962 "write": true, 00:18:17.962 "unmap": true, 00:18:17.962 "flush": true, 00:18:17.962 "reset": true, 00:18:17.962 "nvme_admin": false, 00:18:17.962 "nvme_io": false, 00:18:17.962 "nvme_io_md": false, 00:18:17.962 "write_zeroes": true, 00:18:17.962 "zcopy": true, 00:18:17.962 "get_zone_info": false, 00:18:17.962 "zone_management": false, 00:18:17.962 "zone_append": false, 00:18:17.962 "compare": false, 00:18:17.962 "compare_and_write": false, 00:18:17.962 "abort": true, 00:18:17.962 "seek_hole": false, 00:18:17.962 "seek_data": false, 00:18:17.962 "copy": true, 00:18:17.962 "nvme_iov_md": false 00:18:17.962 }, 00:18:17.962 "memory_domains": [ 00:18:17.962 { 00:18:17.962 "dma_device_id": "system", 00:18:17.962 "dma_device_type": 1 
00:18:17.962 }, 00:18:17.962 { 00:18:17.962 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.962 "dma_device_type": 2 00:18:17.962 } 00:18:17.962 ], 00:18:17.962 "driver_specific": {} 00:18:17.962 }' 00:18:17.962 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.962 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.962 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:17.962 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.222 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.222 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:18.222 00:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.222 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.222 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:18.222 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.222 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.482 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:18.482 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:18.482 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:18.482 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:18.744 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:18.744 "name": "BaseBdev2", 
00:18:18.744 "aliases": [ 00:18:18.744 "fa0ac1b8-97fe-4b15-a5df-f395c89c083c" 00:18:18.744 ], 00:18:18.744 "product_name": "Malloc disk", 00:18:18.744 "block_size": 512, 00:18:18.744 "num_blocks": 65536, 00:18:18.744 "uuid": "fa0ac1b8-97fe-4b15-a5df-f395c89c083c", 00:18:18.744 "assigned_rate_limits": { 00:18:18.744 "rw_ios_per_sec": 0, 00:18:18.744 "rw_mbytes_per_sec": 0, 00:18:18.744 "r_mbytes_per_sec": 0, 00:18:18.744 "w_mbytes_per_sec": 0 00:18:18.744 }, 00:18:18.744 "claimed": true, 00:18:18.744 "claim_type": "exclusive_write", 00:18:18.744 "zoned": false, 00:18:18.744 "supported_io_types": { 00:18:18.744 "read": true, 00:18:18.744 "write": true, 00:18:18.744 "unmap": true, 00:18:18.744 "flush": true, 00:18:18.744 "reset": true, 00:18:18.744 "nvme_admin": false, 00:18:18.744 "nvme_io": false, 00:18:18.744 "nvme_io_md": false, 00:18:18.744 "write_zeroes": true, 00:18:18.744 "zcopy": true, 00:18:18.744 "get_zone_info": false, 00:18:18.744 "zone_management": false, 00:18:18.744 "zone_append": false, 00:18:18.744 "compare": false, 00:18:18.744 "compare_and_write": false, 00:18:18.744 "abort": true, 00:18:18.744 "seek_hole": false, 00:18:18.744 "seek_data": false, 00:18:18.744 "copy": true, 00:18:18.744 "nvme_iov_md": false 00:18:18.744 }, 00:18:18.744 "memory_domains": [ 00:18:18.744 { 00:18:18.744 "dma_device_id": "system", 00:18:18.744 "dma_device_type": 1 00:18:18.744 }, 00:18:18.744 { 00:18:18.744 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:18.744 "dma_device_type": 2 00:18:18.744 } 00:18:18.744 ], 00:18:18.744 "driver_specific": {} 00:18:18.744 }' 00:18:18.744 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.744 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.744 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:18.744 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:18:18.744 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:19.092 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:19.092 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:19.092 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:19.092 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:19.092 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:19.092 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:19.092 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:19.092 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:19.092 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:19.092 00:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:19.351 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:19.351 "name": "BaseBdev3", 00:18:19.351 "aliases": [ 00:18:19.351 "60608e22-306d-424e-94c9-bf38811e95c0" 00:18:19.351 ], 00:18:19.351 "product_name": "Malloc disk", 00:18:19.351 "block_size": 512, 00:18:19.351 "num_blocks": 65536, 00:18:19.351 "uuid": "60608e22-306d-424e-94c9-bf38811e95c0", 00:18:19.351 "assigned_rate_limits": { 00:18:19.351 "rw_ios_per_sec": 0, 00:18:19.351 "rw_mbytes_per_sec": 0, 00:18:19.351 "r_mbytes_per_sec": 0, 00:18:19.352 "w_mbytes_per_sec": 0 00:18:19.352 }, 00:18:19.352 "claimed": true, 00:18:19.352 "claim_type": "exclusive_write", 00:18:19.352 "zoned": false, 00:18:19.352 "supported_io_types": { 00:18:19.352 
"read": true, 00:18:19.352 "write": true, 00:18:19.352 "unmap": true, 00:18:19.352 "flush": true, 00:18:19.352 "reset": true, 00:18:19.352 "nvme_admin": false, 00:18:19.352 "nvme_io": false, 00:18:19.352 "nvme_io_md": false, 00:18:19.352 "write_zeroes": true, 00:18:19.352 "zcopy": true, 00:18:19.352 "get_zone_info": false, 00:18:19.352 "zone_management": false, 00:18:19.352 "zone_append": false, 00:18:19.352 "compare": false, 00:18:19.352 "compare_and_write": false, 00:18:19.352 "abort": true, 00:18:19.352 "seek_hole": false, 00:18:19.352 "seek_data": false, 00:18:19.352 "copy": true, 00:18:19.352 "nvme_iov_md": false 00:18:19.352 }, 00:18:19.352 "memory_domains": [ 00:18:19.352 { 00:18:19.352 "dma_device_id": "system", 00:18:19.352 "dma_device_type": 1 00:18:19.352 }, 00:18:19.352 { 00:18:19.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.352 "dma_device_type": 2 00:18:19.352 } 00:18:19.352 ], 00:18:19.352 "driver_specific": {} 00:18:19.352 }' 00:18:19.352 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:19.352 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:19.352 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:19.352 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:19.352 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:19.611 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:19.611 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:19.611 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:19.611 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:19.611 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:18:19.611 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:19.611 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:19.611 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:19.611 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:19.611 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:19.880 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:19.880 "name": "BaseBdev4", 00:18:19.880 "aliases": [ 00:18:19.880 "8283a67f-6a86-4325-9294-40807898d2d6" 00:18:19.880 ], 00:18:19.880 "product_name": "Malloc disk", 00:18:19.880 "block_size": 512, 00:18:19.880 "num_blocks": 65536, 00:18:19.880 "uuid": "8283a67f-6a86-4325-9294-40807898d2d6", 00:18:19.880 "assigned_rate_limits": { 00:18:19.880 "rw_ios_per_sec": 0, 00:18:19.880 "rw_mbytes_per_sec": 0, 00:18:19.880 "r_mbytes_per_sec": 0, 00:18:19.880 "w_mbytes_per_sec": 0 00:18:19.880 }, 00:18:19.880 "claimed": true, 00:18:19.880 "claim_type": "exclusive_write", 00:18:19.880 "zoned": false, 00:18:19.880 "supported_io_types": { 00:18:19.880 "read": true, 00:18:19.880 "write": true, 00:18:19.880 "unmap": true, 00:18:19.880 "flush": true, 00:18:19.880 "reset": true, 00:18:19.880 "nvme_admin": false, 00:18:19.880 "nvme_io": false, 00:18:19.880 "nvme_io_md": false, 00:18:19.880 "write_zeroes": true, 00:18:19.880 "zcopy": true, 00:18:19.880 "get_zone_info": false, 00:18:19.880 "zone_management": false, 00:18:19.880 "zone_append": false, 00:18:19.880 "compare": false, 00:18:19.880 "compare_and_write": false, 00:18:19.880 "abort": true, 00:18:19.880 "seek_hole": false, 00:18:19.880 "seek_data": false, 00:18:19.880 "copy": true, 00:18:19.880 "nvme_iov_md": 
false 00:18:19.880 }, 00:18:19.880 "memory_domains": [ 00:18:19.880 { 00:18:19.880 "dma_device_id": "system", 00:18:19.880 "dma_device_type": 1 00:18:19.880 }, 00:18:19.880 { 00:18:19.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.880 "dma_device_type": 2 00:18:19.880 } 00:18:19.880 ], 00:18:19.880 "driver_specific": {} 00:18:19.880 }' 00:18:19.880 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:19.880 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:20.139 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:20.139 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:20.139 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:20.139 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:20.139 00:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:20.139 00:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:20.139 00:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:20.139 00:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:20.398 00:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:20.398 00:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:20.398 00:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:20.656 [2024-07-16 00:13:07.414657] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:20.656 [2024-07-16 00:13:07.414684] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:18:20.656 [2024-07-16 00:13:07.414732] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:20.656 [2024-07-16 00:13:07.414787] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:20.656 [2024-07-16 00:13:07.414799] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1692040 name Existed_Raid, state offline 00:18:20.656 00:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 3551013 00:18:20.656 00:13:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 3551013 ']' 00:18:20.656 00:13:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 3551013 00:18:20.656 00:13:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:18:20.656 00:13:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:20.656 00:13:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3551013 00:18:20.656 00:13:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:20.656 00:13:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:20.656 00:13:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3551013' 00:18:20.657 killing process with pid 3551013 00:18:20.657 00:13:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 3551013 00:18:20.657 [2024-07-16 00:13:07.484907] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:20.657 00:13:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 3551013 00:18:20.657 [2024-07-16 00:13:07.521223] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:20.914 
00:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:18:20.914 00:18:20.914 real 0m34.367s 00:18:20.914 user 1m3.146s 00:18:20.914 sys 0m6.143s 00:18:20.914 00:13:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:20.914 00:13:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:20.914 ************************************ 00:18:20.914 END TEST raid_state_function_test 00:18:20.914 ************************************ 00:18:20.914 00:13:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:20.914 00:13:07 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:18:20.914 00:13:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:20.914 00:13:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:20.914 00:13:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:20.914 ************************************ 00:18:20.914 START TEST raid_state_function_test_sb 00:18:20.914 ************************************ 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:20.915 00:13:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:20.915 
00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=3556118 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3556118' 00:18:20.915 Process raid pid: 3556118 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 3556118 /var/tmp/spdk-raid.sock 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3556118 ']' 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:20.915 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:20.915 00:13:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:21.172 [2024-07-16 00:13:07.870193] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:18:21.172 [2024-07-16 00:13:07.870258] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:21.172 [2024-07-16 00:13:07.999568] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:21.172 [2024-07-16 00:13:08.103803] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:21.430 [2024-07-16 00:13:08.172462] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:21.430 [2024-07-16 00:13:08.172499] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:21.997 00:13:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:21.997 00:13:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:18:21.997 00:13:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:22.256 [2024-07-16 00:13:09.028076] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:22.256 [2024-07-16 00:13:09.028120] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:22.256 [2024-07-16 00:13:09.028131] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:22.256 [2024-07-16 00:13:09.028143] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:22.256 [2024-07-16 00:13:09.028152] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:22.256 [2024-07-16 00:13:09.028163] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:22.256 [2024-07-16 00:13:09.028172] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:22.256 [2024-07-16 00:13:09.028183] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:22.256 00:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:22.256 00:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:22.256 00:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:22.256 00:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:22.256 00:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:22.256 00:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:22.256 00:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:22.256 00:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:22.256 00:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:22.256 00:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:22.256 00:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:18:22.256 00:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:22.515 00:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:22.515 "name": "Existed_Raid", 00:18:22.515 "uuid": "0c4e834c-6e0c-498e-b1ec-aeb528dc043f", 00:18:22.515 "strip_size_kb": 64, 00:18:22.515 "state": "configuring", 00:18:22.515 "raid_level": "raid0", 00:18:22.515 "superblock": true, 00:18:22.515 "num_base_bdevs": 4, 00:18:22.515 "num_base_bdevs_discovered": 0, 00:18:22.515 "num_base_bdevs_operational": 4, 00:18:22.515 "base_bdevs_list": [ 00:18:22.515 { 00:18:22.515 "name": "BaseBdev1", 00:18:22.515 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:22.515 "is_configured": false, 00:18:22.515 "data_offset": 0, 00:18:22.515 "data_size": 0 00:18:22.515 }, 00:18:22.515 { 00:18:22.515 "name": "BaseBdev2", 00:18:22.515 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:22.515 "is_configured": false, 00:18:22.515 "data_offset": 0, 00:18:22.515 "data_size": 0 00:18:22.515 }, 00:18:22.515 { 00:18:22.515 "name": "BaseBdev3", 00:18:22.515 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:22.515 "is_configured": false, 00:18:22.515 "data_offset": 0, 00:18:22.515 "data_size": 0 00:18:22.515 }, 00:18:22.515 { 00:18:22.515 "name": "BaseBdev4", 00:18:22.515 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:22.515 "is_configured": false, 00:18:22.515 "data_offset": 0, 00:18:22.515 "data_size": 0 00:18:22.515 } 00:18:22.515 ] 00:18:22.515 }' 00:18:22.515 00:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:22.515 00:13:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:23.084 00:13:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete 
Existed_Raid 00:18:23.343 [2024-07-16 00:13:10.158949] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:23.343 [2024-07-16 00:13:10.158982] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x139daa0 name Existed_Raid, state configuring 00:18:23.343 00:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:23.602 [2024-07-16 00:13:10.411638] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:23.602 [2024-07-16 00:13:10.411668] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:23.602 [2024-07-16 00:13:10.411677] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:23.602 [2024-07-16 00:13:10.411689] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:23.602 [2024-07-16 00:13:10.411698] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:23.602 [2024-07-16 00:13:10.411709] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:23.602 [2024-07-16 00:13:10.411718] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:23.602 [2024-07-16 00:13:10.411729] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:23.602 00:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:23.861 [2024-07-16 00:13:10.601934] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:23.861 BaseBdev1 00:18:23.861 00:13:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:23.861 00:13:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:23.861 00:13:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:23.861 00:13:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:23.861 00:13:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:23.861 00:13:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:23.861 00:13:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:23.861 00:13:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:24.120 [ 00:18:24.120 { 00:18:24.120 "name": "BaseBdev1", 00:18:24.120 "aliases": [ 00:18:24.120 "759be746-882d-4510-b7b8-76315a618979" 00:18:24.120 ], 00:18:24.120 "product_name": "Malloc disk", 00:18:24.120 "block_size": 512, 00:18:24.120 "num_blocks": 65536, 00:18:24.120 "uuid": "759be746-882d-4510-b7b8-76315a618979", 00:18:24.120 "assigned_rate_limits": { 00:18:24.120 "rw_ios_per_sec": 0, 00:18:24.120 "rw_mbytes_per_sec": 0, 00:18:24.120 "r_mbytes_per_sec": 0, 00:18:24.120 "w_mbytes_per_sec": 0 00:18:24.120 }, 00:18:24.120 "claimed": true, 00:18:24.120 "claim_type": "exclusive_write", 00:18:24.120 "zoned": false, 00:18:24.120 "supported_io_types": { 00:18:24.120 "read": true, 00:18:24.120 "write": true, 00:18:24.120 "unmap": true, 00:18:24.120 "flush": true, 00:18:24.120 "reset": true, 00:18:24.120 "nvme_admin": false, 00:18:24.120 "nvme_io": false, 00:18:24.120 "nvme_io_md": 
false, 00:18:24.120 "write_zeroes": true, 00:18:24.120 "zcopy": true, 00:18:24.120 "get_zone_info": false, 00:18:24.120 "zone_management": false, 00:18:24.120 "zone_append": false, 00:18:24.120 "compare": false, 00:18:24.120 "compare_and_write": false, 00:18:24.120 "abort": true, 00:18:24.120 "seek_hole": false, 00:18:24.120 "seek_data": false, 00:18:24.120 "copy": true, 00:18:24.120 "nvme_iov_md": false 00:18:24.120 }, 00:18:24.120 "memory_domains": [ 00:18:24.120 { 00:18:24.120 "dma_device_id": "system", 00:18:24.120 "dma_device_type": 1 00:18:24.120 }, 00:18:24.120 { 00:18:24.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:24.120 "dma_device_type": 2 00:18:24.120 } 00:18:24.120 ], 00:18:24.120 "driver_specific": {} 00:18:24.120 } 00:18:24.120 ] 00:18:24.120 00:13:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:24.120 00:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:24.120 00:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:24.120 00:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:24.120 00:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:24.120 00:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:24.120 00:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:24.120 00:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:24.120 00:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:24.120 00:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:24.120 00:13:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:24.120 00:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:24.120 00:13:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:24.379 00:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:24.379 "name": "Existed_Raid", 00:18:24.379 "uuid": "1a9ef4c8-beec-45d9-b4f8-fc284e7502dd", 00:18:24.379 "strip_size_kb": 64, 00:18:24.379 "state": "configuring", 00:18:24.379 "raid_level": "raid0", 00:18:24.379 "superblock": true, 00:18:24.379 "num_base_bdevs": 4, 00:18:24.379 "num_base_bdevs_discovered": 1, 00:18:24.379 "num_base_bdevs_operational": 4, 00:18:24.379 "base_bdevs_list": [ 00:18:24.379 { 00:18:24.379 "name": "BaseBdev1", 00:18:24.379 "uuid": "759be746-882d-4510-b7b8-76315a618979", 00:18:24.379 "is_configured": true, 00:18:24.379 "data_offset": 2048, 00:18:24.379 "data_size": 63488 00:18:24.379 }, 00:18:24.379 { 00:18:24.379 "name": "BaseBdev2", 00:18:24.379 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.379 "is_configured": false, 00:18:24.379 "data_offset": 0, 00:18:24.379 "data_size": 0 00:18:24.379 }, 00:18:24.379 { 00:18:24.379 "name": "BaseBdev3", 00:18:24.379 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.379 "is_configured": false, 00:18:24.379 "data_offset": 0, 00:18:24.379 "data_size": 0 00:18:24.379 }, 00:18:24.379 { 00:18:24.379 "name": "BaseBdev4", 00:18:24.379 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.379 "is_configured": false, 00:18:24.379 "data_offset": 0, 00:18:24.379 "data_size": 0 00:18:24.379 } 00:18:24.379 ] 00:18:24.379 }' 00:18:24.379 00:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:24.379 00:13:11 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:24.946 00:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:25.205 [2024-07-16 00:13:11.973550] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:25.205 [2024-07-16 00:13:11.973586] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x139d310 name Existed_Raid, state configuring 00:18:25.205 00:13:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:25.205 [2024-07-16 00:13:12.138041] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:25.205 [2024-07-16 00:13:12.139566] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:25.205 [2024-07-16 00:13:12.139599] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:25.205 [2024-07-16 00:13:12.139609] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:25.205 [2024-07-16 00:13:12.139621] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:25.205 [2024-07-16 00:13:12.139630] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:25.205 [2024-07-16 00:13:12.139641] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:25.464 00:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:25.464 00:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:25.464 00:13:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:25.464 00:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:25.464 00:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:25.464 00:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:25.464 00:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:25.464 00:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:25.464 00:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:25.464 00:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:25.464 00:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:25.464 00:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:25.465 00:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:25.465 00:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:25.724 00:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:25.724 "name": "Existed_Raid", 00:18:25.724 "uuid": "dced88a7-4c83-4a36-b5a7-fcec868fa5f6", 00:18:25.724 "strip_size_kb": 64, 00:18:25.724 "state": "configuring", 00:18:25.724 "raid_level": "raid0", 00:18:25.724 "superblock": true, 00:18:25.724 "num_base_bdevs": 4, 00:18:25.724 "num_base_bdevs_discovered": 1, 00:18:25.724 "num_base_bdevs_operational": 4, 00:18:25.724 
"base_bdevs_list": [ 00:18:25.724 { 00:18:25.724 "name": "BaseBdev1", 00:18:25.724 "uuid": "759be746-882d-4510-b7b8-76315a618979", 00:18:25.724 "is_configured": true, 00:18:25.724 "data_offset": 2048, 00:18:25.724 "data_size": 63488 00:18:25.724 }, 00:18:25.724 { 00:18:25.724 "name": "BaseBdev2", 00:18:25.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:25.724 "is_configured": false, 00:18:25.724 "data_offset": 0, 00:18:25.724 "data_size": 0 00:18:25.724 }, 00:18:25.724 { 00:18:25.724 "name": "BaseBdev3", 00:18:25.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:25.724 "is_configured": false, 00:18:25.724 "data_offset": 0, 00:18:25.724 "data_size": 0 00:18:25.724 }, 00:18:25.724 { 00:18:25.724 "name": "BaseBdev4", 00:18:25.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:25.724 "is_configured": false, 00:18:25.724 "data_offset": 0, 00:18:25.724 "data_size": 0 00:18:25.724 } 00:18:25.724 ] 00:18:25.724 }' 00:18:25.724 00:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:25.724 00:13:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:26.291 00:13:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:26.291 [2024-07-16 00:13:13.103940] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:26.291 BaseBdev2 00:18:26.291 00:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:26.291 00:13:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:26.291 00:13:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:26.291 00:13:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:26.291 
00:13:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:26.291 00:13:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:26.291 00:13:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:26.549 00:13:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:26.549 [ 00:18:26.549 { 00:18:26.549 "name": "BaseBdev2", 00:18:26.549 "aliases": [ 00:18:26.549 "26336dd9-2b39-43d2-97e0-e57d4cc1abf0" 00:18:26.549 ], 00:18:26.550 "product_name": "Malloc disk", 00:18:26.550 "block_size": 512, 00:18:26.550 "num_blocks": 65536, 00:18:26.550 "uuid": "26336dd9-2b39-43d2-97e0-e57d4cc1abf0", 00:18:26.550 "assigned_rate_limits": { 00:18:26.550 "rw_ios_per_sec": 0, 00:18:26.550 "rw_mbytes_per_sec": 0, 00:18:26.550 "r_mbytes_per_sec": 0, 00:18:26.550 "w_mbytes_per_sec": 0 00:18:26.550 }, 00:18:26.550 "claimed": true, 00:18:26.550 "claim_type": "exclusive_write", 00:18:26.550 "zoned": false, 00:18:26.550 "supported_io_types": { 00:18:26.550 "read": true, 00:18:26.550 "write": true, 00:18:26.550 "unmap": true, 00:18:26.550 "flush": true, 00:18:26.550 "reset": true, 00:18:26.550 "nvme_admin": false, 00:18:26.550 "nvme_io": false, 00:18:26.550 "nvme_io_md": false, 00:18:26.550 "write_zeroes": true, 00:18:26.550 "zcopy": true, 00:18:26.550 "get_zone_info": false, 00:18:26.550 "zone_management": false, 00:18:26.550 "zone_append": false, 00:18:26.550 "compare": false, 00:18:26.550 "compare_and_write": false, 00:18:26.550 "abort": true, 00:18:26.550 "seek_hole": false, 00:18:26.550 "seek_data": false, 00:18:26.550 "copy": true, 00:18:26.550 "nvme_iov_md": false 00:18:26.550 }, 00:18:26.550 
"memory_domains": [ 00:18:26.550 { 00:18:26.550 "dma_device_id": "system", 00:18:26.550 "dma_device_type": 1 00:18:26.550 }, 00:18:26.550 { 00:18:26.550 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:26.550 "dma_device_type": 2 00:18:26.550 } 00:18:26.550 ], 00:18:26.550 "driver_specific": {} 00:18:26.550 } 00:18:26.550 ] 00:18:26.550 00:13:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:26.550 00:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:26.550 00:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:26.550 00:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:26.550 00:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:26.550 00:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:26.550 00:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:26.550 00:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:26.550 00:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:26.550 00:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:26.550 00:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:26.550 00:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:26.550 00:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:26.550 00:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.550 00:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:26.808 00:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:26.808 "name": "Existed_Raid", 00:18:26.808 "uuid": "dced88a7-4c83-4a36-b5a7-fcec868fa5f6", 00:18:26.808 "strip_size_kb": 64, 00:18:26.808 "state": "configuring", 00:18:26.808 "raid_level": "raid0", 00:18:26.808 "superblock": true, 00:18:26.808 "num_base_bdevs": 4, 00:18:26.808 "num_base_bdevs_discovered": 2, 00:18:26.808 "num_base_bdevs_operational": 4, 00:18:26.808 "base_bdevs_list": [ 00:18:26.808 { 00:18:26.808 "name": "BaseBdev1", 00:18:26.808 "uuid": "759be746-882d-4510-b7b8-76315a618979", 00:18:26.808 "is_configured": true, 00:18:26.808 "data_offset": 2048, 00:18:26.808 "data_size": 63488 00:18:26.808 }, 00:18:26.808 { 00:18:26.808 "name": "BaseBdev2", 00:18:26.808 "uuid": "26336dd9-2b39-43d2-97e0-e57d4cc1abf0", 00:18:26.808 "is_configured": true, 00:18:26.808 "data_offset": 2048, 00:18:26.808 "data_size": 63488 00:18:26.808 }, 00:18:26.808 { 00:18:26.808 "name": "BaseBdev3", 00:18:26.808 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.808 "is_configured": false, 00:18:26.808 "data_offset": 0, 00:18:26.808 "data_size": 0 00:18:26.808 }, 00:18:26.808 { 00:18:26.808 "name": "BaseBdev4", 00:18:26.808 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.808 "is_configured": false, 00:18:26.808 "data_offset": 0, 00:18:26.808 "data_size": 0 00:18:26.808 } 00:18:26.808 ] 00:18:26.808 }' 00:18:26.808 00:13:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:26.808 00:13:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:27.743 00:13:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:27.743 [2024-07-16 00:13:14.575321] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:27.743 BaseBdev3 00:18:27.743 00:13:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:27.743 00:13:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:27.743 00:13:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:27.743 00:13:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:27.743 00:13:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:27.743 00:13:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:27.743 00:13:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:28.001 00:13:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:28.259 [ 00:18:28.259 { 00:18:28.259 "name": "BaseBdev3", 00:18:28.259 "aliases": [ 00:18:28.259 "75eccbb2-9493-4ff1-b6fd-7448e4fccb27" 00:18:28.259 ], 00:18:28.259 "product_name": "Malloc disk", 00:18:28.259 "block_size": 512, 00:18:28.259 "num_blocks": 65536, 00:18:28.259 "uuid": "75eccbb2-9493-4ff1-b6fd-7448e4fccb27", 00:18:28.259 "assigned_rate_limits": { 00:18:28.259 "rw_ios_per_sec": 0, 00:18:28.259 "rw_mbytes_per_sec": 0, 00:18:28.259 "r_mbytes_per_sec": 0, 00:18:28.259 "w_mbytes_per_sec": 0 00:18:28.259 }, 00:18:28.259 "claimed": true, 00:18:28.259 "claim_type": "exclusive_write", 00:18:28.259 "zoned": false, 00:18:28.259 "supported_io_types": { 
00:18:28.259 "read": true, 00:18:28.259 "write": true, 00:18:28.259 "unmap": true, 00:18:28.259 "flush": true, 00:18:28.259 "reset": true, 00:18:28.259 "nvme_admin": false, 00:18:28.259 "nvme_io": false, 00:18:28.260 "nvme_io_md": false, 00:18:28.260 "write_zeroes": true, 00:18:28.260 "zcopy": true, 00:18:28.260 "get_zone_info": false, 00:18:28.260 "zone_management": false, 00:18:28.260 "zone_append": false, 00:18:28.260 "compare": false, 00:18:28.260 "compare_and_write": false, 00:18:28.260 "abort": true, 00:18:28.260 "seek_hole": false, 00:18:28.260 "seek_data": false, 00:18:28.260 "copy": true, 00:18:28.260 "nvme_iov_md": false 00:18:28.260 }, 00:18:28.260 "memory_domains": [ 00:18:28.260 { 00:18:28.260 "dma_device_id": "system", 00:18:28.260 "dma_device_type": 1 00:18:28.260 }, 00:18:28.260 { 00:18:28.260 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.260 "dma_device_type": 2 00:18:28.260 } 00:18:28.260 ], 00:18:28.260 "driver_specific": {} 00:18:28.260 } 00:18:28.260 ] 00:18:28.260 00:13:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:28.260 00:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:28.260 00:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:28.260 00:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:28.260 00:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:28.260 00:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:28.260 00:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:28.260 00:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:28.260 00:13:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:28.260 00:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:28.260 00:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:28.260 00:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:28.260 00:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:28.260 00:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:28.260 00:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:28.519 00:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:28.519 "name": "Existed_Raid", 00:18:28.519 "uuid": "dced88a7-4c83-4a36-b5a7-fcec868fa5f6", 00:18:28.519 "strip_size_kb": 64, 00:18:28.519 "state": "configuring", 00:18:28.519 "raid_level": "raid0", 00:18:28.519 "superblock": true, 00:18:28.519 "num_base_bdevs": 4, 00:18:28.519 "num_base_bdevs_discovered": 3, 00:18:28.519 "num_base_bdevs_operational": 4, 00:18:28.519 "base_bdevs_list": [ 00:18:28.519 { 00:18:28.519 "name": "BaseBdev1", 00:18:28.519 "uuid": "759be746-882d-4510-b7b8-76315a618979", 00:18:28.519 "is_configured": true, 00:18:28.519 "data_offset": 2048, 00:18:28.519 "data_size": 63488 00:18:28.519 }, 00:18:28.519 { 00:18:28.519 "name": "BaseBdev2", 00:18:28.519 "uuid": "26336dd9-2b39-43d2-97e0-e57d4cc1abf0", 00:18:28.519 "is_configured": true, 00:18:28.519 "data_offset": 2048, 00:18:28.519 "data_size": 63488 00:18:28.519 }, 00:18:28.519 { 00:18:28.519 "name": "BaseBdev3", 00:18:28.519 "uuid": "75eccbb2-9493-4ff1-b6fd-7448e4fccb27", 00:18:28.519 "is_configured": true, 00:18:28.519 "data_offset": 2048, 00:18:28.519 
"data_size": 63488 00:18:28.519 }, 00:18:28.519 { 00:18:28.519 "name": "BaseBdev4", 00:18:28.519 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:28.519 "is_configured": false, 00:18:28.519 "data_offset": 0, 00:18:28.519 "data_size": 0 00:18:28.519 } 00:18:28.519 ] 00:18:28.519 }' 00:18:28.519 00:13:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:28.519 00:13:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:29.086 00:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:29.345 [2024-07-16 00:13:16.247075] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:29.345 [2024-07-16 00:13:16.247240] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x139e350 00:18:29.345 [2024-07-16 00:13:16.247255] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:29.345 [2024-07-16 00:13:16.247427] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x139e020 00:18:29.345 [2024-07-16 00:13:16.247545] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x139e350 00:18:29.345 [2024-07-16 00:13:16.247555] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x139e350 00:18:29.345 [2024-07-16 00:13:16.247645] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:29.345 BaseBdev4 00:18:29.345 00:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:29.345 00:13:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:29.345 00:13:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:29.345 00:13:16 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:29.345 00:13:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:29.345 00:13:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:29.345 00:13:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:29.603 00:13:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:29.862 [ 00:18:29.862 { 00:18:29.862 "name": "BaseBdev4", 00:18:29.862 "aliases": [ 00:18:29.862 "6ddd3836-5727-4a01-a26f-b23597f064b1" 00:18:29.862 ], 00:18:29.862 "product_name": "Malloc disk", 00:18:29.862 "block_size": 512, 00:18:29.862 "num_blocks": 65536, 00:18:29.862 "uuid": "6ddd3836-5727-4a01-a26f-b23597f064b1", 00:18:29.862 "assigned_rate_limits": { 00:18:29.862 "rw_ios_per_sec": 0, 00:18:29.862 "rw_mbytes_per_sec": 0, 00:18:29.862 "r_mbytes_per_sec": 0, 00:18:29.862 "w_mbytes_per_sec": 0 00:18:29.862 }, 00:18:29.862 "claimed": true, 00:18:29.862 "claim_type": "exclusive_write", 00:18:29.862 "zoned": false, 00:18:29.862 "supported_io_types": { 00:18:29.862 "read": true, 00:18:29.862 "write": true, 00:18:29.862 "unmap": true, 00:18:29.862 "flush": true, 00:18:29.862 "reset": true, 00:18:29.862 "nvme_admin": false, 00:18:29.862 "nvme_io": false, 00:18:29.862 "nvme_io_md": false, 00:18:29.862 "write_zeroes": true, 00:18:29.862 "zcopy": true, 00:18:29.862 "get_zone_info": false, 00:18:29.862 "zone_management": false, 00:18:29.862 "zone_append": false, 00:18:29.862 "compare": false, 00:18:29.862 "compare_and_write": false, 00:18:29.862 "abort": true, 00:18:29.862 "seek_hole": false, 00:18:29.862 "seek_data": false, 
00:18:29.862 "copy": true, 00:18:29.862 "nvme_iov_md": false 00:18:29.862 }, 00:18:29.862 "memory_domains": [ 00:18:29.862 { 00:18:29.862 "dma_device_id": "system", 00:18:29.862 "dma_device_type": 1 00:18:29.862 }, 00:18:29.862 { 00:18:29.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:29.862 "dma_device_type": 2 00:18:29.862 } 00:18:29.862 ], 00:18:29.862 "driver_specific": {} 00:18:29.862 } 00:18:29.862 ] 00:18:29.862 00:13:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:29.862 00:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:29.862 00:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:29.862 00:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:29.862 00:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:29.862 00:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:29.862 00:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:29.862 00:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:29.862 00:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:29.863 00:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:29.863 00:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:29.863 00:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:29.863 00:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:29.863 00:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.863 00:13:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:30.121 00:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:30.121 "name": "Existed_Raid", 00:18:30.121 "uuid": "dced88a7-4c83-4a36-b5a7-fcec868fa5f6", 00:18:30.121 "strip_size_kb": 64, 00:18:30.121 "state": "online", 00:18:30.121 "raid_level": "raid0", 00:18:30.121 "superblock": true, 00:18:30.121 "num_base_bdevs": 4, 00:18:30.121 "num_base_bdevs_discovered": 4, 00:18:30.121 "num_base_bdevs_operational": 4, 00:18:30.121 "base_bdevs_list": [ 00:18:30.121 { 00:18:30.121 "name": "BaseBdev1", 00:18:30.121 "uuid": "759be746-882d-4510-b7b8-76315a618979", 00:18:30.121 "is_configured": true, 00:18:30.121 "data_offset": 2048, 00:18:30.121 "data_size": 63488 00:18:30.121 }, 00:18:30.121 { 00:18:30.121 "name": "BaseBdev2", 00:18:30.121 "uuid": "26336dd9-2b39-43d2-97e0-e57d4cc1abf0", 00:18:30.121 "is_configured": true, 00:18:30.121 "data_offset": 2048, 00:18:30.121 "data_size": 63488 00:18:30.121 }, 00:18:30.121 { 00:18:30.121 "name": "BaseBdev3", 00:18:30.121 "uuid": "75eccbb2-9493-4ff1-b6fd-7448e4fccb27", 00:18:30.121 "is_configured": true, 00:18:30.121 "data_offset": 2048, 00:18:30.121 "data_size": 63488 00:18:30.121 }, 00:18:30.121 { 00:18:30.121 "name": "BaseBdev4", 00:18:30.121 "uuid": "6ddd3836-5727-4a01-a26f-b23597f064b1", 00:18:30.121 "is_configured": true, 00:18:30.121 "data_offset": 2048, 00:18:30.121 "data_size": 63488 00:18:30.121 } 00:18:30.121 ] 00:18:30.121 }' 00:18:30.121 00:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:30.121 00:13:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:30.686 00:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # 
verify_raid_bdev_properties Existed_Raid 00:18:30.687 00:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:30.687 00:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:30.687 00:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:30.687 00:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:30.687 00:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:30.687 00:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:30.687 00:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:30.945 [2024-07-16 00:13:17.763438] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:30.945 00:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:30.945 "name": "Existed_Raid", 00:18:30.945 "aliases": [ 00:18:30.945 "dced88a7-4c83-4a36-b5a7-fcec868fa5f6" 00:18:30.945 ], 00:18:30.945 "product_name": "Raid Volume", 00:18:30.945 "block_size": 512, 00:18:30.945 "num_blocks": 253952, 00:18:30.945 "uuid": "dced88a7-4c83-4a36-b5a7-fcec868fa5f6", 00:18:30.945 "assigned_rate_limits": { 00:18:30.945 "rw_ios_per_sec": 0, 00:18:30.945 "rw_mbytes_per_sec": 0, 00:18:30.945 "r_mbytes_per_sec": 0, 00:18:30.946 "w_mbytes_per_sec": 0 00:18:30.946 }, 00:18:30.946 "claimed": false, 00:18:30.946 "zoned": false, 00:18:30.946 "supported_io_types": { 00:18:30.946 "read": true, 00:18:30.946 "write": true, 00:18:30.946 "unmap": true, 00:18:30.946 "flush": true, 00:18:30.946 "reset": true, 00:18:30.946 "nvme_admin": false, 00:18:30.946 "nvme_io": false, 00:18:30.946 "nvme_io_md": false, 00:18:30.946 
"write_zeroes": true, 00:18:30.946 "zcopy": false, 00:18:30.946 "get_zone_info": false, 00:18:30.946 "zone_management": false, 00:18:30.946 "zone_append": false, 00:18:30.946 "compare": false, 00:18:30.946 "compare_and_write": false, 00:18:30.946 "abort": false, 00:18:30.946 "seek_hole": false, 00:18:30.946 "seek_data": false, 00:18:30.946 "copy": false, 00:18:30.946 "nvme_iov_md": false 00:18:30.946 }, 00:18:30.946 "memory_domains": [ 00:18:30.946 { 00:18:30.946 "dma_device_id": "system", 00:18:30.946 "dma_device_type": 1 00:18:30.946 }, 00:18:30.946 { 00:18:30.946 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.946 "dma_device_type": 2 00:18:30.946 }, 00:18:30.946 { 00:18:30.946 "dma_device_id": "system", 00:18:30.946 "dma_device_type": 1 00:18:30.946 }, 00:18:30.946 { 00:18:30.946 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.946 "dma_device_type": 2 00:18:30.946 }, 00:18:30.946 { 00:18:30.946 "dma_device_id": "system", 00:18:30.946 "dma_device_type": 1 00:18:30.946 }, 00:18:30.946 { 00:18:30.946 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.946 "dma_device_type": 2 00:18:30.946 }, 00:18:30.946 { 00:18:30.946 "dma_device_id": "system", 00:18:30.946 "dma_device_type": 1 00:18:30.946 }, 00:18:30.946 { 00:18:30.946 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.946 "dma_device_type": 2 00:18:30.946 } 00:18:30.946 ], 00:18:30.946 "driver_specific": { 00:18:30.946 "raid": { 00:18:30.946 "uuid": "dced88a7-4c83-4a36-b5a7-fcec868fa5f6", 00:18:30.946 "strip_size_kb": 64, 00:18:30.946 "state": "online", 00:18:30.946 "raid_level": "raid0", 00:18:30.946 "superblock": true, 00:18:30.946 "num_base_bdevs": 4, 00:18:30.946 "num_base_bdevs_discovered": 4, 00:18:30.946 "num_base_bdevs_operational": 4, 00:18:30.946 "base_bdevs_list": [ 00:18:30.946 { 00:18:30.946 "name": "BaseBdev1", 00:18:30.946 "uuid": "759be746-882d-4510-b7b8-76315a618979", 00:18:30.946 "is_configured": true, 00:18:30.946 "data_offset": 2048, 00:18:30.946 "data_size": 63488 00:18:30.946 }, 
00:18:30.946 { 00:18:30.946 "name": "BaseBdev2", 00:18:30.946 "uuid": "26336dd9-2b39-43d2-97e0-e57d4cc1abf0", 00:18:30.946 "is_configured": true, 00:18:30.946 "data_offset": 2048, 00:18:30.946 "data_size": 63488 00:18:30.946 }, 00:18:30.946 { 00:18:30.946 "name": "BaseBdev3", 00:18:30.946 "uuid": "75eccbb2-9493-4ff1-b6fd-7448e4fccb27", 00:18:30.946 "is_configured": true, 00:18:30.946 "data_offset": 2048, 00:18:30.946 "data_size": 63488 00:18:30.946 }, 00:18:30.946 { 00:18:30.946 "name": "BaseBdev4", 00:18:30.946 "uuid": "6ddd3836-5727-4a01-a26f-b23597f064b1", 00:18:30.946 "is_configured": true, 00:18:30.946 "data_offset": 2048, 00:18:30.946 "data_size": 63488 00:18:30.946 } 00:18:30.946 ] 00:18:30.946 } 00:18:30.946 } 00:18:30.946 }' 00:18:30.946 00:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:30.946 00:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:30.946 BaseBdev2 00:18:30.946 BaseBdev3 00:18:30.946 BaseBdev4' 00:18:30.946 00:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:30.946 00:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:30.946 00:13:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:31.204 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:31.204 "name": "BaseBdev1", 00:18:31.204 "aliases": [ 00:18:31.204 "759be746-882d-4510-b7b8-76315a618979" 00:18:31.204 ], 00:18:31.204 "product_name": "Malloc disk", 00:18:31.204 "block_size": 512, 00:18:31.204 "num_blocks": 65536, 00:18:31.204 "uuid": "759be746-882d-4510-b7b8-76315a618979", 00:18:31.204 "assigned_rate_limits": { 00:18:31.204 
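The jq filter the test uses here, `'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'`, can be mirrored in plain Python to make the selection logic explicit. The list below is a trimmed stand-in for the `bdev_get_bdevs -b Existed_Raid` output shown above, not a live RPC call:

```python
# Mirrors: jq -r '.driver_specific.raid.base_bdevs_list[]
#                 | select(.is_configured == true).name'
# Trimmed stand-in for the bdev_get_bdevs output seen in this log.
base_bdevs_list = [
    {"name": "BaseBdev1", "is_configured": True},
    {"name": "BaseBdev2", "is_configured": True},
    {"name": "BaseBdev3", "is_configured": True},
    {"name": "BaseBdev4", "is_configured": True},
]

# Keep only configured members and emit their names, one per line,
# just as the test stores them in $base_bdev_names.
configured = [b["name"] for b in base_bdevs_list if b["is_configured"]]
print("\n".join(configured))
```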
"rw_ios_per_sec": 0, 00:18:31.204 "rw_mbytes_per_sec": 0, 00:18:31.204 "r_mbytes_per_sec": 0, 00:18:31.204 "w_mbytes_per_sec": 0 00:18:31.204 }, 00:18:31.204 "claimed": true, 00:18:31.204 "claim_type": "exclusive_write", 00:18:31.204 "zoned": false, 00:18:31.204 "supported_io_types": { 00:18:31.204 "read": true, 00:18:31.204 "write": true, 00:18:31.204 "unmap": true, 00:18:31.204 "flush": true, 00:18:31.204 "reset": true, 00:18:31.204 "nvme_admin": false, 00:18:31.204 "nvme_io": false, 00:18:31.204 "nvme_io_md": false, 00:18:31.204 "write_zeroes": true, 00:18:31.204 "zcopy": true, 00:18:31.204 "get_zone_info": false, 00:18:31.204 "zone_management": false, 00:18:31.204 "zone_append": false, 00:18:31.204 "compare": false, 00:18:31.204 "compare_and_write": false, 00:18:31.204 "abort": true, 00:18:31.204 "seek_hole": false, 00:18:31.204 "seek_data": false, 00:18:31.204 "copy": true, 00:18:31.204 "nvme_iov_md": false 00:18:31.204 }, 00:18:31.204 "memory_domains": [ 00:18:31.204 { 00:18:31.204 "dma_device_id": "system", 00:18:31.204 "dma_device_type": 1 00:18:31.204 }, 00:18:31.204 { 00:18:31.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:31.204 "dma_device_type": 2 00:18:31.204 } 00:18:31.204 ], 00:18:31.204 "driver_specific": {} 00:18:31.204 }' 00:18:31.204 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:31.204 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:31.204 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:31.204 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:31.462 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:31.462 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:31.462 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:18:31.462 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:31.462 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:31.462 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:31.462 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:31.720 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:31.720 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:31.720 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:31.720 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:31.977 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:31.977 "name": "BaseBdev2", 00:18:31.977 "aliases": [ 00:18:31.977 "26336dd9-2b39-43d2-97e0-e57d4cc1abf0" 00:18:31.977 ], 00:18:31.977 "product_name": "Malloc disk", 00:18:31.977 "block_size": 512, 00:18:31.977 "num_blocks": 65536, 00:18:31.977 "uuid": "26336dd9-2b39-43d2-97e0-e57d4cc1abf0", 00:18:31.977 "assigned_rate_limits": { 00:18:31.977 "rw_ios_per_sec": 0, 00:18:31.977 "rw_mbytes_per_sec": 0, 00:18:31.977 "r_mbytes_per_sec": 0, 00:18:31.977 "w_mbytes_per_sec": 0 00:18:31.977 }, 00:18:31.977 "claimed": true, 00:18:31.977 "claim_type": "exclusive_write", 00:18:31.977 "zoned": false, 00:18:31.977 "supported_io_types": { 00:18:31.977 "read": true, 00:18:31.977 "write": true, 00:18:31.977 "unmap": true, 00:18:31.977 "flush": true, 00:18:31.977 "reset": true, 00:18:31.977 "nvme_admin": false, 00:18:31.977 "nvme_io": false, 00:18:31.977 "nvme_io_md": false, 00:18:31.977 "write_zeroes": true, 
00:18:31.977 "zcopy": true, 00:18:31.977 "get_zone_info": false, 00:18:31.977 "zone_management": false, 00:18:31.977 "zone_append": false, 00:18:31.977 "compare": false, 00:18:31.977 "compare_and_write": false, 00:18:31.977 "abort": true, 00:18:31.977 "seek_hole": false, 00:18:31.977 "seek_data": false, 00:18:31.977 "copy": true, 00:18:31.977 "nvme_iov_md": false 00:18:31.977 }, 00:18:31.977 "memory_domains": [ 00:18:31.977 { 00:18:31.977 "dma_device_id": "system", 00:18:31.977 "dma_device_type": 1 00:18:31.977 }, 00:18:31.977 { 00:18:31.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:31.977 "dma_device_type": 2 00:18:31.977 } 00:18:31.977 ], 00:18:31.977 "driver_specific": {} 00:18:31.977 }' 00:18:31.977 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:31.977 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:31.977 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:31.977 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:31.977 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:31.977 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:31.977 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:31.977 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:32.234 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:32.234 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:32.234 00:13:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:32.234 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:32.234 00:13:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:32.234 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:32.234 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:32.491 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:32.491 "name": "BaseBdev3", 00:18:32.491 "aliases": [ 00:18:32.491 "75eccbb2-9493-4ff1-b6fd-7448e4fccb27" 00:18:32.491 ], 00:18:32.491 "product_name": "Malloc disk", 00:18:32.491 "block_size": 512, 00:18:32.491 "num_blocks": 65536, 00:18:32.491 "uuid": "75eccbb2-9493-4ff1-b6fd-7448e4fccb27", 00:18:32.491 "assigned_rate_limits": { 00:18:32.491 "rw_ios_per_sec": 0, 00:18:32.491 "rw_mbytes_per_sec": 0, 00:18:32.491 "r_mbytes_per_sec": 0, 00:18:32.491 "w_mbytes_per_sec": 0 00:18:32.491 }, 00:18:32.491 "claimed": true, 00:18:32.491 "claim_type": "exclusive_write", 00:18:32.491 "zoned": false, 00:18:32.491 "supported_io_types": { 00:18:32.491 "read": true, 00:18:32.491 "write": true, 00:18:32.491 "unmap": true, 00:18:32.491 "flush": true, 00:18:32.491 "reset": true, 00:18:32.491 "nvme_admin": false, 00:18:32.491 "nvme_io": false, 00:18:32.491 "nvme_io_md": false, 00:18:32.491 "write_zeroes": true, 00:18:32.491 "zcopy": true, 00:18:32.491 "get_zone_info": false, 00:18:32.491 "zone_management": false, 00:18:32.491 "zone_append": false, 00:18:32.491 "compare": false, 00:18:32.491 "compare_and_write": false, 00:18:32.491 "abort": true, 00:18:32.491 "seek_hole": false, 00:18:32.491 "seek_data": false, 00:18:32.491 "copy": true, 00:18:32.491 "nvme_iov_md": false 00:18:32.491 }, 00:18:32.491 "memory_domains": [ 00:18:32.491 { 00:18:32.491 "dma_device_id": "system", 00:18:32.491 "dma_device_type": 1 00:18:32.491 }, 00:18:32.491 { 00:18:32.491 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:18:32.491 "dma_device_type": 2 00:18:32.491 } 00:18:32.491 ], 00:18:32.491 "driver_specific": {} 00:18:32.491 }' 00:18:32.491 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:32.491 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:32.491 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:32.491 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:32.491 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:32.748 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:32.748 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:32.748 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:32.748 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:32.748 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:32.748 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:32.748 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:32.748 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:32.748 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:32.749 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:33.053 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:33.053 "name": "BaseBdev4", 00:18:33.053 
"aliases": [ 00:18:33.053 "6ddd3836-5727-4a01-a26f-b23597f064b1" 00:18:33.053 ], 00:18:33.053 "product_name": "Malloc disk", 00:18:33.053 "block_size": 512, 00:18:33.053 "num_blocks": 65536, 00:18:33.053 "uuid": "6ddd3836-5727-4a01-a26f-b23597f064b1", 00:18:33.053 "assigned_rate_limits": { 00:18:33.053 "rw_ios_per_sec": 0, 00:18:33.053 "rw_mbytes_per_sec": 0, 00:18:33.053 "r_mbytes_per_sec": 0, 00:18:33.053 "w_mbytes_per_sec": 0 00:18:33.053 }, 00:18:33.053 "claimed": true, 00:18:33.053 "claim_type": "exclusive_write", 00:18:33.053 "zoned": false, 00:18:33.053 "supported_io_types": { 00:18:33.053 "read": true, 00:18:33.053 "write": true, 00:18:33.053 "unmap": true, 00:18:33.053 "flush": true, 00:18:33.053 "reset": true, 00:18:33.053 "nvme_admin": false, 00:18:33.053 "nvme_io": false, 00:18:33.053 "nvme_io_md": false, 00:18:33.053 "write_zeroes": true, 00:18:33.053 "zcopy": true, 00:18:33.053 "get_zone_info": false, 00:18:33.053 "zone_management": false, 00:18:33.053 "zone_append": false, 00:18:33.053 "compare": false, 00:18:33.053 "compare_and_write": false, 00:18:33.053 "abort": true, 00:18:33.053 "seek_hole": false, 00:18:33.053 "seek_data": false, 00:18:33.053 "copy": true, 00:18:33.053 "nvme_iov_md": false 00:18:33.053 }, 00:18:33.053 "memory_domains": [ 00:18:33.053 { 00:18:33.053 "dma_device_id": "system", 00:18:33.053 "dma_device_type": 1 00:18:33.053 }, 00:18:33.053 { 00:18:33.053 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.053 "dma_device_type": 2 00:18:33.053 } 00:18:33.053 ], 00:18:33.053 "driver_specific": {} 00:18:33.053 }' 00:18:33.053 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:33.054 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:33.320 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:33.320 00:13:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:18:33.320 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:33.320 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:33.320 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:33.320 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:33.320 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:33.320 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:33.320 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:33.578 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:33.578 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:33.578 [2024-07-16 00:13:20.498483] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:33.578 [2024-07-16 00:13:20.498511] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:33.578 [2024-07-16 00:13:20.498558] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:33.578 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:33.578 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:18:33.578 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:33.578 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:18:33.578 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:33.578 00:13:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:18:33.578 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:33.578 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:33.578 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:33.578 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:33.578 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:33.578 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:33.578 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:33.578 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:33.578 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:33.578 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:33.578 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:33.836 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:33.836 "name": "Existed_Raid", 00:18:33.836 "uuid": "dced88a7-4c83-4a36-b5a7-fcec868fa5f6", 00:18:33.836 "strip_size_kb": 64, 00:18:33.836 "state": "offline", 00:18:33.836 "raid_level": "raid0", 00:18:33.836 "superblock": true, 00:18:33.836 "num_base_bdevs": 4, 00:18:33.836 "num_base_bdevs_discovered": 3, 00:18:33.836 "num_base_bdevs_operational": 3, 00:18:33.836 "base_bdevs_list": [ 
00:18:33.836 { 00:18:33.836 "name": null, 00:18:33.836 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:33.836 "is_configured": false, 00:18:33.836 "data_offset": 2048, 00:18:33.836 "data_size": 63488 00:18:33.836 }, 00:18:33.836 { 00:18:33.836 "name": "BaseBdev2", 00:18:33.836 "uuid": "26336dd9-2b39-43d2-97e0-e57d4cc1abf0", 00:18:33.836 "is_configured": true, 00:18:33.836 "data_offset": 2048, 00:18:33.836 "data_size": 63488 00:18:33.836 }, 00:18:33.836 { 00:18:33.836 "name": "BaseBdev3", 00:18:33.836 "uuid": "75eccbb2-9493-4ff1-b6fd-7448e4fccb27", 00:18:33.836 "is_configured": true, 00:18:33.836 "data_offset": 2048, 00:18:33.836 "data_size": 63488 00:18:33.836 }, 00:18:33.836 { 00:18:33.836 "name": "BaseBdev4", 00:18:33.836 "uuid": "6ddd3836-5727-4a01-a26f-b23597f064b1", 00:18:33.837 "is_configured": true, 00:18:33.837 "data_offset": 2048, 00:18:33.837 "data_size": 63488 00:18:33.837 } 00:18:33.837 ] 00:18:33.837 }' 00:18:33.837 00:13:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:33.837 00:13:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:34.772 00:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:34.772 00:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:34.772 00:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:34.772 00:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:34.772 00:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:34.772 00:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:34.772 00:13:21 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:35.030 [2024-07-16 00:13:21.871147] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:35.030 00:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:35.030 00:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:35.030 00:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.030 00:13:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:35.287 00:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:35.287 00:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:35.287 00:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:35.545 [2024-07-16 00:13:22.375323] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:35.545 00:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:35.545 00:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:35.545 00:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.545 00:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:35.803 00:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:35.803 
00:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:35.803 00:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:36.061 [2024-07-16 00:13:22.875236] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:36.061 [2024-07-16 00:13:22.875279] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x139e350 name Existed_Raid, state offline 00:18:36.061 00:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:36.061 00:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:36.061 00:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:36.061 00:13:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:36.319 00:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:36.319 00:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:36.319 00:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:36.319 00:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:36.319 00:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:36.319 00:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:36.578 BaseBdev2 00:18:36.578 00:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev 
BaseBdev2 00:18:36.578 00:13:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:36.578 00:13:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:36.578 00:13:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:36.578 00:13:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:36.578 00:13:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:36.578 00:13:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:36.836 00:13:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:37.093 [ 00:18:37.093 { 00:18:37.093 "name": "BaseBdev2", 00:18:37.093 "aliases": [ 00:18:37.093 "a3b40150-69ab-4e47-b621-26f38d291950" 00:18:37.093 ], 00:18:37.093 "product_name": "Malloc disk", 00:18:37.093 "block_size": 512, 00:18:37.093 "num_blocks": 65536, 00:18:37.093 "uuid": "a3b40150-69ab-4e47-b621-26f38d291950", 00:18:37.093 "assigned_rate_limits": { 00:18:37.093 "rw_ios_per_sec": 0, 00:18:37.093 "rw_mbytes_per_sec": 0, 00:18:37.093 "r_mbytes_per_sec": 0, 00:18:37.093 "w_mbytes_per_sec": 0 00:18:37.093 }, 00:18:37.093 "claimed": false, 00:18:37.093 "zoned": false, 00:18:37.093 "supported_io_types": { 00:18:37.093 "read": true, 00:18:37.093 "write": true, 00:18:37.093 "unmap": true, 00:18:37.093 "flush": true, 00:18:37.093 "reset": true, 00:18:37.093 "nvme_admin": false, 00:18:37.093 "nvme_io": false, 00:18:37.093 "nvme_io_md": false, 00:18:37.093 "write_zeroes": true, 00:18:37.093 "zcopy": true, 00:18:37.093 "get_zone_info": false, 00:18:37.093 
"zone_management": false, 00:18:37.093 "zone_append": false, 00:18:37.093 "compare": false, 00:18:37.093 "compare_and_write": false, 00:18:37.093 "abort": true, 00:18:37.093 "seek_hole": false, 00:18:37.093 "seek_data": false, 00:18:37.093 "copy": true, 00:18:37.093 "nvme_iov_md": false 00:18:37.093 }, 00:18:37.094 "memory_domains": [ 00:18:37.094 { 00:18:37.094 "dma_device_id": "system", 00:18:37.094 "dma_device_type": 1 00:18:37.094 }, 00:18:37.094 { 00:18:37.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.094 "dma_device_type": 2 00:18:37.094 } 00:18:37.094 ], 00:18:37.094 "driver_specific": {} 00:18:37.094 } 00:18:37.094 ] 00:18:37.094 00:13:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:37.094 00:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:37.094 00:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:37.094 00:13:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:37.352 BaseBdev3 00:18:37.352 00:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:37.352 00:13:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:37.352 00:13:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:37.352 00:13:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:37.352 00:13:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:37.352 00:13:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:37.352 00:13:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:37.610 00:13:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:37.868 [ 00:18:37.868 { 00:18:37.868 "name": "BaseBdev3", 00:18:37.868 "aliases": [ 00:18:37.868 "9928157b-a056-4fe7-b901-c3db0ca4f327" 00:18:37.868 ], 00:18:37.868 "product_name": "Malloc disk", 00:18:37.868 "block_size": 512, 00:18:37.868 "num_blocks": 65536, 00:18:37.868 "uuid": "9928157b-a056-4fe7-b901-c3db0ca4f327", 00:18:37.868 "assigned_rate_limits": { 00:18:37.868 "rw_ios_per_sec": 0, 00:18:37.868 "rw_mbytes_per_sec": 0, 00:18:37.868 "r_mbytes_per_sec": 0, 00:18:37.868 "w_mbytes_per_sec": 0 00:18:37.868 }, 00:18:37.868 "claimed": false, 00:18:37.868 "zoned": false, 00:18:37.868 "supported_io_types": { 00:18:37.868 "read": true, 00:18:37.868 "write": true, 00:18:37.868 "unmap": true, 00:18:37.868 "flush": true, 00:18:37.868 "reset": true, 00:18:37.868 "nvme_admin": false, 00:18:37.868 "nvme_io": false, 00:18:37.868 "nvme_io_md": false, 00:18:37.868 "write_zeroes": true, 00:18:37.868 "zcopy": true, 00:18:37.868 "get_zone_info": false, 00:18:37.868 "zone_management": false, 00:18:37.868 "zone_append": false, 00:18:37.868 "compare": false, 00:18:37.868 "compare_and_write": false, 00:18:37.868 "abort": true, 00:18:37.868 "seek_hole": false, 00:18:37.868 "seek_data": false, 00:18:37.868 "copy": true, 00:18:37.869 "nvme_iov_md": false 00:18:37.869 }, 00:18:37.869 "memory_domains": [ 00:18:37.869 { 00:18:37.869 "dma_device_id": "system", 00:18:37.869 "dma_device_type": 1 00:18:37.869 }, 00:18:37.869 { 00:18:37.869 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.869 "dma_device_type": 2 00:18:37.869 } 00:18:37.869 ], 00:18:37.869 "driver_specific": {} 00:18:37.869 } 00:18:37.869 ] 00:18:37.869 00:13:24 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:37.869 00:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:37.869 00:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:37.869 00:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:38.126 BaseBdev4 00:18:38.126 00:13:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:38.126 00:13:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:38.126 00:13:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:38.126 00:13:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:38.126 00:13:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:38.126 00:13:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:38.126 00:13:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:38.384 00:13:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:38.384 [ 00:18:38.384 { 00:18:38.384 "name": "BaseBdev4", 00:18:38.384 "aliases": [ 00:18:38.384 "893b72c7-553e-40e6-b142-f085e134d502" 00:18:38.384 ], 00:18:38.384 "product_name": "Malloc disk", 00:18:38.384 "block_size": 512, 00:18:38.384 "num_blocks": 65536, 00:18:38.384 "uuid": "893b72c7-553e-40e6-b142-f085e134d502", 
00:18:38.384 "assigned_rate_limits": { 00:18:38.384 "rw_ios_per_sec": 0, 00:18:38.384 "rw_mbytes_per_sec": 0, 00:18:38.384 "r_mbytes_per_sec": 0, 00:18:38.384 "w_mbytes_per_sec": 0 00:18:38.384 }, 00:18:38.384 "claimed": false, 00:18:38.384 "zoned": false, 00:18:38.384 "supported_io_types": { 00:18:38.384 "read": true, 00:18:38.384 "write": true, 00:18:38.384 "unmap": true, 00:18:38.384 "flush": true, 00:18:38.384 "reset": true, 00:18:38.384 "nvme_admin": false, 00:18:38.384 "nvme_io": false, 00:18:38.384 "nvme_io_md": false, 00:18:38.384 "write_zeroes": true, 00:18:38.384 "zcopy": true, 00:18:38.384 "get_zone_info": false, 00:18:38.384 "zone_management": false, 00:18:38.384 "zone_append": false, 00:18:38.384 "compare": false, 00:18:38.384 "compare_and_write": false, 00:18:38.384 "abort": true, 00:18:38.385 "seek_hole": false, 00:18:38.385 "seek_data": false, 00:18:38.385 "copy": true, 00:18:38.385 "nvme_iov_md": false 00:18:38.385 }, 00:18:38.385 "memory_domains": [ 00:18:38.385 { 00:18:38.385 "dma_device_id": "system", 00:18:38.385 "dma_device_type": 1 00:18:38.385 }, 00:18:38.385 { 00:18:38.385 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.385 "dma_device_type": 2 00:18:38.385 } 00:18:38.385 ], 00:18:38.385 "driver_specific": {} 00:18:38.385 } 00:18:38.385 ] 00:18:38.643 00:13:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:38.643 00:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:38.643 00:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:38.643 00:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:38.643 [2024-07-16 00:13:25.573037] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev1 00:18:38.643 [2024-07-16 00:13:25.573077] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:38.643 [2024-07-16 00:13:25.573098] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:38.643 [2024-07-16 00:13:25.574419] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:38.643 [2024-07-16 00:13:25.574460] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:38.643 00:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:38.643 00:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:38.643 00:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:38.643 00:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:38.901 00:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:38.901 00:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:38.902 00:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:38.902 00:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:38.902 00:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:38.902 00:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:38.902 00:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.902 00:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:38.902 00:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:38.902 "name": "Existed_Raid", 00:18:38.902 "uuid": "63787e5a-63d7-4232-8d61-cc568d870f7f", 00:18:38.902 "strip_size_kb": 64, 00:18:38.902 "state": "configuring", 00:18:38.902 "raid_level": "raid0", 00:18:38.902 "superblock": true, 00:18:38.902 "num_base_bdevs": 4, 00:18:38.902 "num_base_bdevs_discovered": 3, 00:18:38.902 "num_base_bdevs_operational": 4, 00:18:38.902 "base_bdevs_list": [ 00:18:38.902 { 00:18:38.902 "name": "BaseBdev1", 00:18:38.902 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:38.902 "is_configured": false, 00:18:38.902 "data_offset": 0, 00:18:38.902 "data_size": 0 00:18:38.902 }, 00:18:38.902 { 00:18:38.902 "name": "BaseBdev2", 00:18:38.902 "uuid": "a3b40150-69ab-4e47-b621-26f38d291950", 00:18:38.902 "is_configured": true, 00:18:38.902 "data_offset": 2048, 00:18:38.902 "data_size": 63488 00:18:38.902 }, 00:18:38.902 { 00:18:38.902 "name": "BaseBdev3", 00:18:38.902 "uuid": "9928157b-a056-4fe7-b901-c3db0ca4f327", 00:18:38.902 "is_configured": true, 00:18:38.902 "data_offset": 2048, 00:18:38.902 "data_size": 63488 00:18:38.902 }, 00:18:38.902 { 00:18:38.902 "name": "BaseBdev4", 00:18:38.902 "uuid": "893b72c7-553e-40e6-b142-f085e134d502", 00:18:38.902 "is_configured": true, 00:18:38.902 "data_offset": 2048, 00:18:38.902 "data_size": 63488 00:18:38.902 } 00:18:38.902 ] 00:18:38.902 }' 00:18:38.902 00:13:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:38.902 00:13:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:39.834 00:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:39.834 [2024-07-16 00:13:26.671911] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:39.834 00:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:39.834 00:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:39.834 00:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:39.834 00:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:39.834 00:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:39.834 00:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:39.834 00:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:39.834 00:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:39.834 00:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:39.834 00:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:39.834 00:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:39.834 00:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:40.092 00:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:40.092 "name": "Existed_Raid", 00:18:40.092 "uuid": "63787e5a-63d7-4232-8d61-cc568d870f7f", 00:18:40.092 "strip_size_kb": 64, 00:18:40.092 "state": "configuring", 00:18:40.092 "raid_level": "raid0", 00:18:40.092 "superblock": true, 00:18:40.092 "num_base_bdevs": 4, 00:18:40.092 
"num_base_bdevs_discovered": 2, 00:18:40.092 "num_base_bdevs_operational": 4, 00:18:40.092 "base_bdevs_list": [ 00:18:40.092 { 00:18:40.092 "name": "BaseBdev1", 00:18:40.092 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:40.092 "is_configured": false, 00:18:40.092 "data_offset": 0, 00:18:40.092 "data_size": 0 00:18:40.092 }, 00:18:40.092 { 00:18:40.092 "name": null, 00:18:40.092 "uuid": "a3b40150-69ab-4e47-b621-26f38d291950", 00:18:40.092 "is_configured": false, 00:18:40.092 "data_offset": 2048, 00:18:40.092 "data_size": 63488 00:18:40.092 }, 00:18:40.092 { 00:18:40.092 "name": "BaseBdev3", 00:18:40.092 "uuid": "9928157b-a056-4fe7-b901-c3db0ca4f327", 00:18:40.092 "is_configured": true, 00:18:40.092 "data_offset": 2048, 00:18:40.092 "data_size": 63488 00:18:40.092 }, 00:18:40.092 { 00:18:40.092 "name": "BaseBdev4", 00:18:40.092 "uuid": "893b72c7-553e-40e6-b142-f085e134d502", 00:18:40.092 "is_configured": true, 00:18:40.092 "data_offset": 2048, 00:18:40.092 "data_size": 63488 00:18:40.092 } 00:18:40.092 ] 00:18:40.092 }' 00:18:40.092 00:13:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:40.092 00:13:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:40.658 00:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.658 00:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:40.917 00:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:40.917 00:13:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:41.176 [2024-07-16 00:13:28.036122] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:41.176 BaseBdev1 00:18:41.176 00:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:41.176 00:13:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:41.176 00:13:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:41.176 00:13:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:41.176 00:13:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:41.176 00:13:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:41.176 00:13:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:41.435 00:13:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:41.693 [ 00:18:41.693 { 00:18:41.693 "name": "BaseBdev1", 00:18:41.693 "aliases": [ 00:18:41.693 "929cbfb2-fa7e-4bd3-acb0-7f594f408ab8" 00:18:41.693 ], 00:18:41.693 "product_name": "Malloc disk", 00:18:41.693 "block_size": 512, 00:18:41.693 "num_blocks": 65536, 00:18:41.693 "uuid": "929cbfb2-fa7e-4bd3-acb0-7f594f408ab8", 00:18:41.693 "assigned_rate_limits": { 00:18:41.693 "rw_ios_per_sec": 0, 00:18:41.693 "rw_mbytes_per_sec": 0, 00:18:41.693 "r_mbytes_per_sec": 0, 00:18:41.693 "w_mbytes_per_sec": 0 00:18:41.693 }, 00:18:41.693 "claimed": true, 00:18:41.693 "claim_type": "exclusive_write", 00:18:41.693 "zoned": false, 00:18:41.693 "supported_io_types": { 00:18:41.693 "read": true, 00:18:41.693 "write": true, 00:18:41.693 "unmap": true, 00:18:41.693 "flush": 
true, 00:18:41.693 "reset": true, 00:18:41.693 "nvme_admin": false, 00:18:41.693 "nvme_io": false, 00:18:41.693 "nvme_io_md": false, 00:18:41.693 "write_zeroes": true, 00:18:41.693 "zcopy": true, 00:18:41.693 "get_zone_info": false, 00:18:41.693 "zone_management": false, 00:18:41.693 "zone_append": false, 00:18:41.693 "compare": false, 00:18:41.693 "compare_and_write": false, 00:18:41.693 "abort": true, 00:18:41.693 "seek_hole": false, 00:18:41.693 "seek_data": false, 00:18:41.693 "copy": true, 00:18:41.693 "nvme_iov_md": false 00:18:41.693 }, 00:18:41.693 "memory_domains": [ 00:18:41.693 { 00:18:41.693 "dma_device_id": "system", 00:18:41.693 "dma_device_type": 1 00:18:41.694 }, 00:18:41.694 { 00:18:41.694 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:41.694 "dma_device_type": 2 00:18:41.694 } 00:18:41.694 ], 00:18:41.694 "driver_specific": {} 00:18:41.694 } 00:18:41.694 ] 00:18:41.694 00:13:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:41.694 00:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:41.694 00:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:41.694 00:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:41.694 00:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:41.694 00:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:41.694 00:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:41.694 00:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:41.694 00:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:41.694 00:13:28 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:41.694 00:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:41.694 00:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.694 00:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:41.952 00:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:41.953 "name": "Existed_Raid", 00:18:41.953 "uuid": "63787e5a-63d7-4232-8d61-cc568d870f7f", 00:18:41.953 "strip_size_kb": 64, 00:18:41.953 "state": "configuring", 00:18:41.953 "raid_level": "raid0", 00:18:41.953 "superblock": true, 00:18:41.953 "num_base_bdevs": 4, 00:18:41.953 "num_base_bdevs_discovered": 3, 00:18:41.953 "num_base_bdevs_operational": 4, 00:18:41.953 "base_bdevs_list": [ 00:18:41.953 { 00:18:41.953 "name": "BaseBdev1", 00:18:41.953 "uuid": "929cbfb2-fa7e-4bd3-acb0-7f594f408ab8", 00:18:41.953 "is_configured": true, 00:18:41.953 "data_offset": 2048, 00:18:41.953 "data_size": 63488 00:18:41.953 }, 00:18:41.953 { 00:18:41.953 "name": null, 00:18:41.953 "uuid": "a3b40150-69ab-4e47-b621-26f38d291950", 00:18:41.953 "is_configured": false, 00:18:41.953 "data_offset": 2048, 00:18:41.953 "data_size": 63488 00:18:41.953 }, 00:18:41.953 { 00:18:41.953 "name": "BaseBdev3", 00:18:41.953 "uuid": "9928157b-a056-4fe7-b901-c3db0ca4f327", 00:18:41.953 "is_configured": true, 00:18:41.953 "data_offset": 2048, 00:18:41.953 "data_size": 63488 00:18:41.953 }, 00:18:41.953 { 00:18:41.953 "name": "BaseBdev4", 00:18:41.953 "uuid": "893b72c7-553e-40e6-b142-f085e134d502", 00:18:41.953 "is_configured": true, 00:18:41.953 "data_offset": 2048, 00:18:41.953 "data_size": 63488 00:18:41.953 } 00:18:41.953 ] 00:18:41.953 }' 00:18:41.953 
00:13:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:41.953 00:13:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:42.889 00:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:42.889 00:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:42.890 00:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:42.890 00:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:43.148 [2024-07-16 00:13:29.877017] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:43.148 00:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:43.148 00:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:43.148 00:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:43.148 00:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:43.148 00:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:43.148 00:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:43.148 00:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:43.148 00:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:43.148 00:13:29 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:43.148 00:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:43.148 00:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.148 00:13:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:43.407 00:13:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:43.407 "name": "Existed_Raid", 00:18:43.407 "uuid": "63787e5a-63d7-4232-8d61-cc568d870f7f", 00:18:43.407 "strip_size_kb": 64, 00:18:43.407 "state": "configuring", 00:18:43.407 "raid_level": "raid0", 00:18:43.407 "superblock": true, 00:18:43.407 "num_base_bdevs": 4, 00:18:43.407 "num_base_bdevs_discovered": 2, 00:18:43.407 "num_base_bdevs_operational": 4, 00:18:43.407 "base_bdevs_list": [ 00:18:43.407 { 00:18:43.407 "name": "BaseBdev1", 00:18:43.407 "uuid": "929cbfb2-fa7e-4bd3-acb0-7f594f408ab8", 00:18:43.407 "is_configured": true, 00:18:43.407 "data_offset": 2048, 00:18:43.407 "data_size": 63488 00:18:43.407 }, 00:18:43.407 { 00:18:43.407 "name": null, 00:18:43.407 "uuid": "a3b40150-69ab-4e47-b621-26f38d291950", 00:18:43.407 "is_configured": false, 00:18:43.407 "data_offset": 2048, 00:18:43.407 "data_size": 63488 00:18:43.407 }, 00:18:43.407 { 00:18:43.407 "name": null, 00:18:43.407 "uuid": "9928157b-a056-4fe7-b901-c3db0ca4f327", 00:18:43.407 "is_configured": false, 00:18:43.407 "data_offset": 2048, 00:18:43.407 "data_size": 63488 00:18:43.407 }, 00:18:43.407 { 00:18:43.407 "name": "BaseBdev4", 00:18:43.407 "uuid": "893b72c7-553e-40e6-b142-f085e134d502", 00:18:43.407 "is_configured": true, 00:18:43.407 "data_offset": 2048, 00:18:43.407 "data_size": 63488 00:18:43.407 } 00:18:43.407 ] 00:18:43.407 }' 00:18:43.407 00:13:30 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:43.407 00:13:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:44.341 00:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.341 00:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:44.341 00:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:44.341 00:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:44.600 [2024-07-16 00:13:31.493317] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:44.600 00:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:44.600 00:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:44.600 00:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:44.600 00:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:44.600 00:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:44.600 00:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:44.600 00:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:44.600 00:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:44.600 00:13:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:44.600 00:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:44.600 00:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.600 00:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:44.859 00:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:44.859 "name": "Existed_Raid", 00:18:44.859 "uuid": "63787e5a-63d7-4232-8d61-cc568d870f7f", 00:18:44.859 "strip_size_kb": 64, 00:18:44.859 "state": "configuring", 00:18:44.859 "raid_level": "raid0", 00:18:44.859 "superblock": true, 00:18:44.859 "num_base_bdevs": 4, 00:18:44.859 "num_base_bdevs_discovered": 3, 00:18:44.859 "num_base_bdevs_operational": 4, 00:18:44.859 "base_bdevs_list": [ 00:18:44.859 { 00:18:44.859 "name": "BaseBdev1", 00:18:44.859 "uuid": "929cbfb2-fa7e-4bd3-acb0-7f594f408ab8", 00:18:44.859 "is_configured": true, 00:18:44.859 "data_offset": 2048, 00:18:44.859 "data_size": 63488 00:18:44.859 }, 00:18:44.859 { 00:18:44.859 "name": null, 00:18:44.859 "uuid": "a3b40150-69ab-4e47-b621-26f38d291950", 00:18:44.859 "is_configured": false, 00:18:44.859 "data_offset": 2048, 00:18:44.859 "data_size": 63488 00:18:44.859 }, 00:18:44.859 { 00:18:44.859 "name": "BaseBdev3", 00:18:44.859 "uuid": "9928157b-a056-4fe7-b901-c3db0ca4f327", 00:18:44.859 "is_configured": true, 00:18:44.859 "data_offset": 2048, 00:18:44.859 "data_size": 63488 00:18:44.859 }, 00:18:44.859 { 00:18:44.859 "name": "BaseBdev4", 00:18:44.859 "uuid": "893b72c7-553e-40e6-b142-f085e134d502", 00:18:44.859 "is_configured": true, 00:18:44.859 "data_offset": 2048, 00:18:44.859 "data_size": 63488 00:18:44.859 } 00:18:44.859 ] 00:18:44.859 }' 00:18:44.859 00:13:31 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:44.859 00:13:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:45.425 00:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.425 00:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:45.683 00:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:45.683 00:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:45.941 [2024-07-16 00:13:32.712560] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:45.941 00:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:45.941 00:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:45.941 00:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:45.941 00:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:45.941 00:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:45.941 00:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:45.941 00:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:45.941 00:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:45.941 00:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:18:45.941 00:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:45.941 00:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.941 00:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:46.198 00:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:46.198 "name": "Existed_Raid", 00:18:46.198 "uuid": "63787e5a-63d7-4232-8d61-cc568d870f7f", 00:18:46.198 "strip_size_kb": 64, 00:18:46.198 "state": "configuring", 00:18:46.198 "raid_level": "raid0", 00:18:46.198 "superblock": true, 00:18:46.198 "num_base_bdevs": 4, 00:18:46.198 "num_base_bdevs_discovered": 2, 00:18:46.198 "num_base_bdevs_operational": 4, 00:18:46.198 "base_bdevs_list": [ 00:18:46.198 { 00:18:46.198 "name": null, 00:18:46.198 "uuid": "929cbfb2-fa7e-4bd3-acb0-7f594f408ab8", 00:18:46.198 "is_configured": false, 00:18:46.198 "data_offset": 2048, 00:18:46.198 "data_size": 63488 00:18:46.198 }, 00:18:46.198 { 00:18:46.198 "name": null, 00:18:46.198 "uuid": "a3b40150-69ab-4e47-b621-26f38d291950", 00:18:46.198 "is_configured": false, 00:18:46.198 "data_offset": 2048, 00:18:46.198 "data_size": 63488 00:18:46.198 }, 00:18:46.198 { 00:18:46.198 "name": "BaseBdev3", 00:18:46.198 "uuid": "9928157b-a056-4fe7-b901-c3db0ca4f327", 00:18:46.198 "is_configured": true, 00:18:46.198 "data_offset": 2048, 00:18:46.198 "data_size": 63488 00:18:46.198 }, 00:18:46.198 { 00:18:46.198 "name": "BaseBdev4", 00:18:46.198 "uuid": "893b72c7-553e-40e6-b142-f085e134d502", 00:18:46.198 "is_configured": true, 00:18:46.198 "data_offset": 2048, 00:18:46.198 "data_size": 63488 00:18:46.198 } 00:18:46.198 ] 00:18:46.198 }' 00:18:46.198 00:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:18:46.198 00:13:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:46.763 00:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.763 00:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:47.021 00:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:47.021 00:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:47.279 [2024-07-16 00:13:34.090898] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:47.279 00:13:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:47.280 00:13:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:47.280 00:13:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:47.280 00:13:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:47.280 00:13:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:47.280 00:13:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:47.280 00:13:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:47.280 00:13:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:47.280 00:13:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:18:47.280 00:13:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:47.280 00:13:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.280 00:13:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:47.538 00:13:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:47.538 "name": "Existed_Raid", 00:18:47.538 "uuid": "63787e5a-63d7-4232-8d61-cc568d870f7f", 00:18:47.538 "strip_size_kb": 64, 00:18:47.538 "state": "configuring", 00:18:47.538 "raid_level": "raid0", 00:18:47.538 "superblock": true, 00:18:47.538 "num_base_bdevs": 4, 00:18:47.538 "num_base_bdevs_discovered": 3, 00:18:47.538 "num_base_bdevs_operational": 4, 00:18:47.538 "base_bdevs_list": [ 00:18:47.538 { 00:18:47.538 "name": null, 00:18:47.538 "uuid": "929cbfb2-fa7e-4bd3-acb0-7f594f408ab8", 00:18:47.538 "is_configured": false, 00:18:47.538 "data_offset": 2048, 00:18:47.538 "data_size": 63488 00:18:47.538 }, 00:18:47.538 { 00:18:47.538 "name": "BaseBdev2", 00:18:47.538 "uuid": "a3b40150-69ab-4e47-b621-26f38d291950", 00:18:47.538 "is_configured": true, 00:18:47.538 "data_offset": 2048, 00:18:47.538 "data_size": 63488 00:18:47.538 }, 00:18:47.538 { 00:18:47.538 "name": "BaseBdev3", 00:18:47.538 "uuid": "9928157b-a056-4fe7-b901-c3db0ca4f327", 00:18:47.538 "is_configured": true, 00:18:47.538 "data_offset": 2048, 00:18:47.538 "data_size": 63488 00:18:47.538 }, 00:18:47.538 { 00:18:47.538 "name": "BaseBdev4", 00:18:47.538 "uuid": "893b72c7-553e-40e6-b142-f085e134d502", 00:18:47.538 "is_configured": true, 00:18:47.538 "data_offset": 2048, 00:18:47.538 "data_size": 63488 00:18:47.538 } 00:18:47.538 ] 00:18:47.538 }' 00:18:47.538 00:13:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:18:47.538 00:13:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:48.167 00:13:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.167 00:13:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:48.424 00:13:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:48.424 00:13:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.424 00:13:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:48.682 00:13:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 929cbfb2-fa7e-4bd3-acb0-7f594f408ab8 00:18:48.940 [2024-07-16 00:13:35.694517] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:48.940 [2024-07-16 00:13:35.694679] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13a4470 00:18:48.940 [2024-07-16 00:13:35.694692] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:48.940 [2024-07-16 00:13:35.694862] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1394c40 00:18:48.940 [2024-07-16 00:13:35.694989] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13a4470 00:18:48.940 [2024-07-16 00:13:35.695000] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13a4470 00:18:48.940 [2024-07-16 00:13:35.695090] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:48.940 NewBaseBdev 00:18:48.940 00:13:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:48.940 00:13:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:48.940 00:13:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:48.940 00:13:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:48.940 00:13:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:48.940 00:13:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:48.940 00:13:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:49.199 00:13:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:49.457 [ 00:18:49.457 { 00:18:49.457 "name": "NewBaseBdev", 00:18:49.457 "aliases": [ 00:18:49.457 "929cbfb2-fa7e-4bd3-acb0-7f594f408ab8" 00:18:49.457 ], 00:18:49.457 "product_name": "Malloc disk", 00:18:49.457 "block_size": 512, 00:18:49.457 "num_blocks": 65536, 00:18:49.457 "uuid": "929cbfb2-fa7e-4bd3-acb0-7f594f408ab8", 00:18:49.457 "assigned_rate_limits": { 00:18:49.457 "rw_ios_per_sec": 0, 00:18:49.457 "rw_mbytes_per_sec": 0, 00:18:49.457 "r_mbytes_per_sec": 0, 00:18:49.457 "w_mbytes_per_sec": 0 00:18:49.457 }, 00:18:49.457 "claimed": true, 00:18:49.457 "claim_type": "exclusive_write", 00:18:49.457 "zoned": false, 00:18:49.457 "supported_io_types": { 00:18:49.457 "read": true, 00:18:49.457 "write": true, 00:18:49.457 "unmap": true, 00:18:49.457 "flush": true, 
00:18:49.457 "reset": true, 00:18:49.457 "nvme_admin": false, 00:18:49.457 "nvme_io": false, 00:18:49.457 "nvme_io_md": false, 00:18:49.457 "write_zeroes": true, 00:18:49.457 "zcopy": true, 00:18:49.457 "get_zone_info": false, 00:18:49.457 "zone_management": false, 00:18:49.457 "zone_append": false, 00:18:49.457 "compare": false, 00:18:49.457 "compare_and_write": false, 00:18:49.457 "abort": true, 00:18:49.457 "seek_hole": false, 00:18:49.457 "seek_data": false, 00:18:49.457 "copy": true, 00:18:49.457 "nvme_iov_md": false 00:18:49.457 }, 00:18:49.457 "memory_domains": [ 00:18:49.457 { 00:18:49.457 "dma_device_id": "system", 00:18:49.457 "dma_device_type": 1 00:18:49.457 }, 00:18:49.457 { 00:18:49.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:49.457 "dma_device_type": 2 00:18:49.457 } 00:18:49.457 ], 00:18:49.457 "driver_specific": {} 00:18:49.457 } 00:18:49.457 ] 00:18:49.457 00:13:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:49.457 00:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:49.457 00:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:49.457 00:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:49.457 00:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:49.457 00:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:49.457 00:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:49.457 00:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:49.457 00:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:49.457 00:13:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:49.457 00:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:49.457 00:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.457 00:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:49.716 00:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:49.716 "name": "Existed_Raid", 00:18:49.716 "uuid": "63787e5a-63d7-4232-8d61-cc568d870f7f", 00:18:49.716 "strip_size_kb": 64, 00:18:49.716 "state": "online", 00:18:49.716 "raid_level": "raid0", 00:18:49.716 "superblock": true, 00:18:49.716 "num_base_bdevs": 4, 00:18:49.716 "num_base_bdevs_discovered": 4, 00:18:49.716 "num_base_bdevs_operational": 4, 00:18:49.716 "base_bdevs_list": [ 00:18:49.716 { 00:18:49.716 "name": "NewBaseBdev", 00:18:49.716 "uuid": "929cbfb2-fa7e-4bd3-acb0-7f594f408ab8", 00:18:49.716 "is_configured": true, 00:18:49.716 "data_offset": 2048, 00:18:49.716 "data_size": 63488 00:18:49.716 }, 00:18:49.716 { 00:18:49.716 "name": "BaseBdev2", 00:18:49.716 "uuid": "a3b40150-69ab-4e47-b621-26f38d291950", 00:18:49.716 "is_configured": true, 00:18:49.716 "data_offset": 2048, 00:18:49.716 "data_size": 63488 00:18:49.716 }, 00:18:49.716 { 00:18:49.716 "name": "BaseBdev3", 00:18:49.716 "uuid": "9928157b-a056-4fe7-b901-c3db0ca4f327", 00:18:49.716 "is_configured": true, 00:18:49.716 "data_offset": 2048, 00:18:49.716 "data_size": 63488 00:18:49.716 }, 00:18:49.716 { 00:18:49.716 "name": "BaseBdev4", 00:18:49.716 "uuid": "893b72c7-553e-40e6-b142-f085e134d502", 00:18:49.716 "is_configured": true, 00:18:49.716 "data_offset": 2048, 00:18:49.716 "data_size": 63488 00:18:49.716 } 00:18:49.716 ] 00:18:49.716 }' 00:18:49.716 
00:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:49.716 00:13:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:50.281 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:50.281 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:50.281 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:50.281 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:50.281 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:50.281 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:50.281 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:50.281 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:50.539 [2024-07-16 00:13:37.243009] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:50.539 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:50.539 "name": "Existed_Raid", 00:18:50.539 "aliases": [ 00:18:50.539 "63787e5a-63d7-4232-8d61-cc568d870f7f" 00:18:50.539 ], 00:18:50.539 "product_name": "Raid Volume", 00:18:50.539 "block_size": 512, 00:18:50.539 "num_blocks": 253952, 00:18:50.539 "uuid": "63787e5a-63d7-4232-8d61-cc568d870f7f", 00:18:50.539 "assigned_rate_limits": { 00:18:50.539 "rw_ios_per_sec": 0, 00:18:50.539 "rw_mbytes_per_sec": 0, 00:18:50.539 "r_mbytes_per_sec": 0, 00:18:50.539 "w_mbytes_per_sec": 0 00:18:50.539 }, 00:18:50.539 "claimed": false, 00:18:50.539 "zoned": false, 00:18:50.539 
"supported_io_types": { 00:18:50.539 "read": true, 00:18:50.539 "write": true, 00:18:50.539 "unmap": true, 00:18:50.539 "flush": true, 00:18:50.539 "reset": true, 00:18:50.539 "nvme_admin": false, 00:18:50.539 "nvme_io": false, 00:18:50.539 "nvme_io_md": false, 00:18:50.539 "write_zeroes": true, 00:18:50.539 "zcopy": false, 00:18:50.539 "get_zone_info": false, 00:18:50.539 "zone_management": false, 00:18:50.539 "zone_append": false, 00:18:50.539 "compare": false, 00:18:50.539 "compare_and_write": false, 00:18:50.539 "abort": false, 00:18:50.539 "seek_hole": false, 00:18:50.539 "seek_data": false, 00:18:50.539 "copy": false, 00:18:50.539 "nvme_iov_md": false 00:18:50.539 }, 00:18:50.539 "memory_domains": [ 00:18:50.539 { 00:18:50.539 "dma_device_id": "system", 00:18:50.539 "dma_device_type": 1 00:18:50.539 }, 00:18:50.539 { 00:18:50.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:50.539 "dma_device_type": 2 00:18:50.539 }, 00:18:50.539 { 00:18:50.539 "dma_device_id": "system", 00:18:50.539 "dma_device_type": 1 00:18:50.539 }, 00:18:50.539 { 00:18:50.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:50.539 "dma_device_type": 2 00:18:50.539 }, 00:18:50.539 { 00:18:50.539 "dma_device_id": "system", 00:18:50.539 "dma_device_type": 1 00:18:50.539 }, 00:18:50.539 { 00:18:50.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:50.539 "dma_device_type": 2 00:18:50.539 }, 00:18:50.539 { 00:18:50.539 "dma_device_id": "system", 00:18:50.539 "dma_device_type": 1 00:18:50.539 }, 00:18:50.539 { 00:18:50.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:50.539 "dma_device_type": 2 00:18:50.539 } 00:18:50.539 ], 00:18:50.539 "driver_specific": { 00:18:50.539 "raid": { 00:18:50.539 "uuid": "63787e5a-63d7-4232-8d61-cc568d870f7f", 00:18:50.539 "strip_size_kb": 64, 00:18:50.539 "state": "online", 00:18:50.539 "raid_level": "raid0", 00:18:50.539 "superblock": true, 00:18:50.539 "num_base_bdevs": 4, 00:18:50.539 "num_base_bdevs_discovered": 4, 00:18:50.539 
"num_base_bdevs_operational": 4, 00:18:50.539 "base_bdevs_list": [ 00:18:50.539 { 00:18:50.539 "name": "NewBaseBdev", 00:18:50.539 "uuid": "929cbfb2-fa7e-4bd3-acb0-7f594f408ab8", 00:18:50.539 "is_configured": true, 00:18:50.539 "data_offset": 2048, 00:18:50.539 "data_size": 63488 00:18:50.539 }, 00:18:50.539 { 00:18:50.539 "name": "BaseBdev2", 00:18:50.539 "uuid": "a3b40150-69ab-4e47-b621-26f38d291950", 00:18:50.539 "is_configured": true, 00:18:50.539 "data_offset": 2048, 00:18:50.539 "data_size": 63488 00:18:50.539 }, 00:18:50.539 { 00:18:50.539 "name": "BaseBdev3", 00:18:50.539 "uuid": "9928157b-a056-4fe7-b901-c3db0ca4f327", 00:18:50.539 "is_configured": true, 00:18:50.539 "data_offset": 2048, 00:18:50.539 "data_size": 63488 00:18:50.539 }, 00:18:50.539 { 00:18:50.539 "name": "BaseBdev4", 00:18:50.539 "uuid": "893b72c7-553e-40e6-b142-f085e134d502", 00:18:50.539 "is_configured": true, 00:18:50.539 "data_offset": 2048, 00:18:50.539 "data_size": 63488 00:18:50.539 } 00:18:50.539 ] 00:18:50.539 } 00:18:50.539 } 00:18:50.539 }' 00:18:50.539 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:50.539 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:50.539 BaseBdev2 00:18:50.539 BaseBdev3 00:18:50.539 BaseBdev4' 00:18:50.539 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:50.539 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:50.539 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:50.796 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:50.796 "name": "NewBaseBdev", 00:18:50.796 
"aliases": [ 00:18:50.796 "929cbfb2-fa7e-4bd3-acb0-7f594f408ab8" 00:18:50.796 ], 00:18:50.796 "product_name": "Malloc disk", 00:18:50.796 "block_size": 512, 00:18:50.796 "num_blocks": 65536, 00:18:50.796 "uuid": "929cbfb2-fa7e-4bd3-acb0-7f594f408ab8", 00:18:50.796 "assigned_rate_limits": { 00:18:50.796 "rw_ios_per_sec": 0, 00:18:50.796 "rw_mbytes_per_sec": 0, 00:18:50.796 "r_mbytes_per_sec": 0, 00:18:50.796 "w_mbytes_per_sec": 0 00:18:50.796 }, 00:18:50.796 "claimed": true, 00:18:50.796 "claim_type": "exclusive_write", 00:18:50.796 "zoned": false, 00:18:50.796 "supported_io_types": { 00:18:50.796 "read": true, 00:18:50.796 "write": true, 00:18:50.796 "unmap": true, 00:18:50.796 "flush": true, 00:18:50.796 "reset": true, 00:18:50.796 "nvme_admin": false, 00:18:50.796 "nvme_io": false, 00:18:50.796 "nvme_io_md": false, 00:18:50.796 "write_zeroes": true, 00:18:50.796 "zcopy": true, 00:18:50.796 "get_zone_info": false, 00:18:50.796 "zone_management": false, 00:18:50.796 "zone_append": false, 00:18:50.796 "compare": false, 00:18:50.796 "compare_and_write": false, 00:18:50.796 "abort": true, 00:18:50.796 "seek_hole": false, 00:18:50.796 "seek_data": false, 00:18:50.796 "copy": true, 00:18:50.796 "nvme_iov_md": false 00:18:50.796 }, 00:18:50.797 "memory_domains": [ 00:18:50.797 { 00:18:50.797 "dma_device_id": "system", 00:18:50.797 "dma_device_type": 1 00:18:50.797 }, 00:18:50.797 { 00:18:50.797 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:50.797 "dma_device_type": 2 00:18:50.797 } 00:18:50.797 ], 00:18:50.797 "driver_specific": {} 00:18:50.797 }' 00:18:50.797 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:50.797 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:50.797 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:50.797 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:18:50.797 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:51.054 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:51.054 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:51.054 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:51.054 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:51.054 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:51.054 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:51.054 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:51.054 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:51.054 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:51.054 00:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:51.311 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:51.311 "name": "BaseBdev2", 00:18:51.311 "aliases": [ 00:18:51.311 "a3b40150-69ab-4e47-b621-26f38d291950" 00:18:51.311 ], 00:18:51.311 "product_name": "Malloc disk", 00:18:51.311 "block_size": 512, 00:18:51.311 "num_blocks": 65536, 00:18:51.311 "uuid": "a3b40150-69ab-4e47-b621-26f38d291950", 00:18:51.311 "assigned_rate_limits": { 00:18:51.311 "rw_ios_per_sec": 0, 00:18:51.311 "rw_mbytes_per_sec": 0, 00:18:51.311 "r_mbytes_per_sec": 0, 00:18:51.311 "w_mbytes_per_sec": 0 00:18:51.311 }, 00:18:51.311 "claimed": true, 00:18:51.311 "claim_type": "exclusive_write", 00:18:51.311 "zoned": false, 00:18:51.311 
"supported_io_types": { 00:18:51.311 "read": true, 00:18:51.311 "write": true, 00:18:51.311 "unmap": true, 00:18:51.311 "flush": true, 00:18:51.311 "reset": true, 00:18:51.311 "nvme_admin": false, 00:18:51.311 "nvme_io": false, 00:18:51.311 "nvme_io_md": false, 00:18:51.311 "write_zeroes": true, 00:18:51.311 "zcopy": true, 00:18:51.311 "get_zone_info": false, 00:18:51.311 "zone_management": false, 00:18:51.311 "zone_append": false, 00:18:51.311 "compare": false, 00:18:51.311 "compare_and_write": false, 00:18:51.311 "abort": true, 00:18:51.311 "seek_hole": false, 00:18:51.311 "seek_data": false, 00:18:51.311 "copy": true, 00:18:51.311 "nvme_iov_md": false 00:18:51.311 }, 00:18:51.311 "memory_domains": [ 00:18:51.311 { 00:18:51.311 "dma_device_id": "system", 00:18:51.311 "dma_device_type": 1 00:18:51.311 }, 00:18:51.311 { 00:18:51.311 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:51.311 "dma_device_type": 2 00:18:51.312 } 00:18:51.312 ], 00:18:51.312 "driver_specific": {} 00:18:51.312 }' 00:18:51.312 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:51.312 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:51.569 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:51.569 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:51.569 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:51.569 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:51.569 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:51.569 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:51.569 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:51.569 00:13:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:51.569 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:51.826 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:51.826 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:51.826 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:51.826 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:52.083 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:52.083 "name": "BaseBdev3", 00:18:52.083 "aliases": [ 00:18:52.083 "9928157b-a056-4fe7-b901-c3db0ca4f327" 00:18:52.083 ], 00:18:52.083 "product_name": "Malloc disk", 00:18:52.083 "block_size": 512, 00:18:52.083 "num_blocks": 65536, 00:18:52.083 "uuid": "9928157b-a056-4fe7-b901-c3db0ca4f327", 00:18:52.083 "assigned_rate_limits": { 00:18:52.083 "rw_ios_per_sec": 0, 00:18:52.083 "rw_mbytes_per_sec": 0, 00:18:52.083 "r_mbytes_per_sec": 0, 00:18:52.083 "w_mbytes_per_sec": 0 00:18:52.083 }, 00:18:52.083 "claimed": true, 00:18:52.083 "claim_type": "exclusive_write", 00:18:52.083 "zoned": false, 00:18:52.083 "supported_io_types": { 00:18:52.083 "read": true, 00:18:52.083 "write": true, 00:18:52.083 "unmap": true, 00:18:52.083 "flush": true, 00:18:52.083 "reset": true, 00:18:52.083 "nvme_admin": false, 00:18:52.083 "nvme_io": false, 00:18:52.083 "nvme_io_md": false, 00:18:52.083 "write_zeroes": true, 00:18:52.083 "zcopy": true, 00:18:52.083 "get_zone_info": false, 00:18:52.083 "zone_management": false, 00:18:52.083 "zone_append": false, 00:18:52.083 "compare": false, 00:18:52.083 "compare_and_write": false, 00:18:52.083 "abort": true, 00:18:52.083 
"seek_hole": false, 00:18:52.083 "seek_data": false, 00:18:52.083 "copy": true, 00:18:52.083 "nvme_iov_md": false 00:18:52.083 }, 00:18:52.083 "memory_domains": [ 00:18:52.083 { 00:18:52.083 "dma_device_id": "system", 00:18:52.083 "dma_device_type": 1 00:18:52.083 }, 00:18:52.083 { 00:18:52.083 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.083 "dma_device_type": 2 00:18:52.083 } 00:18:52.083 ], 00:18:52.083 "driver_specific": {} 00:18:52.083 }' 00:18:52.083 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.083 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.083 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:52.083 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.083 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.083 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:52.083 00:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.083 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.365 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:52.365 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:52.365 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:52.365 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:52.365 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:52.365 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:52.365 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:52.623 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:52.623 "name": "BaseBdev4", 00:18:52.623 "aliases": [ 00:18:52.623 "893b72c7-553e-40e6-b142-f085e134d502" 00:18:52.623 ], 00:18:52.623 "product_name": "Malloc disk", 00:18:52.623 "block_size": 512, 00:18:52.623 "num_blocks": 65536, 00:18:52.623 "uuid": "893b72c7-553e-40e6-b142-f085e134d502", 00:18:52.623 "assigned_rate_limits": { 00:18:52.623 "rw_ios_per_sec": 0, 00:18:52.623 "rw_mbytes_per_sec": 0, 00:18:52.623 "r_mbytes_per_sec": 0, 00:18:52.623 "w_mbytes_per_sec": 0 00:18:52.623 }, 00:18:52.623 "claimed": true, 00:18:52.624 "claim_type": "exclusive_write", 00:18:52.624 "zoned": false, 00:18:52.624 "supported_io_types": { 00:18:52.624 "read": true, 00:18:52.624 "write": true, 00:18:52.624 "unmap": true, 00:18:52.624 "flush": true, 00:18:52.624 "reset": true, 00:18:52.624 "nvme_admin": false, 00:18:52.624 "nvme_io": false, 00:18:52.624 "nvme_io_md": false, 00:18:52.624 "write_zeroes": true, 00:18:52.624 "zcopy": true, 00:18:52.624 "get_zone_info": false, 00:18:52.624 "zone_management": false, 00:18:52.624 "zone_append": false, 00:18:52.624 "compare": false, 00:18:52.624 "compare_and_write": false, 00:18:52.624 "abort": true, 00:18:52.624 "seek_hole": false, 00:18:52.624 "seek_data": false, 00:18:52.624 "copy": true, 00:18:52.624 "nvme_iov_md": false 00:18:52.624 }, 00:18:52.624 "memory_domains": [ 00:18:52.624 { 00:18:52.624 "dma_device_id": "system", 00:18:52.624 "dma_device_type": 1 00:18:52.624 }, 00:18:52.624 { 00:18:52.624 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.624 "dma_device_type": 2 00:18:52.624 } 00:18:52.624 ], 00:18:52.624 "driver_specific": {} 00:18:52.624 }' 00:18:52.624 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.624 
00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.624 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:52.624 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.624 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.624 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:52.624 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.881 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.881 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:52.881 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:52.881 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:52.881 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:52.881 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:53.139 [2024-07-16 00:13:39.889717] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:53.139 [2024-07-16 00:13:39.889747] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:53.139 [2024-07-16 00:13:39.889794] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:53.139 [2024-07-16 00:13:39.889856] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:53.139 [2024-07-16 00:13:39.889867] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 
0x13a4470 name Existed_Raid, state offline 00:18:53.139 00:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 3556118 00:18:53.139 00:13:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3556118 ']' 00:18:53.139 00:13:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 3556118 00:18:53.139 00:13:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:18:53.139 00:13:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:53.139 00:13:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3556118 00:18:53.139 00:13:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:53.139 00:13:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:53.139 00:13:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3556118' 00:18:53.139 killing process with pid 3556118 00:18:53.140 00:13:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 3556118 00:18:53.140 [2024-07-16 00:13:39.964001] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:53.140 00:13:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 3556118 00:18:53.140 [2024-07-16 00:13:40.001506] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:53.398 00:13:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:18:53.398 00:18:53.398 real 0m32.409s 00:18:53.398 user 0m59.443s 00:18:53.398 sys 0m5.783s 00:18:53.398 00:13:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:53.398 00:13:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 
-- # set +x 00:18:53.398 ************************************ 00:18:53.398 END TEST raid_state_function_test_sb 00:18:53.398 ************************************ 00:18:53.398 00:13:40 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:53.398 00:13:40 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:18:53.398 00:13:40 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:18:53.398 00:13:40 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:53.398 00:13:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:53.398 ************************************ 00:18:53.398 START TEST raid_superblock_test 00:18:53.398 ************************************ 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=3560996 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 3560996 /var/tmp/spdk-raid.sock 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 3560996 ']' 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:53.398 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:53.398 00:13:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:53.657 [2024-07-16 00:13:40.369413] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:18:53.657 [2024-07-16 00:13:40.369485] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3560996 ] 00:18:53.657 [2024-07-16 00:13:40.502245] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:53.657 [2024-07-16 00:13:40.603494] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:53.915 [2024-07-16 00:13:40.669369] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:53.915 [2024-07-16 00:13:40.669409] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:54.482 00:13:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:54.482 00:13:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:18:54.482 00:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:18:54.482 00:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:54.482 00:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:18:54.482 00:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:18:54.482 00:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:54.482 00:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:54.482 00:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:54.482 00:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:54.482 00:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:18:54.482 malloc1 00:18:54.740 00:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:54.740 [2024-07-16 00:13:41.679076] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:54.740 [2024-07-16 00:13:41.679132] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:54.740 [2024-07-16 00:13:41.679153] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xece570 00:18:54.740 [2024-07-16 00:13:41.679165] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:54.740 [2024-07-16 00:13:41.680844] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:54.741 [2024-07-16 00:13:41.680875] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:54.741 pt1 00:18:54.999 00:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:54.999 00:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:54.999 00:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:18:54.999 00:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:18:54.999 00:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:54.999 00:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:54.999 00:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:54.999 00:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:54.999 00:13:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:54.999 malloc2 00:18:54.999 00:13:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:55.257 [2024-07-16 00:13:42.065012] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:55.257 [2024-07-16 00:13:42.065063] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:55.257 [2024-07-16 00:13:42.065081] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xecf970 00:18:55.257 [2024-07-16 00:13:42.065093] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:55.257 [2024-07-16 00:13:42.066636] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:55.257 [2024-07-16 00:13:42.066667] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:55.257 pt2 00:18:55.257 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:55.257 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:55.257 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:18:55.257 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:18:55.257 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:55.257 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:55.257 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:55.257 00:13:42 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:55.257 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:18:55.515 malloc3 00:18:55.515 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:55.515 [2024-07-16 00:13:42.446804] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:55.515 [2024-07-16 00:13:42.446852] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:55.515 [2024-07-16 00:13:42.446870] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1066340 00:18:55.515 [2024-07-16 00:13:42.446883] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:55.515 [2024-07-16 00:13:42.448335] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:55.515 [2024-07-16 00:13:42.448367] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:55.515 pt3 00:18:55.774 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:55.774 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:55.774 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:18:55.774 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:18:55.774 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:18:55.774 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:55.774 
00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:55.774 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:55.774 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:18:55.774 malloc4 00:18:55.774 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:56.032 [2024-07-16 00:13:42.836559] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:56.032 [2024-07-16 00:13:42.836608] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:56.033 [2024-07-16 00:13:42.836627] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1068c60 00:18:56.033 [2024-07-16 00:13:42.836639] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:56.033 [2024-07-16 00:13:42.838041] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:56.033 [2024-07-16 00:13:42.838071] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:56.033 pt4 00:18:56.033 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:56.033 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:56.033 00:13:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:18:56.292 [2024-07-16 00:13:43.013044] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:18:56.292 [2024-07-16 00:13:43.014214] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:56.292 [2024-07-16 00:13:43.014267] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:56.292 [2024-07-16 00:13:43.014309] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:56.292 [2024-07-16 00:13:43.014469] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xec6530 00:18:56.292 [2024-07-16 00:13:43.014480] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:56.292 [2024-07-16 00:13:43.014659] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xec4770 00:18:56.292 [2024-07-16 00:13:43.014797] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xec6530 00:18:56.292 [2024-07-16 00:13:43.014808] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xec6530 00:18:56.292 [2024-07-16 00:13:43.014896] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:56.292 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:56.292 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:56.292 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:56.292 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:56.292 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:56.292 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:56.292 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:56.292 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:18:56.292 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:56.292 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:56.292 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.292 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:56.292 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:56.292 "name": "raid_bdev1", 00:18:56.292 "uuid": "1be5a811-4515-41e6-b363-a33f6080536e", 00:18:56.292 "strip_size_kb": 64, 00:18:56.292 "state": "online", 00:18:56.292 "raid_level": "raid0", 00:18:56.292 "superblock": true, 00:18:56.292 "num_base_bdevs": 4, 00:18:56.292 "num_base_bdevs_discovered": 4, 00:18:56.292 "num_base_bdevs_operational": 4, 00:18:56.292 "base_bdevs_list": [ 00:18:56.292 { 00:18:56.292 "name": "pt1", 00:18:56.292 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:56.292 "is_configured": true, 00:18:56.292 "data_offset": 2048, 00:18:56.292 "data_size": 63488 00:18:56.292 }, 00:18:56.292 { 00:18:56.292 "name": "pt2", 00:18:56.292 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:56.292 "is_configured": true, 00:18:56.292 "data_offset": 2048, 00:18:56.292 "data_size": 63488 00:18:56.292 }, 00:18:56.292 { 00:18:56.292 "name": "pt3", 00:18:56.292 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:56.292 "is_configured": true, 00:18:56.292 "data_offset": 2048, 00:18:56.292 "data_size": 63488 00:18:56.292 }, 00:18:56.292 { 00:18:56.292 "name": "pt4", 00:18:56.292 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:56.292 "is_configured": true, 00:18:56.292 "data_offset": 2048, 00:18:56.292 "data_size": 63488 00:18:56.292 } 00:18:56.292 ] 00:18:56.292 }' 00:18:56.292 00:13:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:56.292 00:13:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:57.229 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:18:57.229 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:57.229 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:57.229 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:57.229 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:57.229 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:57.229 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:57.229 00:13:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:57.229 [2024-07-16 00:13:44.064125] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:57.229 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:57.229 "name": "raid_bdev1", 00:18:57.229 "aliases": [ 00:18:57.229 "1be5a811-4515-41e6-b363-a33f6080536e" 00:18:57.229 ], 00:18:57.229 "product_name": "Raid Volume", 00:18:57.229 "block_size": 512, 00:18:57.229 "num_blocks": 253952, 00:18:57.229 "uuid": "1be5a811-4515-41e6-b363-a33f6080536e", 00:18:57.229 "assigned_rate_limits": { 00:18:57.229 "rw_ios_per_sec": 0, 00:18:57.229 "rw_mbytes_per_sec": 0, 00:18:57.229 "r_mbytes_per_sec": 0, 00:18:57.229 "w_mbytes_per_sec": 0 00:18:57.229 }, 00:18:57.229 "claimed": false, 00:18:57.229 "zoned": false, 00:18:57.229 "supported_io_types": { 00:18:57.229 "read": true, 00:18:57.229 "write": true, 00:18:57.229 
"unmap": true, 00:18:57.229 "flush": true, 00:18:57.229 "reset": true, 00:18:57.229 "nvme_admin": false, 00:18:57.229 "nvme_io": false, 00:18:57.229 "nvme_io_md": false, 00:18:57.229 "write_zeroes": true, 00:18:57.229 "zcopy": false, 00:18:57.229 "get_zone_info": false, 00:18:57.229 "zone_management": false, 00:18:57.229 "zone_append": false, 00:18:57.229 "compare": false, 00:18:57.229 "compare_and_write": false, 00:18:57.229 "abort": false, 00:18:57.229 "seek_hole": false, 00:18:57.229 "seek_data": false, 00:18:57.229 "copy": false, 00:18:57.229 "nvme_iov_md": false 00:18:57.229 }, 00:18:57.229 "memory_domains": [ 00:18:57.229 { 00:18:57.229 "dma_device_id": "system", 00:18:57.229 "dma_device_type": 1 00:18:57.229 }, 00:18:57.229 { 00:18:57.229 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.229 "dma_device_type": 2 00:18:57.229 }, 00:18:57.229 { 00:18:57.229 "dma_device_id": "system", 00:18:57.229 "dma_device_type": 1 00:18:57.229 }, 00:18:57.229 { 00:18:57.229 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.229 "dma_device_type": 2 00:18:57.229 }, 00:18:57.229 { 00:18:57.229 "dma_device_id": "system", 00:18:57.229 "dma_device_type": 1 00:18:57.229 }, 00:18:57.229 { 00:18:57.229 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.229 "dma_device_type": 2 00:18:57.229 }, 00:18:57.229 { 00:18:57.229 "dma_device_id": "system", 00:18:57.229 "dma_device_type": 1 00:18:57.229 }, 00:18:57.229 { 00:18:57.229 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.229 "dma_device_type": 2 00:18:57.229 } 00:18:57.229 ], 00:18:57.229 "driver_specific": { 00:18:57.229 "raid": { 00:18:57.229 "uuid": "1be5a811-4515-41e6-b363-a33f6080536e", 00:18:57.229 "strip_size_kb": 64, 00:18:57.229 "state": "online", 00:18:57.229 "raid_level": "raid0", 00:18:57.229 "superblock": true, 00:18:57.229 "num_base_bdevs": 4, 00:18:57.229 "num_base_bdevs_discovered": 4, 00:18:57.229 "num_base_bdevs_operational": 4, 00:18:57.229 "base_bdevs_list": [ 00:18:57.229 { 00:18:57.229 "name": "pt1", 
00:18:57.229 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:57.229 "is_configured": true, 00:18:57.229 "data_offset": 2048, 00:18:57.229 "data_size": 63488 00:18:57.229 }, 00:18:57.229 { 00:18:57.229 "name": "pt2", 00:18:57.229 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:57.229 "is_configured": true, 00:18:57.229 "data_offset": 2048, 00:18:57.229 "data_size": 63488 00:18:57.229 }, 00:18:57.229 { 00:18:57.229 "name": "pt3", 00:18:57.229 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:57.229 "is_configured": true, 00:18:57.229 "data_offset": 2048, 00:18:57.229 "data_size": 63488 00:18:57.229 }, 00:18:57.229 { 00:18:57.229 "name": "pt4", 00:18:57.229 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:57.229 "is_configured": true, 00:18:57.229 "data_offset": 2048, 00:18:57.229 "data_size": 63488 00:18:57.229 } 00:18:57.229 ] 00:18:57.229 } 00:18:57.229 } 00:18:57.229 }' 00:18:57.229 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:57.229 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:57.229 pt2 00:18:57.229 pt3 00:18:57.229 pt4' 00:18:57.229 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:57.229 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:57.229 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:57.487 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:57.487 "name": "pt1", 00:18:57.487 "aliases": [ 00:18:57.487 "00000000-0000-0000-0000-000000000001" 00:18:57.487 ], 00:18:57.487 "product_name": "passthru", 00:18:57.487 "block_size": 512, 00:18:57.487 "num_blocks": 65536, 00:18:57.487 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:18:57.487 "assigned_rate_limits": { 00:18:57.487 "rw_ios_per_sec": 0, 00:18:57.487 "rw_mbytes_per_sec": 0, 00:18:57.487 "r_mbytes_per_sec": 0, 00:18:57.487 "w_mbytes_per_sec": 0 00:18:57.487 }, 00:18:57.487 "claimed": true, 00:18:57.487 "claim_type": "exclusive_write", 00:18:57.487 "zoned": false, 00:18:57.487 "supported_io_types": { 00:18:57.487 "read": true, 00:18:57.487 "write": true, 00:18:57.487 "unmap": true, 00:18:57.487 "flush": true, 00:18:57.487 "reset": true, 00:18:57.487 "nvme_admin": false, 00:18:57.487 "nvme_io": false, 00:18:57.487 "nvme_io_md": false, 00:18:57.487 "write_zeroes": true, 00:18:57.487 "zcopy": true, 00:18:57.487 "get_zone_info": false, 00:18:57.487 "zone_management": false, 00:18:57.487 "zone_append": false, 00:18:57.487 "compare": false, 00:18:57.487 "compare_and_write": false, 00:18:57.487 "abort": true, 00:18:57.487 "seek_hole": false, 00:18:57.487 "seek_data": false, 00:18:57.487 "copy": true, 00:18:57.487 "nvme_iov_md": false 00:18:57.487 }, 00:18:57.487 "memory_domains": [ 00:18:57.487 { 00:18:57.487 "dma_device_id": "system", 00:18:57.487 "dma_device_type": 1 00:18:57.487 }, 00:18:57.487 { 00:18:57.487 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.487 "dma_device_type": 2 00:18:57.487 } 00:18:57.487 ], 00:18:57.487 "driver_specific": { 00:18:57.487 "passthru": { 00:18:57.488 "name": "pt1", 00:18:57.488 "base_bdev_name": "malloc1" 00:18:57.488 } 00:18:57.488 } 00:18:57.488 }' 00:18:57.488 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:57.488 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:57.745 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:57.745 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:57.745 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:57.745 00:13:44 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:57.745 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:57.745 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:57.745 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:57.745 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:58.003 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:58.003 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:58.003 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:58.003 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:58.003 00:13:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:58.262 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:58.262 "name": "pt2", 00:18:58.262 "aliases": [ 00:18:58.262 "00000000-0000-0000-0000-000000000002" 00:18:58.262 ], 00:18:58.262 "product_name": "passthru", 00:18:58.262 "block_size": 512, 00:18:58.262 "num_blocks": 65536, 00:18:58.262 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:58.262 "assigned_rate_limits": { 00:18:58.262 "rw_ios_per_sec": 0, 00:18:58.262 "rw_mbytes_per_sec": 0, 00:18:58.262 "r_mbytes_per_sec": 0, 00:18:58.262 "w_mbytes_per_sec": 0 00:18:58.262 }, 00:18:58.262 "claimed": true, 00:18:58.262 "claim_type": "exclusive_write", 00:18:58.262 "zoned": false, 00:18:58.262 "supported_io_types": { 00:18:58.262 "read": true, 00:18:58.262 "write": true, 00:18:58.262 "unmap": true, 00:18:58.262 "flush": true, 00:18:58.262 "reset": true, 00:18:58.262 "nvme_admin": false, 00:18:58.262 
"nvme_io": false, 00:18:58.262 "nvme_io_md": false, 00:18:58.262 "write_zeroes": true, 00:18:58.262 "zcopy": true, 00:18:58.262 "get_zone_info": false, 00:18:58.262 "zone_management": false, 00:18:58.262 "zone_append": false, 00:18:58.262 "compare": false, 00:18:58.262 "compare_and_write": false, 00:18:58.262 "abort": true, 00:18:58.262 "seek_hole": false, 00:18:58.262 "seek_data": false, 00:18:58.262 "copy": true, 00:18:58.262 "nvme_iov_md": false 00:18:58.262 }, 00:18:58.262 "memory_domains": [ 00:18:58.262 { 00:18:58.262 "dma_device_id": "system", 00:18:58.262 "dma_device_type": 1 00:18:58.262 }, 00:18:58.262 { 00:18:58.262 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:58.262 "dma_device_type": 2 00:18:58.262 } 00:18:58.262 ], 00:18:58.262 "driver_specific": { 00:18:58.262 "passthru": { 00:18:58.262 "name": "pt2", 00:18:58.262 "base_bdev_name": "malloc2" 00:18:58.262 } 00:18:58.262 } 00:18:58.262 }' 00:18:58.262 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:58.262 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:58.262 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:58.262 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:58.262 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:58.262 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:58.262 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:58.521 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:58.521 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:58.521 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:58.521 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:18:58.521 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:58.521 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:58.521 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:58.521 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:58.780 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:58.780 "name": "pt3", 00:18:58.780 "aliases": [ 00:18:58.780 "00000000-0000-0000-0000-000000000003" 00:18:58.780 ], 00:18:58.780 "product_name": "passthru", 00:18:58.780 "block_size": 512, 00:18:58.780 "num_blocks": 65536, 00:18:58.780 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:58.780 "assigned_rate_limits": { 00:18:58.780 "rw_ios_per_sec": 0, 00:18:58.780 "rw_mbytes_per_sec": 0, 00:18:58.780 "r_mbytes_per_sec": 0, 00:18:58.780 "w_mbytes_per_sec": 0 00:18:58.780 }, 00:18:58.780 "claimed": true, 00:18:58.780 "claim_type": "exclusive_write", 00:18:58.780 "zoned": false, 00:18:58.780 "supported_io_types": { 00:18:58.780 "read": true, 00:18:58.780 "write": true, 00:18:58.780 "unmap": true, 00:18:58.780 "flush": true, 00:18:58.780 "reset": true, 00:18:58.780 "nvme_admin": false, 00:18:58.780 "nvme_io": false, 00:18:58.780 "nvme_io_md": false, 00:18:58.780 "write_zeroes": true, 00:18:58.780 "zcopy": true, 00:18:58.780 "get_zone_info": false, 00:18:58.780 "zone_management": false, 00:18:58.780 "zone_append": false, 00:18:58.780 "compare": false, 00:18:58.780 "compare_and_write": false, 00:18:58.780 "abort": true, 00:18:58.780 "seek_hole": false, 00:18:58.780 "seek_data": false, 00:18:58.780 "copy": true, 00:18:58.780 "nvme_iov_md": false 00:18:58.780 }, 00:18:58.780 "memory_domains": [ 00:18:58.780 { 00:18:58.780 "dma_device_id": "system", 00:18:58.780 
"dma_device_type": 1 00:18:58.780 }, 00:18:58.780 { 00:18:58.780 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:58.780 "dma_device_type": 2 00:18:58.780 } 00:18:58.780 ], 00:18:58.780 "driver_specific": { 00:18:58.780 "passthru": { 00:18:58.780 "name": "pt3", 00:18:58.780 "base_bdev_name": "malloc3" 00:18:58.780 } 00:18:58.780 } 00:18:58.780 }' 00:18:58.780 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:58.780 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:58.780 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:58.780 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:58.780 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:59.039 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:59.039 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:59.039 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:59.039 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:59.039 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:59.039 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:59.039 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:59.039 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:59.039 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:59.039 00:13:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:59.299 00:13:46 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:59.299 "name": "pt4", 00:18:59.299 "aliases": [ 00:18:59.299 "00000000-0000-0000-0000-000000000004" 00:18:59.299 ], 00:18:59.299 "product_name": "passthru", 00:18:59.299 "block_size": 512, 00:18:59.299 "num_blocks": 65536, 00:18:59.299 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:59.299 "assigned_rate_limits": { 00:18:59.299 "rw_ios_per_sec": 0, 00:18:59.299 "rw_mbytes_per_sec": 0, 00:18:59.299 "r_mbytes_per_sec": 0, 00:18:59.299 "w_mbytes_per_sec": 0 00:18:59.299 }, 00:18:59.299 "claimed": true, 00:18:59.299 "claim_type": "exclusive_write", 00:18:59.299 "zoned": false, 00:18:59.299 "supported_io_types": { 00:18:59.299 "read": true, 00:18:59.299 "write": true, 00:18:59.299 "unmap": true, 00:18:59.299 "flush": true, 00:18:59.299 "reset": true, 00:18:59.299 "nvme_admin": false, 00:18:59.299 "nvme_io": false, 00:18:59.299 "nvme_io_md": false, 00:18:59.299 "write_zeroes": true, 00:18:59.299 "zcopy": true, 00:18:59.299 "get_zone_info": false, 00:18:59.299 "zone_management": false, 00:18:59.299 "zone_append": false, 00:18:59.299 "compare": false, 00:18:59.299 "compare_and_write": false, 00:18:59.299 "abort": true, 00:18:59.299 "seek_hole": false, 00:18:59.299 "seek_data": false, 00:18:59.299 "copy": true, 00:18:59.299 "nvme_iov_md": false 00:18:59.299 }, 00:18:59.299 "memory_domains": [ 00:18:59.299 { 00:18:59.299 "dma_device_id": "system", 00:18:59.299 "dma_device_type": 1 00:18:59.299 }, 00:18:59.299 { 00:18:59.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:59.299 "dma_device_type": 2 00:18:59.299 } 00:18:59.299 ], 00:18:59.299 "driver_specific": { 00:18:59.299 "passthru": { 00:18:59.299 "name": "pt4", 00:18:59.299 "base_bdev_name": "malloc4" 00:18:59.299 } 00:18:59.299 } 00:18:59.299 }' 00:18:59.299 00:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:59.299 00:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:59.299 00:13:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:59.299 00:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:59.558 00:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:59.558 00:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:59.558 00:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:59.558 00:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:59.558 00:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:59.558 00:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:59.558 00:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:59.558 00:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:59.558 00:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:59.558 00:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:18:59.818 [2024-07-16 00:13:46.630906] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:59.818 00:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=1be5a811-4515-41e6-b363-a33f6080536e 00:18:59.818 00:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 1be5a811-4515-41e6-b363-a33f6080536e ']' 00:18:59.818 00:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:00.078 [2024-07-16 00:13:46.883274] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:00.078 
[2024-07-16 00:13:46.883307] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:00.078 [2024-07-16 00:13:46.883359] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:00.078 [2024-07-16 00:13:46.883421] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:00.078 [2024-07-16 00:13:46.883432] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xec6530 name raid_bdev1, state offline 00:19:00.078 00:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.078 00:13:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:19:00.338 00:13:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:19:00.338 00:13:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:19:00.338 00:13:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:00.338 00:13:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:00.597 00:13:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:00.597 00:13:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:00.857 00:13:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:00.857 00:13:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:01.117 00:13:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:01.117 00:13:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:01.376 00:13:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:19:01.376 00:13:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:19:01.638 00:13:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:19:01.638 00:13:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:01.638 00:13:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:19:01.638 00:13:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:01.638 00:13:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:01.638 00:13:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:01.639 00:13:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:01.639 00:13:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:01.639 00:13:48 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:01.639 00:13:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:01.639 00:13:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:01.639 00:13:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:01.639 00:13:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:01.639 [2024-07-16 00:13:48.583730] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:19:01.639 [2024-07-16 00:13:48.585131] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:19:01.639 [2024-07-16 00:13:48.585177] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:19:01.639 [2024-07-16 00:13:48.585212] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:19:01.639 [2024-07-16 00:13:48.585257] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:19:01.639 [2024-07-16 00:13:48.585299] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:19:01.639 [2024-07-16 00:13:48.585321] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:19:01.639 [2024-07-16 00:13:48.585344] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:19:01.639 
[2024-07-16 00:13:48.585362] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:01.639 [2024-07-16 00:13:48.585372] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1071ff0 name raid_bdev1, state configuring 00:19:01.904 request: 00:19:01.904 { 00:19:01.904 "name": "raid_bdev1", 00:19:01.904 "raid_level": "raid0", 00:19:01.904 "base_bdevs": [ 00:19:01.904 "malloc1", 00:19:01.904 "malloc2", 00:19:01.904 "malloc3", 00:19:01.904 "malloc4" 00:19:01.904 ], 00:19:01.904 "strip_size_kb": 64, 00:19:01.904 "superblock": false, 00:19:01.904 "method": "bdev_raid_create", 00:19:01.904 "req_id": 1 00:19:01.904 } 00:19:01.904 Got JSON-RPC error response 00:19:01.904 response: 00:19:01.904 { 00:19:01.904 "code": -17, 00:19:01.904 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:19:01.904 } 00:19:01.904 00:13:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:19:01.904 00:13:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:19:01.904 00:13:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:19:01.905 00:13:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:19:01.905 00:13:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.905 00:13:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:19:02.163 00:13:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:19:02.163 00:13:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:19:02.163 00:13:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:19:02.163 [2024-07-16 00:13:49.084965] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:02.163 [2024-07-16 00:13:49.085009] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:02.163 [2024-07-16 00:13:49.085028] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xece7a0 00:19:02.163 [2024-07-16 00:13:49.085041] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:02.163 [2024-07-16 00:13:49.086694] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:02.163 [2024-07-16 00:13:49.086729] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:02.163 [2024-07-16 00:13:49.086799] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:19:02.163 [2024-07-16 00:13:49.086826] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:02.163 pt1 00:19:02.163 00:13:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:19:02.163 00:13:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:02.163 00:13:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:02.163 00:13:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:02.163 00:13:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:02.163 00:13:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:02.163 00:13:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:02.163 00:13:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:02.163 00:13:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:19:02.163 00:13:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:02.163 00:13:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.163 00:13:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:02.421 00:13:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:02.421 "name": "raid_bdev1", 00:19:02.421 "uuid": "1be5a811-4515-41e6-b363-a33f6080536e", 00:19:02.421 "strip_size_kb": 64, 00:19:02.421 "state": "configuring", 00:19:02.421 "raid_level": "raid0", 00:19:02.421 "superblock": true, 00:19:02.421 "num_base_bdevs": 4, 00:19:02.421 "num_base_bdevs_discovered": 1, 00:19:02.421 "num_base_bdevs_operational": 4, 00:19:02.421 "base_bdevs_list": [ 00:19:02.421 { 00:19:02.421 "name": "pt1", 00:19:02.421 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:02.421 "is_configured": true, 00:19:02.421 "data_offset": 2048, 00:19:02.422 "data_size": 63488 00:19:02.422 }, 00:19:02.422 { 00:19:02.422 "name": null, 00:19:02.422 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:02.422 "is_configured": false, 00:19:02.422 "data_offset": 2048, 00:19:02.422 "data_size": 63488 00:19:02.422 }, 00:19:02.422 { 00:19:02.422 "name": null, 00:19:02.422 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:02.422 "is_configured": false, 00:19:02.422 "data_offset": 2048, 00:19:02.422 "data_size": 63488 00:19:02.422 }, 00:19:02.422 { 00:19:02.422 "name": null, 00:19:02.422 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:02.422 "is_configured": false, 00:19:02.422 "data_offset": 2048, 00:19:02.422 "data_size": 63488 00:19:02.422 } 00:19:02.422 ] 00:19:02.422 }' 00:19:02.422 00:13:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:02.422 00:13:49 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:02.988 00:13:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:19:02.988 00:13:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:03.246 [2024-07-16 00:13:50.155800] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:03.246 [2024-07-16 00:13:50.155855] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:03.246 [2024-07-16 00:13:50.155874] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1067940 00:19:03.246 [2024-07-16 00:13:50.155887] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:03.246 [2024-07-16 00:13:50.156253] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:03.246 [2024-07-16 00:13:50.156276] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:03.246 [2024-07-16 00:13:50.156345] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:03.246 [2024-07-16 00:13:50.156366] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:03.246 pt2 00:19:03.246 00:13:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:03.505 [2024-07-16 00:13:50.400479] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:19:03.505 00:13:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:19:03.505 00:13:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:03.505 00:13:50 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:03.505 00:13:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:03.505 00:13:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:03.505 00:13:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:03.505 00:13:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:03.505 00:13:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:03.505 00:13:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:03.505 00:13:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:03.505 00:13:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:03.505 00:13:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:03.764 00:13:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:03.764 "name": "raid_bdev1", 00:19:03.764 "uuid": "1be5a811-4515-41e6-b363-a33f6080536e", 00:19:03.764 "strip_size_kb": 64, 00:19:03.764 "state": "configuring", 00:19:03.764 "raid_level": "raid0", 00:19:03.764 "superblock": true, 00:19:03.764 "num_base_bdevs": 4, 00:19:03.764 "num_base_bdevs_discovered": 1, 00:19:03.764 "num_base_bdevs_operational": 4, 00:19:03.764 "base_bdevs_list": [ 00:19:03.764 { 00:19:03.764 "name": "pt1", 00:19:03.764 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:03.764 "is_configured": true, 00:19:03.764 "data_offset": 2048, 00:19:03.764 "data_size": 63488 00:19:03.764 }, 00:19:03.764 { 00:19:03.764 "name": null, 00:19:03.764 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:03.764 
"is_configured": false, 00:19:03.764 "data_offset": 2048, 00:19:03.764 "data_size": 63488 00:19:03.764 }, 00:19:03.764 { 00:19:03.764 "name": null, 00:19:03.764 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:03.764 "is_configured": false, 00:19:03.764 "data_offset": 2048, 00:19:03.764 "data_size": 63488 00:19:03.764 }, 00:19:03.764 { 00:19:03.764 "name": null, 00:19:03.764 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:03.764 "is_configured": false, 00:19:03.764 "data_offset": 2048, 00:19:03.764 "data_size": 63488 00:19:03.764 } 00:19:03.764 ] 00:19:03.764 }' 00:19:03.764 00:13:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:03.764 00:13:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:04.699 00:13:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:19:04.699 00:13:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:04.699 00:13:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:04.699 [2024-07-16 00:13:51.523427] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:04.699 [2024-07-16 00:13:51.523475] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:04.699 [2024-07-16 00:13:51.523492] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xec5060 00:19:04.699 [2024-07-16 00:13:51.523505] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:04.699 [2024-07-16 00:13:51.523844] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:04.699 [2024-07-16 00:13:51.523863] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:04.699 [2024-07-16 00:13:51.523937] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:04.699 [2024-07-16 00:13:51.523958] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:04.699 pt2 00:19:04.699 00:13:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:04.699 00:13:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:04.699 00:13:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:04.957 [2024-07-16 00:13:51.764063] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:04.957 [2024-07-16 00:13:51.764093] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:04.957 [2024-07-16 00:13:51.764117] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xec78d0 00:19:04.957 [2024-07-16 00:13:51.764129] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:04.957 [2024-07-16 00:13:51.764404] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:04.957 [2024-07-16 00:13:51.764421] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:04.957 [2024-07-16 00:13:51.764471] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:19:04.957 [2024-07-16 00:13:51.764488] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:04.957 pt3 00:19:04.957 00:13:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:04.957 00:13:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:04.957 00:13:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:05.216 [2024-07-16 00:13:52.012720] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:05.216 [2024-07-16 00:13:52.012754] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:05.216 [2024-07-16 00:13:52.012769] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xec8b80 00:19:05.216 [2024-07-16 00:13:52.012781] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:05.216 [2024-07-16 00:13:52.013059] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:05.216 [2024-07-16 00:13:52.013077] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:05.216 [2024-07-16 00:13:52.013126] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:19:05.216 [2024-07-16 00:13:52.013143] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:05.216 [2024-07-16 00:13:52.013253] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xec5780 00:19:05.216 [2024-07-16 00:13:52.013264] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:05.216 [2024-07-16 00:13:52.013429] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xecad70 00:19:05.216 [2024-07-16 00:13:52.013553] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xec5780 00:19:05.216 [2024-07-16 00:13:52.013563] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xec5780 00:19:05.216 [2024-07-16 00:13:52.013657] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:05.216 pt4 00:19:05.216 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:05.216 00:13:52 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:05.216 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:05.216 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:05.216 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:05.216 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:05.216 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:05.216 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:05.216 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:05.216 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:05.216 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:05.216 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:05.216 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.216 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:05.474 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:05.474 "name": "raid_bdev1", 00:19:05.474 "uuid": "1be5a811-4515-41e6-b363-a33f6080536e", 00:19:05.474 "strip_size_kb": 64, 00:19:05.474 "state": "online", 00:19:05.474 "raid_level": "raid0", 00:19:05.474 "superblock": true, 00:19:05.474 "num_base_bdevs": 4, 00:19:05.474 "num_base_bdevs_discovered": 4, 00:19:05.474 "num_base_bdevs_operational": 4, 00:19:05.474 "base_bdevs_list": [ 00:19:05.474 { 00:19:05.474 
"name": "pt1", 00:19:05.474 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:05.474 "is_configured": true, 00:19:05.474 "data_offset": 2048, 00:19:05.474 "data_size": 63488 00:19:05.474 }, 00:19:05.474 { 00:19:05.474 "name": "pt2", 00:19:05.474 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:05.474 "is_configured": true, 00:19:05.474 "data_offset": 2048, 00:19:05.474 "data_size": 63488 00:19:05.474 }, 00:19:05.474 { 00:19:05.474 "name": "pt3", 00:19:05.474 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:05.474 "is_configured": true, 00:19:05.474 "data_offset": 2048, 00:19:05.474 "data_size": 63488 00:19:05.474 }, 00:19:05.474 { 00:19:05.474 "name": "pt4", 00:19:05.474 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:05.474 "is_configured": true, 00:19:05.474 "data_offset": 2048, 00:19:05.474 "data_size": 63488 00:19:05.474 } 00:19:05.474 ] 00:19:05.474 }' 00:19:05.474 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:05.474 00:13:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:06.040 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:19:06.040 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:06.040 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:06.040 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:06.040 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:06.040 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:06.040 00:13:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:06.040 00:13:52 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:06.298 [2024-07-16 00:13:53.212206] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:06.298 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:06.298 "name": "raid_bdev1", 00:19:06.298 "aliases": [ 00:19:06.298 "1be5a811-4515-41e6-b363-a33f6080536e" 00:19:06.298 ], 00:19:06.298 "product_name": "Raid Volume", 00:19:06.298 "block_size": 512, 00:19:06.298 "num_blocks": 253952, 00:19:06.298 "uuid": "1be5a811-4515-41e6-b363-a33f6080536e", 00:19:06.298 "assigned_rate_limits": { 00:19:06.298 "rw_ios_per_sec": 0, 00:19:06.298 "rw_mbytes_per_sec": 0, 00:19:06.298 "r_mbytes_per_sec": 0, 00:19:06.298 "w_mbytes_per_sec": 0 00:19:06.298 }, 00:19:06.298 "claimed": false, 00:19:06.298 "zoned": false, 00:19:06.298 "supported_io_types": { 00:19:06.298 "read": true, 00:19:06.298 "write": true, 00:19:06.298 "unmap": true, 00:19:06.298 "flush": true, 00:19:06.298 "reset": true, 00:19:06.298 "nvme_admin": false, 00:19:06.298 "nvme_io": false, 00:19:06.298 "nvme_io_md": false, 00:19:06.298 "write_zeroes": true, 00:19:06.298 "zcopy": false, 00:19:06.298 "get_zone_info": false, 00:19:06.298 "zone_management": false, 00:19:06.298 "zone_append": false, 00:19:06.298 "compare": false, 00:19:06.298 "compare_and_write": false, 00:19:06.298 "abort": false, 00:19:06.298 "seek_hole": false, 00:19:06.298 "seek_data": false, 00:19:06.298 "copy": false, 00:19:06.298 "nvme_iov_md": false 00:19:06.298 }, 00:19:06.298 "memory_domains": [ 00:19:06.298 { 00:19:06.298 "dma_device_id": "system", 00:19:06.298 "dma_device_type": 1 00:19:06.298 }, 00:19:06.298 { 00:19:06.298 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:06.298 "dma_device_type": 2 00:19:06.298 }, 00:19:06.298 { 00:19:06.298 "dma_device_id": "system", 00:19:06.298 "dma_device_type": 1 00:19:06.298 }, 00:19:06.298 { 00:19:06.298 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:06.298 "dma_device_type": 2 00:19:06.298 }, 
00:19:06.298 { 00:19:06.298 "dma_device_id": "system", 00:19:06.298 "dma_device_type": 1 00:19:06.298 }, 00:19:06.298 { 00:19:06.298 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:06.298 "dma_device_type": 2 00:19:06.298 }, 00:19:06.298 { 00:19:06.298 "dma_device_id": "system", 00:19:06.299 "dma_device_type": 1 00:19:06.299 }, 00:19:06.299 { 00:19:06.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:06.299 "dma_device_type": 2 00:19:06.299 } 00:19:06.299 ], 00:19:06.299 "driver_specific": { 00:19:06.299 "raid": { 00:19:06.299 "uuid": "1be5a811-4515-41e6-b363-a33f6080536e", 00:19:06.299 "strip_size_kb": 64, 00:19:06.299 "state": "online", 00:19:06.299 "raid_level": "raid0", 00:19:06.299 "superblock": true, 00:19:06.299 "num_base_bdevs": 4, 00:19:06.299 "num_base_bdevs_discovered": 4, 00:19:06.299 "num_base_bdevs_operational": 4, 00:19:06.299 "base_bdevs_list": [ 00:19:06.299 { 00:19:06.299 "name": "pt1", 00:19:06.299 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:06.299 "is_configured": true, 00:19:06.299 "data_offset": 2048, 00:19:06.299 "data_size": 63488 00:19:06.299 }, 00:19:06.299 { 00:19:06.299 "name": "pt2", 00:19:06.299 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:06.299 "is_configured": true, 00:19:06.299 "data_offset": 2048, 00:19:06.299 "data_size": 63488 00:19:06.299 }, 00:19:06.299 { 00:19:06.299 "name": "pt3", 00:19:06.299 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:06.299 "is_configured": true, 00:19:06.299 "data_offset": 2048, 00:19:06.299 "data_size": 63488 00:19:06.299 }, 00:19:06.299 { 00:19:06.299 "name": "pt4", 00:19:06.299 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:06.299 "is_configured": true, 00:19:06.299 "data_offset": 2048, 00:19:06.299 "data_size": 63488 00:19:06.299 } 00:19:06.299 ] 00:19:06.299 } 00:19:06.299 } 00:19:06.299 }' 00:19:06.299 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 
00:19:06.556 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:06.556 pt2 00:19:06.556 pt3 00:19:06.556 pt4' 00:19:06.556 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:06.556 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:06.556 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:06.814 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:06.814 "name": "pt1", 00:19:06.814 "aliases": [ 00:19:06.814 "00000000-0000-0000-0000-000000000001" 00:19:06.814 ], 00:19:06.814 "product_name": "passthru", 00:19:06.814 "block_size": 512, 00:19:06.814 "num_blocks": 65536, 00:19:06.814 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:06.814 "assigned_rate_limits": { 00:19:06.814 "rw_ios_per_sec": 0, 00:19:06.814 "rw_mbytes_per_sec": 0, 00:19:06.814 "r_mbytes_per_sec": 0, 00:19:06.814 "w_mbytes_per_sec": 0 00:19:06.814 }, 00:19:06.814 "claimed": true, 00:19:06.814 "claim_type": "exclusive_write", 00:19:06.814 "zoned": false, 00:19:06.814 "supported_io_types": { 00:19:06.814 "read": true, 00:19:06.814 "write": true, 00:19:06.814 "unmap": true, 00:19:06.814 "flush": true, 00:19:06.814 "reset": true, 00:19:06.814 "nvme_admin": false, 00:19:06.814 "nvme_io": false, 00:19:06.814 "nvme_io_md": false, 00:19:06.814 "write_zeroes": true, 00:19:06.814 "zcopy": true, 00:19:06.814 "get_zone_info": false, 00:19:06.814 "zone_management": false, 00:19:06.814 "zone_append": false, 00:19:06.814 "compare": false, 00:19:06.814 "compare_and_write": false, 00:19:06.814 "abort": true, 00:19:06.814 "seek_hole": false, 00:19:06.814 "seek_data": false, 00:19:06.814 "copy": true, 00:19:06.814 "nvme_iov_md": false 00:19:06.814 }, 00:19:06.814 "memory_domains": [ 00:19:06.814 { 
00:19:06.814 "dma_device_id": "system", 00:19:06.814 "dma_device_type": 1 00:19:06.814 }, 00:19:06.814 { 00:19:06.814 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:06.814 "dma_device_type": 2 00:19:06.814 } 00:19:06.814 ], 00:19:06.814 "driver_specific": { 00:19:06.814 "passthru": { 00:19:06.814 "name": "pt1", 00:19:06.814 "base_bdev_name": "malloc1" 00:19:06.814 } 00:19:06.814 } 00:19:06.814 }' 00:19:06.814 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:06.814 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:06.814 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:06.814 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:06.814 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:06.814 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:06.814 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:06.814 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:07.071 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:07.071 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:07.071 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:07.071 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:07.071 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:07.071 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:07.071 00:13:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:07.329 
00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:07.329 "name": "pt2", 00:19:07.329 "aliases": [ 00:19:07.329 "00000000-0000-0000-0000-000000000002" 00:19:07.329 ], 00:19:07.329 "product_name": "passthru", 00:19:07.329 "block_size": 512, 00:19:07.329 "num_blocks": 65536, 00:19:07.329 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:07.329 "assigned_rate_limits": { 00:19:07.329 "rw_ios_per_sec": 0, 00:19:07.329 "rw_mbytes_per_sec": 0, 00:19:07.329 "r_mbytes_per_sec": 0, 00:19:07.329 "w_mbytes_per_sec": 0 00:19:07.329 }, 00:19:07.329 "claimed": true, 00:19:07.329 "claim_type": "exclusive_write", 00:19:07.329 "zoned": false, 00:19:07.329 "supported_io_types": { 00:19:07.329 "read": true, 00:19:07.329 "write": true, 00:19:07.329 "unmap": true, 00:19:07.329 "flush": true, 00:19:07.329 "reset": true, 00:19:07.329 "nvme_admin": false, 00:19:07.329 "nvme_io": false, 00:19:07.329 "nvme_io_md": false, 00:19:07.329 "write_zeroes": true, 00:19:07.329 "zcopy": true, 00:19:07.329 "get_zone_info": false, 00:19:07.329 "zone_management": false, 00:19:07.329 "zone_append": false, 00:19:07.329 "compare": false, 00:19:07.329 "compare_and_write": false, 00:19:07.329 "abort": true, 00:19:07.329 "seek_hole": false, 00:19:07.329 "seek_data": false, 00:19:07.329 "copy": true, 00:19:07.329 "nvme_iov_md": false 00:19:07.329 }, 00:19:07.329 "memory_domains": [ 00:19:07.329 { 00:19:07.329 "dma_device_id": "system", 00:19:07.329 "dma_device_type": 1 00:19:07.329 }, 00:19:07.329 { 00:19:07.329 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:07.329 "dma_device_type": 2 00:19:07.329 } 00:19:07.329 ], 00:19:07.329 "driver_specific": { 00:19:07.329 "passthru": { 00:19:07.329 "name": "pt2", 00:19:07.329 "base_bdev_name": "malloc2" 00:19:07.329 } 00:19:07.329 } 00:19:07.329 }' 00:19:07.329 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:07.329 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- 
# jq .block_size 00:19:07.329 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:07.329 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:07.588 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:07.588 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:07.588 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:07.588 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:07.588 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:07.588 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:07.588 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:07.588 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:07.588 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:07.588 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:07.588 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:07.847 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:07.847 "name": "pt3", 00:19:07.847 "aliases": [ 00:19:07.847 "00000000-0000-0000-0000-000000000003" 00:19:07.847 ], 00:19:07.847 "product_name": "passthru", 00:19:07.847 "block_size": 512, 00:19:07.847 "num_blocks": 65536, 00:19:07.847 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:07.847 "assigned_rate_limits": { 00:19:07.847 "rw_ios_per_sec": 0, 00:19:07.847 "rw_mbytes_per_sec": 0, 00:19:07.847 "r_mbytes_per_sec": 0, 00:19:07.847 "w_mbytes_per_sec": 0 00:19:07.847 }, 
00:19:07.847 "claimed": true, 00:19:07.847 "claim_type": "exclusive_write", 00:19:07.847 "zoned": false, 00:19:07.847 "supported_io_types": { 00:19:07.847 "read": true, 00:19:07.847 "write": true, 00:19:07.847 "unmap": true, 00:19:07.847 "flush": true, 00:19:07.847 "reset": true, 00:19:07.847 "nvme_admin": false, 00:19:07.847 "nvme_io": false, 00:19:07.847 "nvme_io_md": false, 00:19:07.847 "write_zeroes": true, 00:19:07.847 "zcopy": true, 00:19:07.847 "get_zone_info": false, 00:19:07.847 "zone_management": false, 00:19:07.847 "zone_append": false, 00:19:07.847 "compare": false, 00:19:07.847 "compare_and_write": false, 00:19:07.847 "abort": true, 00:19:07.847 "seek_hole": false, 00:19:07.847 "seek_data": false, 00:19:07.847 "copy": true, 00:19:07.847 "nvme_iov_md": false 00:19:07.847 }, 00:19:07.847 "memory_domains": [ 00:19:07.847 { 00:19:07.847 "dma_device_id": "system", 00:19:07.847 "dma_device_type": 1 00:19:07.847 }, 00:19:07.847 { 00:19:07.847 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:07.847 "dma_device_type": 2 00:19:07.847 } 00:19:07.847 ], 00:19:07.847 "driver_specific": { 00:19:07.847 "passthru": { 00:19:07.847 "name": "pt3", 00:19:07.847 "base_bdev_name": "malloc3" 00:19:07.847 } 00:19:07.847 } 00:19:07.847 }' 00:19:07.847 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:07.847 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:08.106 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:08.106 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:08.106 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:08.106 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:08.106 00:13:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:08.106 00:13:54 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:08.106 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:08.106 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:08.366 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:08.366 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:08.366 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:08.366 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:08.366 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:08.624 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:08.624 "name": "pt4", 00:19:08.624 "aliases": [ 00:19:08.624 "00000000-0000-0000-0000-000000000004" 00:19:08.624 ], 00:19:08.624 "product_name": "passthru", 00:19:08.624 "block_size": 512, 00:19:08.624 "num_blocks": 65536, 00:19:08.624 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:08.624 "assigned_rate_limits": { 00:19:08.624 "rw_ios_per_sec": 0, 00:19:08.624 "rw_mbytes_per_sec": 0, 00:19:08.624 "r_mbytes_per_sec": 0, 00:19:08.624 "w_mbytes_per_sec": 0 00:19:08.624 }, 00:19:08.624 "claimed": true, 00:19:08.624 "claim_type": "exclusive_write", 00:19:08.624 "zoned": false, 00:19:08.624 "supported_io_types": { 00:19:08.624 "read": true, 00:19:08.624 "write": true, 00:19:08.624 "unmap": true, 00:19:08.624 "flush": true, 00:19:08.624 "reset": true, 00:19:08.624 "nvme_admin": false, 00:19:08.624 "nvme_io": false, 00:19:08.624 "nvme_io_md": false, 00:19:08.624 "write_zeroes": true, 00:19:08.624 "zcopy": true, 00:19:08.624 "get_zone_info": false, 00:19:08.624 "zone_management": false, 00:19:08.624 "zone_append": false, 00:19:08.624 
"compare": false, 00:19:08.624 "compare_and_write": false, 00:19:08.624 "abort": true, 00:19:08.624 "seek_hole": false, 00:19:08.624 "seek_data": false, 00:19:08.624 "copy": true, 00:19:08.624 "nvme_iov_md": false 00:19:08.624 }, 00:19:08.624 "memory_domains": [ 00:19:08.624 { 00:19:08.624 "dma_device_id": "system", 00:19:08.624 "dma_device_type": 1 00:19:08.624 }, 00:19:08.624 { 00:19:08.624 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:08.624 "dma_device_type": 2 00:19:08.624 } 00:19:08.624 ], 00:19:08.624 "driver_specific": { 00:19:08.624 "passthru": { 00:19:08.624 "name": "pt4", 00:19:08.624 "base_bdev_name": "malloc4" 00:19:08.624 } 00:19:08.624 } 00:19:08.624 }' 00:19:08.624 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:08.624 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:08.624 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:08.624 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:08.624 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:08.624 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:08.624 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:08.624 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:08.882 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:08.882 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:08.882 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:08.882 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:08.882 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:19:08.882 00:13:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:09.141 [2024-07-16 00:13:55.939487] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:09.141 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 1be5a811-4515-41e6-b363-a33f6080536e '!=' 1be5a811-4515-41e6-b363-a33f6080536e ']' 00:19:09.141 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:19:09.141 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:09.141 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:09.141 00:13:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 3560996 00:19:09.141 00:13:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 3560996 ']' 00:19:09.141 00:13:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 3560996 00:19:09.141 00:13:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:19:09.141 00:13:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:09.141 00:13:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3560996 00:19:09.141 00:13:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:09.141 00:13:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:09.141 00:13:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3560996' 00:19:09.141 killing process with pid 3560996 00:19:09.141 00:13:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 3560996 00:19:09.141 [2024-07-16 00:13:56.015831] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:09.141 [2024-07-16 00:13:56.015893] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:09.141 [2024-07-16 00:13:56.015961] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:09.141 [2024-07-16 00:13:56.015974] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xec5780 name raid_bdev1, state offline 00:19:09.141 00:13:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 3560996 00:19:09.141 [2024-07-16 00:13:56.052123] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:09.400 00:13:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:19:09.400 00:19:09.400 real 0m15.954s 00:19:09.400 user 0m28.846s 00:19:09.400 sys 0m2.859s 00:19:09.400 00:13:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:09.400 00:13:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:09.400 ************************************ 00:19:09.400 END TEST raid_superblock_test 00:19:09.400 ************************************ 00:19:09.400 00:13:56 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:09.400 00:13:56 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:19:09.400 00:13:56 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:09.400 00:13:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:09.400 00:13:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:09.400 ************************************ 00:19:09.400 START TEST raid_read_error_test 00:19:09.400 ************************************ 00:19:09.400 00:13:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 read 00:19:09.400 00:13:56 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:19:09.400 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # 
local raid_bdev_name=raid_bdev1 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:09.401 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:09.660 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:19:09.660 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:19:09.660 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:19:09.660 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:09.660 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.OXr8NfRCvM 00:19:09.660 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3563396 00:19:09.660 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3563396 /var/tmp/spdk-raid.sock 00:19:09.660 00:13:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:09.660 00:13:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 3563396 ']' 00:19:09.660 00:13:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:09.660 00:13:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:09.660 00:13:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:19:09.660 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:09.660 00:13:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:09.660 00:13:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:09.660 [2024-07-16 00:13:56.418782] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:19:09.660 [2024-07-16 00:13:56.418848] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3563396 ] 00:19:09.660 [2024-07-16 00:13:56.536252] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:09.919 [2024-07-16 00:13:56.640401] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:09.919 [2024-07-16 00:13:56.694628] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:09.919 [2024-07-16 00:13:56.694659] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:10.485 00:13:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:10.485 00:13:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:19:10.485 00:13:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:10.485 00:13:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:10.744 BaseBdev1_malloc 00:19:10.744 00:13:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:11.002 true 00:19:11.002 00:13:57 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:11.261 [2024-07-16 00:13:58.069931] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:11.261 [2024-07-16 00:13:58.069974] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:11.261 [2024-07-16 00:13:58.069993] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18a70d0 00:19:11.261 [2024-07-16 00:13:58.070006] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:11.261 [2024-07-16 00:13:58.071828] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:11.261 [2024-07-16 00:13:58.071859] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:11.261 BaseBdev1 00:19:11.261 00:13:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:11.261 00:13:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:11.518 BaseBdev2_malloc 00:19:11.518 00:13:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:11.776 true 00:19:11.776 00:13:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:12.033 [2024-07-16 00:13:58.808437] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:12.033 [2024-07-16 00:13:58.808481] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:12.033 [2024-07-16 00:13:58.808500] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18ab910 00:19:12.033 [2024-07-16 00:13:58.808513] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:12.033 [2024-07-16 00:13:58.810114] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:12.033 [2024-07-16 00:13:58.810145] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:12.033 BaseBdev2 00:19:12.033 00:13:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:12.033 00:13:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:12.291 BaseBdev3_malloc 00:19:12.291 00:13:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:12.549 true 00:19:12.549 00:13:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:12.807 [2024-07-16 00:13:59.544362] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:12.807 [2024-07-16 00:13:59.544410] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:12.807 [2024-07-16 00:13:59.544429] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18adbd0 00:19:12.807 [2024-07-16 00:13:59.544442] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:12.807 [2024-07-16 00:13:59.546048] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:19:12.807 [2024-07-16 00:13:59.546078] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:12.807 BaseBdev3 00:19:12.807 00:13:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:12.807 00:13:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:13.064 BaseBdev4_malloc 00:19:13.064 00:13:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:13.320 true 00:19:13.320 00:14:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:13.885 [2024-07-16 00:14:00.547639] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:13.885 [2024-07-16 00:14:00.547687] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:13.885 [2024-07-16 00:14:00.547707] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18aeaa0 00:19:13.885 [2024-07-16 00:14:00.547719] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:13.885 [2024-07-16 00:14:00.549323] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:13.885 [2024-07-16 00:14:00.549355] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:13.885 BaseBdev4 00:19:13.885 00:14:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 
00:19:13.885 [2024-07-16 00:14:00.804350] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:13.885 [2024-07-16 00:14:00.805731] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:13.885 [2024-07-16 00:14:00.805798] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:13.885 [2024-07-16 00:14:00.805859] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:13.885 [2024-07-16 00:14:00.806103] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18a8c20 00:19:13.885 [2024-07-16 00:14:00.806115] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:13.885 [2024-07-16 00:14:00.806315] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16fd260 00:19:13.885 [2024-07-16 00:14:00.806466] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18a8c20 00:19:13.885 [2024-07-16 00:14:00.806476] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18a8c20 00:19:13.885 [2024-07-16 00:14:00.806581] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:13.885 00:14:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:13.885 00:14:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:13.885 00:14:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:13.885 00:14:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:13.885 00:14:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:13.885 00:14:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:13.885 00:14:00 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:13.885 00:14:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:13.885 00:14:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:13.885 00:14:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:13.885 00:14:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.885 00:14:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:14.143 00:14:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:14.143 "name": "raid_bdev1", 00:19:14.143 "uuid": "01ef0f14-4778-488b-b1ce-3169d561775e", 00:19:14.143 "strip_size_kb": 64, 00:19:14.143 "state": "online", 00:19:14.143 "raid_level": "raid0", 00:19:14.143 "superblock": true, 00:19:14.143 "num_base_bdevs": 4, 00:19:14.143 "num_base_bdevs_discovered": 4, 00:19:14.143 "num_base_bdevs_operational": 4, 00:19:14.143 "base_bdevs_list": [ 00:19:14.143 { 00:19:14.143 "name": "BaseBdev1", 00:19:14.143 "uuid": "8c47243f-e965-567c-a80a-92391c2cf520", 00:19:14.143 "is_configured": true, 00:19:14.143 "data_offset": 2048, 00:19:14.143 "data_size": 63488 00:19:14.143 }, 00:19:14.143 { 00:19:14.143 "name": "BaseBdev2", 00:19:14.143 "uuid": "d368ce23-653e-5f1a-8c68-5d59ecfe4277", 00:19:14.143 "is_configured": true, 00:19:14.143 "data_offset": 2048, 00:19:14.143 "data_size": 63488 00:19:14.143 }, 00:19:14.143 { 00:19:14.143 "name": "BaseBdev3", 00:19:14.143 "uuid": "d1694b33-a757-54a5-a597-c2b2c671ecbb", 00:19:14.143 "is_configured": true, 00:19:14.143 "data_offset": 2048, 00:19:14.143 "data_size": 63488 00:19:14.143 }, 00:19:14.143 { 00:19:14.143 "name": "BaseBdev4", 00:19:14.143 "uuid": "fe12533f-0329-5329-9a3b-448d575a2dba", 00:19:14.143 
"is_configured": true, 00:19:14.143 "data_offset": 2048, 00:19:14.143 "data_size": 63488 00:19:14.143 } 00:19:14.143 ] 00:19:14.143 }' 00:19:14.143 00:14:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:14.143 00:14:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:15.074 00:14:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:15.074 00:14:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:15.074 [2024-07-16 00:14:01.819304] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x189afc0 00:19:16.008 00:14:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:19:16.342 00:14:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:16.342 00:14:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:19:16.342 00:14:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:16.342 00:14:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:16.342 00:14:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:16.342 00:14:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:16.342 00:14:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:16.342 00:14:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:16.342 00:14:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:19:16.342 00:14:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:16.342 00:14:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:16.342 00:14:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:16.342 00:14:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:16.342 00:14:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.342 00:14:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:16.342 00:14:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:16.342 "name": "raid_bdev1", 00:19:16.342 "uuid": "01ef0f14-4778-488b-b1ce-3169d561775e", 00:19:16.342 "strip_size_kb": 64, 00:19:16.342 "state": "online", 00:19:16.342 "raid_level": "raid0", 00:19:16.342 "superblock": true, 00:19:16.342 "num_base_bdevs": 4, 00:19:16.342 "num_base_bdevs_discovered": 4, 00:19:16.342 "num_base_bdevs_operational": 4, 00:19:16.342 "base_bdevs_list": [ 00:19:16.342 { 00:19:16.342 "name": "BaseBdev1", 00:19:16.342 "uuid": "8c47243f-e965-567c-a80a-92391c2cf520", 00:19:16.342 "is_configured": true, 00:19:16.342 "data_offset": 2048, 00:19:16.342 "data_size": 63488 00:19:16.342 }, 00:19:16.342 { 00:19:16.342 "name": "BaseBdev2", 00:19:16.342 "uuid": "d368ce23-653e-5f1a-8c68-5d59ecfe4277", 00:19:16.342 "is_configured": true, 00:19:16.342 "data_offset": 2048, 00:19:16.342 "data_size": 63488 00:19:16.342 }, 00:19:16.342 { 00:19:16.342 "name": "BaseBdev3", 00:19:16.342 "uuid": "d1694b33-a757-54a5-a597-c2b2c671ecbb", 00:19:16.342 "is_configured": true, 00:19:16.342 "data_offset": 2048, 00:19:16.342 "data_size": 63488 00:19:16.342 }, 00:19:16.342 { 00:19:16.342 "name": "BaseBdev4", 00:19:16.342 "uuid": 
"fe12533f-0329-5329-9a3b-448d575a2dba", 00:19:16.342 "is_configured": true, 00:19:16.342 "data_offset": 2048, 00:19:16.342 "data_size": 63488 00:19:16.342 } 00:19:16.342 ] 00:19:16.342 }' 00:19:16.342 00:14:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:16.342 00:14:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:17.276 00:14:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:17.534 [2024-07-16 00:14:04.332464] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:17.534 [2024-07-16 00:14:04.332506] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:17.534 [2024-07-16 00:14:04.335668] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:17.534 [2024-07-16 00:14:04.335705] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:17.534 [2024-07-16 00:14:04.335746] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:17.534 [2024-07-16 00:14:04.335757] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18a8c20 name raid_bdev1, state offline 00:19:17.534 0 00:19:17.534 00:14:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3563396 00:19:17.534 00:14:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 3563396 ']' 00:19:17.534 00:14:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 3563396 00:19:17.534 00:14:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:19:17.534 00:14:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:17.534 00:14:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # 
ps --no-headers -o comm= 3563396 00:19:17.534 00:14:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:17.534 00:14:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:17.534 00:14:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3563396' 00:19:17.534 killing process with pid 3563396 00:19:17.534 00:14:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 3563396 00:19:17.534 [2024-07-16 00:14:04.405972] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:17.534 00:14:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 3563396 00:19:17.534 [2024-07-16 00:14:04.437570] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:17.793 00:14:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.OXr8NfRCvM 00:19:17.793 00:14:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:17.793 00:14:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:17.793 00:14:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.40 00:19:17.793 00:14:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:19:17.793 00:14:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:17.793 00:14:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:17.793 00:14:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.40 != \0\.\0\0 ]] 00:19:17.793 00:19:17.793 real 0m8.336s 00:19:17.793 user 0m13.607s 00:19:17.793 sys 0m1.378s 00:19:17.793 00:14:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:17.793 00:14:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:17.793 
************************************ 00:19:17.793 END TEST raid_read_error_test 00:19:17.793 ************************************ 00:19:17.793 00:14:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:17.793 00:14:04 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:19:17.793 00:14:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:17.793 00:14:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:17.793 00:14:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:18.053 ************************************ 00:19:18.053 START TEST raid_write_error_test 00:19:18.053 ************************************ 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 
00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.6rXcO9dBpJ 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # 
raid_pid=3564580 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3564580 /var/tmp/spdk-raid.sock 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 3564580 ']' 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:18.053 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:18.053 00:14:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:18.053 [2024-07-16 00:14:04.837727] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:19:18.053 [2024-07-16 00:14:04.837796] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3564580 ] 00:19:18.053 [2024-07-16 00:14:04.965751] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:18.312 [2024-07-16 00:14:05.072790] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:18.312 [2024-07-16 00:14:05.141493] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:18.312 [2024-07-16 00:14:05.141531] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:18.880 00:14:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:18.880 00:14:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:19:18.880 00:14:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:18.880 00:14:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:19.138 BaseBdev1_malloc 00:19:19.138 00:14:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:19.705 true 00:19:19.705 00:14:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:20.272 [2024-07-16 00:14:07.025794] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:20.272 [2024-07-16 00:14:07.025840] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:19:20.272 [2024-07-16 00:14:07.025862] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1efc0d0 00:19:20.272 [2024-07-16 00:14:07.025875] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:20.272 [2024-07-16 00:14:07.027760] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:20.272 [2024-07-16 00:14:07.027793] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:20.272 BaseBdev1 00:19:20.272 00:14:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:20.272 00:14:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:20.839 BaseBdev2_malloc 00:19:20.839 00:14:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:21.098 true 00:19:21.098 00:14:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:21.098 [2024-07-16 00:14:08.002192] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:21.098 [2024-07-16 00:14:08.002235] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:21.098 [2024-07-16 00:14:08.002255] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f00910 00:19:21.098 [2024-07-16 00:14:08.002268] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:21.098 [2024-07-16 00:14:08.003702] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:21.098 [2024-07-16 00:14:08.003730] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:21.098 BaseBdev2 00:19:21.098 00:14:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:21.098 00:14:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:21.665 BaseBdev3_malloc 00:19:21.665 00:14:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:22.232 true 00:19:22.232 00:14:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:22.797 [2024-07-16 00:14:09.556062] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:22.797 [2024-07-16 00:14:09.556108] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:22.797 [2024-07-16 00:14:09.556127] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f02bd0 00:19:22.797 [2024-07-16 00:14:09.556140] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:22.797 [2024-07-16 00:14:09.557661] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:22.797 [2024-07-16 00:14:09.557692] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:22.797 BaseBdev3 00:19:22.797 00:14:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:22.797 00:14:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:23.363 BaseBdev4_malloc 00:19:23.363 00:14:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:23.621 true 00:19:23.621 00:14:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:24.187 [2024-07-16 00:14:10.832300] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:24.187 [2024-07-16 00:14:10.832346] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:24.187 [2024-07-16 00:14:10.832367] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f03aa0 00:19:24.187 [2024-07-16 00:14:10.832379] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:24.187 [2024-07-16 00:14:10.833977] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:24.187 [2024-07-16 00:14:10.834009] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:24.187 BaseBdev4 00:19:24.187 00:14:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:24.445 [2024-07-16 00:14:11.345675] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:24.445 [2024-07-16 00:14:11.347014] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:24.445 [2024-07-16 00:14:11.347080] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:24.445 [2024-07-16 00:14:11.347141] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:24.445 [2024-07-16 00:14:11.347368] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1efdc20 00:19:24.445 [2024-07-16 00:14:11.347379] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:24.445 [2024-07-16 00:14:11.347576] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d52260 00:19:24.445 [2024-07-16 00:14:11.347723] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1efdc20 00:19:24.445 [2024-07-16 00:14:11.347733] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1efdc20 00:19:24.445 [2024-07-16 00:14:11.347837] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:24.446 00:14:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:24.446 00:14:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:24.446 00:14:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:24.446 00:14:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:24.446 00:14:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:24.446 00:14:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:24.446 00:14:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:24.446 00:14:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:24.446 00:14:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:24.446 00:14:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:24.446 00:14:11 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.446 00:14:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:25.012 00:14:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:25.012 "name": "raid_bdev1", 00:19:25.012 "uuid": "9c01baaa-5886-46db-9e07-711c6050e68c", 00:19:25.012 "strip_size_kb": 64, 00:19:25.012 "state": "online", 00:19:25.012 "raid_level": "raid0", 00:19:25.012 "superblock": true, 00:19:25.012 "num_base_bdevs": 4, 00:19:25.012 "num_base_bdevs_discovered": 4, 00:19:25.012 "num_base_bdevs_operational": 4, 00:19:25.012 "base_bdevs_list": [ 00:19:25.012 { 00:19:25.012 "name": "BaseBdev1", 00:19:25.012 "uuid": "66aa76d8-8cb2-5ca7-8f1c-af2cd9b0e427", 00:19:25.012 "is_configured": true, 00:19:25.012 "data_offset": 2048, 00:19:25.012 "data_size": 63488 00:19:25.012 }, 00:19:25.012 { 00:19:25.012 "name": "BaseBdev2", 00:19:25.012 "uuid": "6a7cc8d3-20f7-5d22-8ae5-ee4fc82ea591", 00:19:25.012 "is_configured": true, 00:19:25.012 "data_offset": 2048, 00:19:25.012 "data_size": 63488 00:19:25.012 }, 00:19:25.012 { 00:19:25.012 "name": "BaseBdev3", 00:19:25.012 "uuid": "b4f03df8-64bb-50cf-a8a4-f5fafcfa7d60", 00:19:25.012 "is_configured": true, 00:19:25.012 "data_offset": 2048, 00:19:25.012 "data_size": 63488 00:19:25.012 }, 00:19:25.012 { 00:19:25.012 "name": "BaseBdev4", 00:19:25.012 "uuid": "7c9f7a3c-7e88-5b71-a000-1038e9adde43", 00:19:25.012 "is_configured": true, 00:19:25.012 "data_offset": 2048, 00:19:25.012 "data_size": 63488 00:19:25.012 } 00:19:25.012 ] 00:19:25.012 }' 00:19:25.013 00:14:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:25.013 00:14:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:25.579 00:14:12 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:25.579 00:14:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:25.838 [2024-07-16 00:14:12.561162] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1eeffc0 00:19:26.773 00:14:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:19:27.030 00:14:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:27.031 00:14:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:19:27.031 00:14:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:27.031 00:14:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:27.031 00:14:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:27.031 00:14:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:27.031 00:14:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:27.031 00:14:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:27.031 00:14:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:27.031 00:14:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:27.031 00:14:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:27.031 00:14:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:27.031 00:14:13 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:27.031 00:14:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.031 00:14:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:27.289 00:14:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:27.289 "name": "raid_bdev1", 00:19:27.289 "uuid": "9c01baaa-5886-46db-9e07-711c6050e68c", 00:19:27.289 "strip_size_kb": 64, 00:19:27.289 "state": "online", 00:19:27.289 "raid_level": "raid0", 00:19:27.289 "superblock": true, 00:19:27.289 "num_base_bdevs": 4, 00:19:27.289 "num_base_bdevs_discovered": 4, 00:19:27.289 "num_base_bdevs_operational": 4, 00:19:27.289 "base_bdevs_list": [ 00:19:27.289 { 00:19:27.289 "name": "BaseBdev1", 00:19:27.289 "uuid": "66aa76d8-8cb2-5ca7-8f1c-af2cd9b0e427", 00:19:27.289 "is_configured": true, 00:19:27.289 "data_offset": 2048, 00:19:27.289 "data_size": 63488 00:19:27.289 }, 00:19:27.289 { 00:19:27.289 "name": "BaseBdev2", 00:19:27.289 "uuid": "6a7cc8d3-20f7-5d22-8ae5-ee4fc82ea591", 00:19:27.289 "is_configured": true, 00:19:27.289 "data_offset": 2048, 00:19:27.289 "data_size": 63488 00:19:27.289 }, 00:19:27.289 { 00:19:27.289 "name": "BaseBdev3", 00:19:27.289 "uuid": "b4f03df8-64bb-50cf-a8a4-f5fafcfa7d60", 00:19:27.289 "is_configured": true, 00:19:27.289 "data_offset": 2048, 00:19:27.289 "data_size": 63488 00:19:27.289 }, 00:19:27.289 { 00:19:27.289 "name": "BaseBdev4", 00:19:27.289 "uuid": "7c9f7a3c-7e88-5b71-a000-1038e9adde43", 00:19:27.289 "is_configured": true, 00:19:27.289 "data_offset": 2048, 00:19:27.289 "data_size": 63488 00:19:27.289 } 00:19:27.289 ] 00:19:27.289 }' 00:19:27.289 00:14:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:27.289 00:14:13 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:19:27.854 00:14:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:27.854 [2024-07-16 00:14:14.802710] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:27.854 [2024-07-16 00:14:14.802743] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:28.112 [2024-07-16 00:14:14.805899] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:28.112 [2024-07-16 00:14:14.805942] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:28.112 [2024-07-16 00:14:14.805984] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:28.112 [2024-07-16 00:14:14.805995] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1efdc20 name raid_bdev1, state offline 00:19:28.112 0 00:19:28.112 00:14:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3564580 00:19:28.112 00:14:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 3564580 ']' 00:19:28.112 00:14:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 3564580 00:19:28.112 00:14:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:19:28.112 00:14:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:28.112 00:14:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3564580 00:19:28.112 00:14:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:28.112 00:14:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:28.112 00:14:14 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 3564580' 00:19:28.112 killing process with pid 3564580 00:19:28.112 00:14:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 3564580 00:19:28.112 [2024-07-16 00:14:14.886603] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:28.112 00:14:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 3564580 00:19:28.112 [2024-07-16 00:14:14.918476] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:28.370 00:14:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.6rXcO9dBpJ 00:19:28.370 00:14:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:28.370 00:14:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:28.370 00:14:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:19:28.370 00:14:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:19:28.370 00:14:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:28.370 00:14:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:28.370 00:14:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:19:28.370 00:19:28.370 real 0m10.400s 00:19:28.370 user 0m17.317s 00:19:28.370 sys 0m1.703s 00:19:28.370 00:14:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:28.370 00:14:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:28.370 ************************************ 00:19:28.370 END TEST raid_write_error_test 00:19:28.370 ************************************ 00:19:28.370 00:14:15 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:28.370 00:14:15 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:19:28.370 
00:14:15 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:19:28.370 00:14:15 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:28.370 00:14:15 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:28.370 00:14:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:28.370 ************************************ 00:19:28.370 START TEST raid_state_function_test 00:19:28.370 ************************************ 00:19:28.370 00:14:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false 00:19:28.370 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:19:28.370 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:28.370 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:19:28.370 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:28.370 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:28.370 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:28.370 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=3565942 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3565942' 00:19:28.371 Process raid pid: 3565942 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 3565942 /var/tmp/spdk-raid.sock 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 3565942 ']' 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:28.371 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:28.371 00:14:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:28.371 [2024-07-16 00:14:15.320820] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:19:28.371 [2024-07-16 00:14:15.320895] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:28.629 [2024-07-16 00:14:15.451946] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:28.629 [2024-07-16 00:14:15.556219] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:28.887 [2024-07-16 00:14:15.624911] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:28.887 [2024-07-16 00:14:15.624952] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:29.451 00:14:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:29.451 00:14:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:19:29.451 00:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:29.709 [2024-07-16 00:14:16.415846] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:29.709 [2024-07-16 00:14:16.415888] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:29.709 [2024-07-16 00:14:16.415898] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:29.709 [2024-07-16 00:14:16.415910] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:29.709 [2024-07-16 00:14:16.415919] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:29.709 [2024-07-16 00:14:16.415936] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:19:29.709 [2024-07-16 00:14:16.415945] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:29.709 [2024-07-16 00:14:16.415956] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:29.709 00:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:29.709 00:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:29.709 00:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:29.709 00:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:29.709 00:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:29.709 00:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:29.709 00:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:29.709 00:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:29.709 00:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:29.709 00:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:29.709 00:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:29.709 00:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:29.966 00:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:29.966 "name": "Existed_Raid", 00:19:29.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:29.966 "strip_size_kb": 64, 
00:19:29.966 "state": "configuring", 00:19:29.966 "raid_level": "concat", 00:19:29.966 "superblock": false, 00:19:29.966 "num_base_bdevs": 4, 00:19:29.966 "num_base_bdevs_discovered": 0, 00:19:29.966 "num_base_bdevs_operational": 4, 00:19:29.966 "base_bdevs_list": [ 00:19:29.966 { 00:19:29.966 "name": "BaseBdev1", 00:19:29.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:29.966 "is_configured": false, 00:19:29.966 "data_offset": 0, 00:19:29.966 "data_size": 0 00:19:29.966 }, 00:19:29.966 { 00:19:29.966 "name": "BaseBdev2", 00:19:29.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:29.966 "is_configured": false, 00:19:29.966 "data_offset": 0, 00:19:29.966 "data_size": 0 00:19:29.966 }, 00:19:29.966 { 00:19:29.966 "name": "BaseBdev3", 00:19:29.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:29.966 "is_configured": false, 00:19:29.966 "data_offset": 0, 00:19:29.966 "data_size": 0 00:19:29.966 }, 00:19:29.966 { 00:19:29.966 "name": "BaseBdev4", 00:19:29.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:29.966 "is_configured": false, 00:19:29.966 "data_offset": 0, 00:19:29.966 "data_size": 0 00:19:29.966 } 00:19:29.966 ] 00:19:29.966 }' 00:19:29.966 00:14:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:29.966 00:14:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:30.532 00:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:30.532 [2024-07-16 00:14:17.414511] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:30.532 [2024-07-16 00:14:17.414542] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9b5aa0 name Existed_Raid, state configuring 00:19:30.532 00:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:30.790 [2024-07-16 00:14:17.663191] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:30.790 [2024-07-16 00:14:17.663218] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:30.790 [2024-07-16 00:14:17.663228] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:30.790 [2024-07-16 00:14:17.663239] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:30.790 [2024-07-16 00:14:17.663248] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:30.790 [2024-07-16 00:14:17.663259] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:30.790 [2024-07-16 00:14:17.663268] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:30.790 [2024-07-16 00:14:17.663279] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:30.790 00:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:31.079 [2024-07-16 00:14:17.913730] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:31.079 BaseBdev1 00:19:31.079 00:14:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:31.079 00:14:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:31.079 00:14:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:31.079 00:14:17 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@899 -- # local i 00:19:31.079 00:14:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:31.079 00:14:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:31.079 00:14:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:31.337 00:14:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:31.620 [ 00:19:31.620 { 00:19:31.620 "name": "BaseBdev1", 00:19:31.620 "aliases": [ 00:19:31.620 "4be9f690-6167-48f1-b97a-607393671cbd" 00:19:31.620 ], 00:19:31.620 "product_name": "Malloc disk", 00:19:31.620 "block_size": 512, 00:19:31.620 "num_blocks": 65536, 00:19:31.620 "uuid": "4be9f690-6167-48f1-b97a-607393671cbd", 00:19:31.620 "assigned_rate_limits": { 00:19:31.620 "rw_ios_per_sec": 0, 00:19:31.620 "rw_mbytes_per_sec": 0, 00:19:31.620 "r_mbytes_per_sec": 0, 00:19:31.620 "w_mbytes_per_sec": 0 00:19:31.620 }, 00:19:31.620 "claimed": true, 00:19:31.620 "claim_type": "exclusive_write", 00:19:31.620 "zoned": false, 00:19:31.620 "supported_io_types": { 00:19:31.620 "read": true, 00:19:31.620 "write": true, 00:19:31.620 "unmap": true, 00:19:31.620 "flush": true, 00:19:31.620 "reset": true, 00:19:31.620 "nvme_admin": false, 00:19:31.620 "nvme_io": false, 00:19:31.620 "nvme_io_md": false, 00:19:31.620 "write_zeroes": true, 00:19:31.620 "zcopy": true, 00:19:31.620 "get_zone_info": false, 00:19:31.620 "zone_management": false, 00:19:31.620 "zone_append": false, 00:19:31.620 "compare": false, 00:19:31.620 "compare_and_write": false, 00:19:31.620 "abort": true, 00:19:31.620 "seek_hole": false, 00:19:31.620 "seek_data": false, 00:19:31.620 "copy": true, 00:19:31.620 "nvme_iov_md": 
false 00:19:31.620 }, 00:19:31.620 "memory_domains": [ 00:19:31.620 { 00:19:31.620 "dma_device_id": "system", 00:19:31.620 "dma_device_type": 1 00:19:31.620 }, 00:19:31.620 { 00:19:31.620 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:31.620 "dma_device_type": 2 00:19:31.620 } 00:19:31.620 ], 00:19:31.620 "driver_specific": {} 00:19:31.620 } 00:19:31.620 ] 00:19:31.620 00:14:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:31.620 00:14:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:31.620 00:14:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:31.620 00:14:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:31.620 00:14:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:31.620 00:14:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:31.620 00:14:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:31.620 00:14:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:31.620 00:14:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:31.620 00:14:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:31.620 00:14:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:31.620 00:14:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.620 00:14:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:31.879 00:14:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:31.879 "name": "Existed_Raid", 00:19:31.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:31.879 "strip_size_kb": 64, 00:19:31.879 "state": "configuring", 00:19:31.879 "raid_level": "concat", 00:19:31.879 "superblock": false, 00:19:31.879 "num_base_bdevs": 4, 00:19:31.879 "num_base_bdevs_discovered": 1, 00:19:31.879 "num_base_bdevs_operational": 4, 00:19:31.879 "base_bdevs_list": [ 00:19:31.879 { 00:19:31.879 "name": "BaseBdev1", 00:19:31.879 "uuid": "4be9f690-6167-48f1-b97a-607393671cbd", 00:19:31.879 "is_configured": true, 00:19:31.879 "data_offset": 0, 00:19:31.879 "data_size": 65536 00:19:31.879 }, 00:19:31.879 { 00:19:31.879 "name": "BaseBdev2", 00:19:31.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:31.879 "is_configured": false, 00:19:31.879 "data_offset": 0, 00:19:31.879 "data_size": 0 00:19:31.879 }, 00:19:31.879 { 00:19:31.879 "name": "BaseBdev3", 00:19:31.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:31.879 "is_configured": false, 00:19:31.879 "data_offset": 0, 00:19:31.879 "data_size": 0 00:19:31.879 }, 00:19:31.879 { 00:19:31.879 "name": "BaseBdev4", 00:19:31.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:31.879 "is_configured": false, 00:19:31.879 "data_offset": 0, 00:19:31.879 "data_size": 0 00:19:31.879 } 00:19:31.879 ] 00:19:31.879 }' 00:19:31.879 00:14:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:31.879 00:14:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:32.446 00:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:32.446 [2024-07-16 00:14:19.357566] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:32.446 [2024-07-16 00:14:19.357611] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9b5310 name Existed_Raid, state configuring 00:19:32.446 00:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:32.705 [2024-07-16 00:14:19.602257] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:32.705 [2024-07-16 00:14:19.603776] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:32.705 [2024-07-16 00:14:19.603813] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:32.705 [2024-07-16 00:14:19.603823] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:32.705 [2024-07-16 00:14:19.603835] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:32.705 [2024-07-16 00:14:19.603844] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:32.705 [2024-07-16 00:14:19.603855] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:32.705 00:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:32.705 00:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:32.705 00:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:32.705 00:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:32.705 00:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:32.705 00:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:19:32.705 00:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:32.705 00:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:32.705 00:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:32.705 00:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:32.705 00:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:32.705 00:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:32.705 00:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.705 00:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:32.964 00:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:32.964 "name": "Existed_Raid", 00:19:32.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.964 "strip_size_kb": 64, 00:19:32.964 "state": "configuring", 00:19:32.964 "raid_level": "concat", 00:19:32.964 "superblock": false, 00:19:32.964 "num_base_bdevs": 4, 00:19:32.964 "num_base_bdevs_discovered": 1, 00:19:32.964 "num_base_bdevs_operational": 4, 00:19:32.964 "base_bdevs_list": [ 00:19:32.964 { 00:19:32.964 "name": "BaseBdev1", 00:19:32.964 "uuid": "4be9f690-6167-48f1-b97a-607393671cbd", 00:19:32.964 "is_configured": true, 00:19:32.964 "data_offset": 0, 00:19:32.964 "data_size": 65536 00:19:32.964 }, 00:19:32.964 { 00:19:32.964 "name": "BaseBdev2", 00:19:32.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.964 "is_configured": false, 00:19:32.964 "data_offset": 0, 00:19:32.964 "data_size": 0 00:19:32.964 }, 00:19:32.964 { 00:19:32.964 "name": "BaseBdev3", 
00:19:32.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.964 "is_configured": false, 00:19:32.964 "data_offset": 0, 00:19:32.964 "data_size": 0 00:19:32.964 }, 00:19:32.964 { 00:19:32.964 "name": "BaseBdev4", 00:19:32.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.964 "is_configured": false, 00:19:32.964 "data_offset": 0, 00:19:32.964 "data_size": 0 00:19:32.964 } 00:19:32.964 ] 00:19:32.964 }' 00:19:32.964 00:14:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:32.964 00:14:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:33.531 00:14:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:33.789 [2024-07-16 00:14:20.704692] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:33.789 BaseBdev2 00:19:33.789 00:14:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:33.789 00:14:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:33.789 00:14:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:33.789 00:14:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:33.789 00:14:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:33.789 00:14:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:33.789 00:14:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:34.048 00:14:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:34.307 [ 00:19:34.307 { 00:19:34.307 "name": "BaseBdev2", 00:19:34.307 "aliases": [ 00:19:34.307 "1c6a47ac-10e4-48d5-a22d-9ba8aa7e054f" 00:19:34.307 ], 00:19:34.307 "product_name": "Malloc disk", 00:19:34.307 "block_size": 512, 00:19:34.307 "num_blocks": 65536, 00:19:34.307 "uuid": "1c6a47ac-10e4-48d5-a22d-9ba8aa7e054f", 00:19:34.307 "assigned_rate_limits": { 00:19:34.307 "rw_ios_per_sec": 0, 00:19:34.307 "rw_mbytes_per_sec": 0, 00:19:34.307 "r_mbytes_per_sec": 0, 00:19:34.307 "w_mbytes_per_sec": 0 00:19:34.307 }, 00:19:34.307 "claimed": true, 00:19:34.307 "claim_type": "exclusive_write", 00:19:34.307 "zoned": false, 00:19:34.307 "supported_io_types": { 00:19:34.307 "read": true, 00:19:34.307 "write": true, 00:19:34.307 "unmap": true, 00:19:34.307 "flush": true, 00:19:34.307 "reset": true, 00:19:34.307 "nvme_admin": false, 00:19:34.307 "nvme_io": false, 00:19:34.307 "nvme_io_md": false, 00:19:34.307 "write_zeroes": true, 00:19:34.307 "zcopy": true, 00:19:34.307 "get_zone_info": false, 00:19:34.307 "zone_management": false, 00:19:34.307 "zone_append": false, 00:19:34.307 "compare": false, 00:19:34.307 "compare_and_write": false, 00:19:34.307 "abort": true, 00:19:34.307 "seek_hole": false, 00:19:34.307 "seek_data": false, 00:19:34.307 "copy": true, 00:19:34.307 "nvme_iov_md": false 00:19:34.307 }, 00:19:34.307 "memory_domains": [ 00:19:34.307 { 00:19:34.308 "dma_device_id": "system", 00:19:34.308 "dma_device_type": 1 00:19:34.308 }, 00:19:34.308 { 00:19:34.308 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:34.308 "dma_device_type": 2 00:19:34.308 } 00:19:34.308 ], 00:19:34.308 "driver_specific": {} 00:19:34.308 } 00:19:34.308 ] 00:19:34.308 00:14:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:34.308 00:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:19:34.308 00:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:34.308 00:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:34.308 00:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:34.308 00:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:34.308 00:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:34.308 00:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:34.308 00:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:34.308 00:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:34.308 00:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:34.308 00:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:34.308 00:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:34.308 00:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.308 00:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:34.567 00:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:34.567 "name": "Existed_Raid", 00:19:34.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:34.567 "strip_size_kb": 64, 00:19:34.567 "state": "configuring", 00:19:34.567 "raid_level": "concat", 00:19:34.567 "superblock": false, 00:19:34.567 "num_base_bdevs": 4, 00:19:34.567 
"num_base_bdevs_discovered": 2, 00:19:34.567 "num_base_bdevs_operational": 4, 00:19:34.567 "base_bdevs_list": [ 00:19:34.567 { 00:19:34.567 "name": "BaseBdev1", 00:19:34.567 "uuid": "4be9f690-6167-48f1-b97a-607393671cbd", 00:19:34.567 "is_configured": true, 00:19:34.567 "data_offset": 0, 00:19:34.567 "data_size": 65536 00:19:34.567 }, 00:19:34.567 { 00:19:34.567 "name": "BaseBdev2", 00:19:34.567 "uuid": "1c6a47ac-10e4-48d5-a22d-9ba8aa7e054f", 00:19:34.567 "is_configured": true, 00:19:34.567 "data_offset": 0, 00:19:34.567 "data_size": 65536 00:19:34.567 }, 00:19:34.567 { 00:19:34.567 "name": "BaseBdev3", 00:19:34.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:34.567 "is_configured": false, 00:19:34.567 "data_offset": 0, 00:19:34.567 "data_size": 0 00:19:34.567 }, 00:19:34.567 { 00:19:34.567 "name": "BaseBdev4", 00:19:34.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:34.567 "is_configured": false, 00:19:34.567 "data_offset": 0, 00:19:34.567 "data_size": 0 00:19:34.567 } 00:19:34.567 ] 00:19:34.567 }' 00:19:34.567 00:14:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:34.567 00:14:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:35.502 00:14:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:35.502 [2024-07-16 00:14:22.264208] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:35.502 BaseBdev3 00:19:35.502 00:14:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:35.502 00:14:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:35.502 00:14:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:35.502 00:14:22 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:35.502 00:14:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:35.502 00:14:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:35.502 00:14:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:36.070 00:14:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:36.328 [ 00:19:36.328 { 00:19:36.328 "name": "BaseBdev3", 00:19:36.328 "aliases": [ 00:19:36.328 "8c7d360c-6232-4b69-bdbe-4676b4fbe834" 00:19:36.328 ], 00:19:36.328 "product_name": "Malloc disk", 00:19:36.328 "block_size": 512, 00:19:36.328 "num_blocks": 65536, 00:19:36.328 "uuid": "8c7d360c-6232-4b69-bdbe-4676b4fbe834", 00:19:36.328 "assigned_rate_limits": { 00:19:36.328 "rw_ios_per_sec": 0, 00:19:36.328 "rw_mbytes_per_sec": 0, 00:19:36.328 "r_mbytes_per_sec": 0, 00:19:36.328 "w_mbytes_per_sec": 0 00:19:36.328 }, 00:19:36.328 "claimed": true, 00:19:36.328 "claim_type": "exclusive_write", 00:19:36.328 "zoned": false, 00:19:36.328 "supported_io_types": { 00:19:36.328 "read": true, 00:19:36.328 "write": true, 00:19:36.328 "unmap": true, 00:19:36.328 "flush": true, 00:19:36.328 "reset": true, 00:19:36.328 "nvme_admin": false, 00:19:36.328 "nvme_io": false, 00:19:36.328 "nvme_io_md": false, 00:19:36.328 "write_zeroes": true, 00:19:36.328 "zcopy": true, 00:19:36.328 "get_zone_info": false, 00:19:36.328 "zone_management": false, 00:19:36.328 "zone_append": false, 00:19:36.328 "compare": false, 00:19:36.328 "compare_and_write": false, 00:19:36.328 "abort": true, 00:19:36.328 "seek_hole": false, 00:19:36.328 "seek_data": false, 00:19:36.328 "copy": 
true, 00:19:36.328 "nvme_iov_md": false 00:19:36.328 }, 00:19:36.328 "memory_domains": [ 00:19:36.328 { 00:19:36.328 "dma_device_id": "system", 00:19:36.328 "dma_device_type": 1 00:19:36.328 }, 00:19:36.328 { 00:19:36.328 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:36.328 "dma_device_type": 2 00:19:36.328 } 00:19:36.328 ], 00:19:36.328 "driver_specific": {} 00:19:36.328 } 00:19:36.328 ] 00:19:36.328 00:14:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:36.328 00:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:36.328 00:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:36.328 00:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:36.328 00:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:36.328 00:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:36.328 00:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:36.328 00:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:36.328 00:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:36.328 00:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:36.328 00:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:36.328 00:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:36.328 00:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:36.328 00:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.328 00:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:36.586 00:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:36.586 "name": "Existed_Raid", 00:19:36.586 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:36.586 "strip_size_kb": 64, 00:19:36.586 "state": "configuring", 00:19:36.586 "raid_level": "concat", 00:19:36.586 "superblock": false, 00:19:36.586 "num_base_bdevs": 4, 00:19:36.586 "num_base_bdevs_discovered": 3, 00:19:36.586 "num_base_bdevs_operational": 4, 00:19:36.586 "base_bdevs_list": [ 00:19:36.586 { 00:19:36.586 "name": "BaseBdev1", 00:19:36.586 "uuid": "4be9f690-6167-48f1-b97a-607393671cbd", 00:19:36.586 "is_configured": true, 00:19:36.586 "data_offset": 0, 00:19:36.586 "data_size": 65536 00:19:36.586 }, 00:19:36.586 { 00:19:36.586 "name": "BaseBdev2", 00:19:36.586 "uuid": "1c6a47ac-10e4-48d5-a22d-9ba8aa7e054f", 00:19:36.586 "is_configured": true, 00:19:36.586 "data_offset": 0, 00:19:36.586 "data_size": 65536 00:19:36.586 }, 00:19:36.586 { 00:19:36.586 "name": "BaseBdev3", 00:19:36.586 "uuid": "8c7d360c-6232-4b69-bdbe-4676b4fbe834", 00:19:36.586 "is_configured": true, 00:19:36.586 "data_offset": 0, 00:19:36.586 "data_size": 65536 00:19:36.586 }, 00:19:36.586 { 00:19:36.586 "name": "BaseBdev4", 00:19:36.586 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:36.586 "is_configured": false, 00:19:36.586 "data_offset": 0, 00:19:36.586 "data_size": 0 00:19:36.586 } 00:19:36.586 ] 00:19:36.586 }' 00:19:36.587 00:14:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:36.587 00:14:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:37.521 00:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:37.521 [2024-07-16 00:14:24.389451] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:37.521 [2024-07-16 00:14:24.389491] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9b6350 00:19:37.521 [2024-07-16 00:14:24.389500] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:37.521 [2024-07-16 00:14:24.389748] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9b6020 00:19:37.521 [2024-07-16 00:14:24.389871] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9b6350 00:19:37.521 [2024-07-16 00:14:24.389881] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x9b6350 00:19:37.521 [2024-07-16 00:14:24.390062] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:37.521 BaseBdev4 00:19:37.521 00:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:37.521 00:14:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:37.521 00:14:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:37.521 00:14:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:37.522 00:14:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:37.522 00:14:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:37.522 00:14:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:37.780 00:14:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:38.039 [ 00:19:38.039 { 00:19:38.039 "name": "BaseBdev4", 00:19:38.039 "aliases": [ 00:19:38.039 "7dedd34e-d2a9-4e70-a4fd-257a94108a4d" 00:19:38.039 ], 00:19:38.039 "product_name": "Malloc disk", 00:19:38.039 "block_size": 512, 00:19:38.039 "num_blocks": 65536, 00:19:38.039 "uuid": "7dedd34e-d2a9-4e70-a4fd-257a94108a4d", 00:19:38.039 "assigned_rate_limits": { 00:19:38.039 "rw_ios_per_sec": 0, 00:19:38.039 "rw_mbytes_per_sec": 0, 00:19:38.039 "r_mbytes_per_sec": 0, 00:19:38.039 "w_mbytes_per_sec": 0 00:19:38.039 }, 00:19:38.039 "claimed": true, 00:19:38.039 "claim_type": "exclusive_write", 00:19:38.039 "zoned": false, 00:19:38.039 "supported_io_types": { 00:19:38.039 "read": true, 00:19:38.039 "write": true, 00:19:38.039 "unmap": true, 00:19:38.039 "flush": true, 00:19:38.039 "reset": true, 00:19:38.039 "nvme_admin": false, 00:19:38.039 "nvme_io": false, 00:19:38.039 "nvme_io_md": false, 00:19:38.039 "write_zeroes": true, 00:19:38.039 "zcopy": true, 00:19:38.039 "get_zone_info": false, 00:19:38.039 "zone_management": false, 00:19:38.039 "zone_append": false, 00:19:38.039 "compare": false, 00:19:38.039 "compare_and_write": false, 00:19:38.039 "abort": true, 00:19:38.039 "seek_hole": false, 00:19:38.039 "seek_data": false, 00:19:38.039 "copy": true, 00:19:38.039 "nvme_iov_md": false 00:19:38.039 }, 00:19:38.039 "memory_domains": [ 00:19:38.039 { 00:19:38.039 "dma_device_id": "system", 00:19:38.039 "dma_device_type": 1 00:19:38.039 }, 00:19:38.039 { 00:19:38.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:38.039 "dma_device_type": 2 00:19:38.039 } 00:19:38.039 ], 00:19:38.039 "driver_specific": {} 00:19:38.039 } 00:19:38.039 ] 00:19:38.039 00:14:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:38.039 00:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:19:38.039 00:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:38.039 00:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:38.039 00:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:38.039 00:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:38.039 00:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:38.039 00:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:38.039 00:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:38.039 00:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:38.039 00:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:38.039 00:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:38.039 00:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:38.039 00:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.039 00:14:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:38.321 00:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:38.321 "name": "Existed_Raid", 00:19:38.321 "uuid": "64b92699-5435-4b6f-b815-544606beea8a", 00:19:38.321 "strip_size_kb": 64, 00:19:38.321 "state": "online", 00:19:38.321 "raid_level": "concat", 00:19:38.321 "superblock": false, 00:19:38.321 "num_base_bdevs": 4, 00:19:38.321 
"num_base_bdevs_discovered": 4, 00:19:38.321 "num_base_bdevs_operational": 4, 00:19:38.321 "base_bdevs_list": [ 00:19:38.321 { 00:19:38.321 "name": "BaseBdev1", 00:19:38.321 "uuid": "4be9f690-6167-48f1-b97a-607393671cbd", 00:19:38.321 "is_configured": true, 00:19:38.321 "data_offset": 0, 00:19:38.321 "data_size": 65536 00:19:38.321 }, 00:19:38.321 { 00:19:38.321 "name": "BaseBdev2", 00:19:38.321 "uuid": "1c6a47ac-10e4-48d5-a22d-9ba8aa7e054f", 00:19:38.321 "is_configured": true, 00:19:38.321 "data_offset": 0, 00:19:38.321 "data_size": 65536 00:19:38.321 }, 00:19:38.321 { 00:19:38.321 "name": "BaseBdev3", 00:19:38.321 "uuid": "8c7d360c-6232-4b69-bdbe-4676b4fbe834", 00:19:38.321 "is_configured": true, 00:19:38.321 "data_offset": 0, 00:19:38.321 "data_size": 65536 00:19:38.321 }, 00:19:38.321 { 00:19:38.321 "name": "BaseBdev4", 00:19:38.321 "uuid": "7dedd34e-d2a9-4e70-a4fd-257a94108a4d", 00:19:38.321 "is_configured": true, 00:19:38.321 "data_offset": 0, 00:19:38.321 "data_size": 65536 00:19:38.321 } 00:19:38.321 ] 00:19:38.321 }' 00:19:38.321 00:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:38.321 00:14:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:38.889 00:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:38.889 00:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:38.889 00:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:38.889 00:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:38.890 00:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:38.890 00:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:38.890 00:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:38.890 00:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:39.149 [2024-07-16 00:14:25.921854] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:39.149 00:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:39.149 "name": "Existed_Raid", 00:19:39.149 "aliases": [ 00:19:39.149 "64b92699-5435-4b6f-b815-544606beea8a" 00:19:39.149 ], 00:19:39.149 "product_name": "Raid Volume", 00:19:39.149 "block_size": 512, 00:19:39.149 "num_blocks": 262144, 00:19:39.149 "uuid": "64b92699-5435-4b6f-b815-544606beea8a", 00:19:39.149 "assigned_rate_limits": { 00:19:39.149 "rw_ios_per_sec": 0, 00:19:39.149 "rw_mbytes_per_sec": 0, 00:19:39.149 "r_mbytes_per_sec": 0, 00:19:39.149 "w_mbytes_per_sec": 0 00:19:39.149 }, 00:19:39.149 "claimed": false, 00:19:39.149 "zoned": false, 00:19:39.149 "supported_io_types": { 00:19:39.149 "read": true, 00:19:39.149 "write": true, 00:19:39.149 "unmap": true, 00:19:39.149 "flush": true, 00:19:39.149 "reset": true, 00:19:39.149 "nvme_admin": false, 00:19:39.149 "nvme_io": false, 00:19:39.149 "nvme_io_md": false, 00:19:39.149 "write_zeroes": true, 00:19:39.149 "zcopy": false, 00:19:39.149 "get_zone_info": false, 00:19:39.149 "zone_management": false, 00:19:39.149 "zone_append": false, 00:19:39.149 "compare": false, 00:19:39.149 "compare_and_write": false, 00:19:39.149 "abort": false, 00:19:39.149 "seek_hole": false, 00:19:39.149 "seek_data": false, 00:19:39.149 "copy": false, 00:19:39.149 "nvme_iov_md": false 00:19:39.149 }, 00:19:39.149 "memory_domains": [ 00:19:39.149 { 00:19:39.149 "dma_device_id": "system", 00:19:39.149 "dma_device_type": 1 00:19:39.149 }, 00:19:39.149 { 00:19:39.149 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.149 "dma_device_type": 2 00:19:39.149 }, 00:19:39.149 { 00:19:39.149 
"dma_device_id": "system", 00:19:39.149 "dma_device_type": 1 00:19:39.149 }, 00:19:39.149 { 00:19:39.149 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.149 "dma_device_type": 2 00:19:39.149 }, 00:19:39.149 { 00:19:39.149 "dma_device_id": "system", 00:19:39.149 "dma_device_type": 1 00:19:39.149 }, 00:19:39.149 { 00:19:39.149 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.149 "dma_device_type": 2 00:19:39.149 }, 00:19:39.149 { 00:19:39.149 "dma_device_id": "system", 00:19:39.149 "dma_device_type": 1 00:19:39.149 }, 00:19:39.149 { 00:19:39.149 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.149 "dma_device_type": 2 00:19:39.149 } 00:19:39.149 ], 00:19:39.149 "driver_specific": { 00:19:39.149 "raid": { 00:19:39.149 "uuid": "64b92699-5435-4b6f-b815-544606beea8a", 00:19:39.149 "strip_size_kb": 64, 00:19:39.149 "state": "online", 00:19:39.149 "raid_level": "concat", 00:19:39.149 "superblock": false, 00:19:39.149 "num_base_bdevs": 4, 00:19:39.149 "num_base_bdevs_discovered": 4, 00:19:39.149 "num_base_bdevs_operational": 4, 00:19:39.149 "base_bdevs_list": [ 00:19:39.149 { 00:19:39.149 "name": "BaseBdev1", 00:19:39.149 "uuid": "4be9f690-6167-48f1-b97a-607393671cbd", 00:19:39.149 "is_configured": true, 00:19:39.149 "data_offset": 0, 00:19:39.149 "data_size": 65536 00:19:39.149 }, 00:19:39.149 { 00:19:39.149 "name": "BaseBdev2", 00:19:39.149 "uuid": "1c6a47ac-10e4-48d5-a22d-9ba8aa7e054f", 00:19:39.149 "is_configured": true, 00:19:39.149 "data_offset": 0, 00:19:39.149 "data_size": 65536 00:19:39.149 }, 00:19:39.149 { 00:19:39.149 "name": "BaseBdev3", 00:19:39.149 "uuid": "8c7d360c-6232-4b69-bdbe-4676b4fbe834", 00:19:39.149 "is_configured": true, 00:19:39.149 "data_offset": 0, 00:19:39.149 "data_size": 65536 00:19:39.149 }, 00:19:39.149 { 00:19:39.149 "name": "BaseBdev4", 00:19:39.149 "uuid": "7dedd34e-d2a9-4e70-a4fd-257a94108a4d", 00:19:39.149 "is_configured": true, 00:19:39.149 "data_offset": 0, 00:19:39.149 "data_size": 65536 00:19:39.149 } 00:19:39.149 ] 
00:19:39.149 } 00:19:39.149 } 00:19:39.149 }' 00:19:39.149 00:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:39.149 00:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:39.149 BaseBdev2 00:19:39.149 BaseBdev3 00:19:39.149 BaseBdev4' 00:19:39.149 00:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:39.149 00:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:39.149 00:14:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:39.409 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:39.409 "name": "BaseBdev1", 00:19:39.409 "aliases": [ 00:19:39.409 "4be9f690-6167-48f1-b97a-607393671cbd" 00:19:39.409 ], 00:19:39.409 "product_name": "Malloc disk", 00:19:39.409 "block_size": 512, 00:19:39.409 "num_blocks": 65536, 00:19:39.409 "uuid": "4be9f690-6167-48f1-b97a-607393671cbd", 00:19:39.409 "assigned_rate_limits": { 00:19:39.409 "rw_ios_per_sec": 0, 00:19:39.409 "rw_mbytes_per_sec": 0, 00:19:39.409 "r_mbytes_per_sec": 0, 00:19:39.409 "w_mbytes_per_sec": 0 00:19:39.409 }, 00:19:39.409 "claimed": true, 00:19:39.409 "claim_type": "exclusive_write", 00:19:39.409 "zoned": false, 00:19:39.409 "supported_io_types": { 00:19:39.409 "read": true, 00:19:39.409 "write": true, 00:19:39.409 "unmap": true, 00:19:39.409 "flush": true, 00:19:39.409 "reset": true, 00:19:39.409 "nvme_admin": false, 00:19:39.409 "nvme_io": false, 00:19:39.409 "nvme_io_md": false, 00:19:39.409 "write_zeroes": true, 00:19:39.409 "zcopy": true, 00:19:39.409 "get_zone_info": false, 00:19:39.409 "zone_management": false, 00:19:39.409 "zone_append": false, 00:19:39.409 "compare": 
false, 00:19:39.409 "compare_and_write": false, 00:19:39.409 "abort": true, 00:19:39.409 "seek_hole": false, 00:19:39.409 "seek_data": false, 00:19:39.409 "copy": true, 00:19:39.409 "nvme_iov_md": false 00:19:39.409 }, 00:19:39.409 "memory_domains": [ 00:19:39.409 { 00:19:39.409 "dma_device_id": "system", 00:19:39.409 "dma_device_type": 1 00:19:39.409 }, 00:19:39.409 { 00:19:39.409 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.409 "dma_device_type": 2 00:19:39.409 } 00:19:39.409 ], 00:19:39.409 "driver_specific": {} 00:19:39.409 }' 00:19:39.409 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:39.409 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:39.409 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:39.409 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:39.668 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:39.668 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:39.668 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:39.668 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:39.668 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:39.668 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:39.668 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:39.668 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:39.668 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:39.668 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:19:39.668 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:39.928 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:39.928 "name": "BaseBdev2", 00:19:39.928 "aliases": [ 00:19:39.928 "1c6a47ac-10e4-48d5-a22d-9ba8aa7e054f" 00:19:39.928 ], 00:19:39.928 "product_name": "Malloc disk", 00:19:39.928 "block_size": 512, 00:19:39.928 "num_blocks": 65536, 00:19:39.928 "uuid": "1c6a47ac-10e4-48d5-a22d-9ba8aa7e054f", 00:19:39.928 "assigned_rate_limits": { 00:19:39.928 "rw_ios_per_sec": 0, 00:19:39.928 "rw_mbytes_per_sec": 0, 00:19:39.928 "r_mbytes_per_sec": 0, 00:19:39.928 "w_mbytes_per_sec": 0 00:19:39.928 }, 00:19:39.928 "claimed": true, 00:19:39.928 "claim_type": "exclusive_write", 00:19:39.928 "zoned": false, 00:19:39.928 "supported_io_types": { 00:19:39.928 "read": true, 00:19:39.928 "write": true, 00:19:39.928 "unmap": true, 00:19:39.928 "flush": true, 00:19:39.928 "reset": true, 00:19:39.928 "nvme_admin": false, 00:19:39.928 "nvme_io": false, 00:19:39.928 "nvme_io_md": false, 00:19:39.928 "write_zeroes": true, 00:19:39.928 "zcopy": true, 00:19:39.928 "get_zone_info": false, 00:19:39.928 "zone_management": false, 00:19:39.928 "zone_append": false, 00:19:39.928 "compare": false, 00:19:39.928 "compare_and_write": false, 00:19:39.928 "abort": true, 00:19:39.928 "seek_hole": false, 00:19:39.928 "seek_data": false, 00:19:39.928 "copy": true, 00:19:39.928 "nvme_iov_md": false 00:19:39.928 }, 00:19:39.928 "memory_domains": [ 00:19:39.928 { 00:19:39.928 "dma_device_id": "system", 00:19:39.928 "dma_device_type": 1 00:19:39.928 }, 00:19:39.928 { 00:19:39.928 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.928 "dma_device_type": 2 00:19:39.928 } 00:19:39.928 ], 00:19:39.928 "driver_specific": {} 00:19:39.928 }' 00:19:39.928 00:14:26 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:40.186 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:40.186 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:40.186 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:40.186 00:14:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:40.186 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:40.186 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:40.186 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:40.186 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:40.186 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:40.445 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:40.445 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:40.445 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:40.445 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:40.445 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:40.705 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:40.705 "name": "BaseBdev3", 00:19:40.705 "aliases": [ 00:19:40.705 "8c7d360c-6232-4b69-bdbe-4676b4fbe834" 00:19:40.705 ], 00:19:40.705 "product_name": "Malloc disk", 00:19:40.705 "block_size": 512, 00:19:40.705 "num_blocks": 65536, 00:19:40.705 "uuid": "8c7d360c-6232-4b69-bdbe-4676b4fbe834", 
00:19:40.705 "assigned_rate_limits": { 00:19:40.705 "rw_ios_per_sec": 0, 00:19:40.705 "rw_mbytes_per_sec": 0, 00:19:40.705 "r_mbytes_per_sec": 0, 00:19:40.705 "w_mbytes_per_sec": 0 00:19:40.705 }, 00:19:40.705 "claimed": true, 00:19:40.705 "claim_type": "exclusive_write", 00:19:40.705 "zoned": false, 00:19:40.705 "supported_io_types": { 00:19:40.705 "read": true, 00:19:40.705 "write": true, 00:19:40.705 "unmap": true, 00:19:40.705 "flush": true, 00:19:40.705 "reset": true, 00:19:40.705 "nvme_admin": false, 00:19:40.705 "nvme_io": false, 00:19:40.705 "nvme_io_md": false, 00:19:40.705 "write_zeroes": true, 00:19:40.705 "zcopy": true, 00:19:40.705 "get_zone_info": false, 00:19:40.705 "zone_management": false, 00:19:40.705 "zone_append": false, 00:19:40.705 "compare": false, 00:19:40.705 "compare_and_write": false, 00:19:40.705 "abort": true, 00:19:40.705 "seek_hole": false, 00:19:40.705 "seek_data": false, 00:19:40.705 "copy": true, 00:19:40.705 "nvme_iov_md": false 00:19:40.705 }, 00:19:40.705 "memory_domains": [ 00:19:40.705 { 00:19:40.705 "dma_device_id": "system", 00:19:40.705 "dma_device_type": 1 00:19:40.705 }, 00:19:40.705 { 00:19:40.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:40.705 "dma_device_type": 2 00:19:40.705 } 00:19:40.705 ], 00:19:40.705 "driver_specific": {} 00:19:40.705 }' 00:19:40.705 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:40.705 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:40.705 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:40.705 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:40.705 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:40.705 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:40.705 00:14:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:40.965 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:40.965 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:40.965 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:40.965 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:40.965 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:40.965 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:40.965 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:40.965 00:14:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:41.224 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:41.224 "name": "BaseBdev4", 00:19:41.224 "aliases": [ 00:19:41.224 "7dedd34e-d2a9-4e70-a4fd-257a94108a4d" 00:19:41.224 ], 00:19:41.224 "product_name": "Malloc disk", 00:19:41.224 "block_size": 512, 00:19:41.224 "num_blocks": 65536, 00:19:41.224 "uuid": "7dedd34e-d2a9-4e70-a4fd-257a94108a4d", 00:19:41.224 "assigned_rate_limits": { 00:19:41.224 "rw_ios_per_sec": 0, 00:19:41.224 "rw_mbytes_per_sec": 0, 00:19:41.224 "r_mbytes_per_sec": 0, 00:19:41.224 "w_mbytes_per_sec": 0 00:19:41.224 }, 00:19:41.224 "claimed": true, 00:19:41.224 "claim_type": "exclusive_write", 00:19:41.224 "zoned": false, 00:19:41.224 "supported_io_types": { 00:19:41.224 "read": true, 00:19:41.224 "write": true, 00:19:41.224 "unmap": true, 00:19:41.224 "flush": true, 00:19:41.224 "reset": true, 00:19:41.224 "nvme_admin": false, 00:19:41.224 "nvme_io": false, 00:19:41.224 "nvme_io_md": false, 00:19:41.224 "write_zeroes": true, 
00:19:41.224 "zcopy": true, 00:19:41.224 "get_zone_info": false, 00:19:41.224 "zone_management": false, 00:19:41.224 "zone_append": false, 00:19:41.224 "compare": false, 00:19:41.224 "compare_and_write": false, 00:19:41.224 "abort": true, 00:19:41.224 "seek_hole": false, 00:19:41.224 "seek_data": false, 00:19:41.224 "copy": true, 00:19:41.224 "nvme_iov_md": false 00:19:41.224 }, 00:19:41.224 "memory_domains": [ 00:19:41.224 { 00:19:41.224 "dma_device_id": "system", 00:19:41.224 "dma_device_type": 1 00:19:41.224 }, 00:19:41.224 { 00:19:41.224 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.224 "dma_device_type": 2 00:19:41.224 } 00:19:41.224 ], 00:19:41.224 "driver_specific": {} 00:19:41.224 }' 00:19:41.224 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:41.224 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:41.224 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:41.224 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:41.224 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:41.483 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:41.483 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:41.483 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:41.484 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:41.484 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:41.484 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:41.484 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:41.484 00:14:28 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:41.743 [2024-07-16 00:14:28.616725] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:41.743 [2024-07-16 00:14:28.616755] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:41.743 [2024-07-16 00:14:28.616803] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:41.743 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:41.743 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:19:41.743 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:41.743 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:41.743 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:41.744 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:19:41.744 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:41.744 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:41.744 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:41.744 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:41.744 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:41.744 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:41.744 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:41.744 00:14:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:41.744 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:41.744 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:41.744 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:42.002 00:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:42.002 "name": "Existed_Raid", 00:19:42.002 "uuid": "64b92699-5435-4b6f-b815-544606beea8a", 00:19:42.002 "strip_size_kb": 64, 00:19:42.002 "state": "offline", 00:19:42.002 "raid_level": "concat", 00:19:42.002 "superblock": false, 00:19:42.002 "num_base_bdevs": 4, 00:19:42.002 "num_base_bdevs_discovered": 3, 00:19:42.002 "num_base_bdevs_operational": 3, 00:19:42.002 "base_bdevs_list": [ 00:19:42.002 { 00:19:42.002 "name": null, 00:19:42.002 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:42.002 "is_configured": false, 00:19:42.002 "data_offset": 0, 00:19:42.002 "data_size": 65536 00:19:42.002 }, 00:19:42.002 { 00:19:42.002 "name": "BaseBdev2", 00:19:42.002 "uuid": "1c6a47ac-10e4-48d5-a22d-9ba8aa7e054f", 00:19:42.002 "is_configured": true, 00:19:42.002 "data_offset": 0, 00:19:42.002 "data_size": 65536 00:19:42.002 }, 00:19:42.002 { 00:19:42.002 "name": "BaseBdev3", 00:19:42.002 "uuid": "8c7d360c-6232-4b69-bdbe-4676b4fbe834", 00:19:42.002 "is_configured": true, 00:19:42.002 "data_offset": 0, 00:19:42.002 "data_size": 65536 00:19:42.003 }, 00:19:42.003 { 00:19:42.003 "name": "BaseBdev4", 00:19:42.003 "uuid": "7dedd34e-d2a9-4e70-a4fd-257a94108a4d", 00:19:42.003 "is_configured": true, 00:19:42.003 "data_offset": 0, 00:19:42.003 "data_size": 65536 00:19:42.003 } 00:19:42.003 ] 00:19:42.003 }' 00:19:42.003 00:14:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:42.003 00:14:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:42.570 00:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:42.570 00:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:42.570 00:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:42.570 00:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:42.829 00:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:42.829 00:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:42.829 00:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:43.398 [2024-07-16 00:14:30.226148] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:43.398 00:14:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:43.398 00:14:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:43.398 00:14:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:43.398 00:14:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:43.657 00:14:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:43.657 00:14:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid 
']' 00:19:43.657 00:14:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:44.224 [2024-07-16 00:14:30.998791] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:44.224 00:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:44.224 00:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:44.224 00:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:44.224 00:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:44.482 00:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:44.482 00:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:44.482 00:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:44.740 [2024-07-16 00:14:31.512511] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:44.740 [2024-07-16 00:14:31.512559] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9b6350 name Existed_Raid, state offline 00:19:44.740 00:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:44.740 00:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:44.740 00:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:44.740 00:14:31 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:44.999 00:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:44.999 00:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:44.999 00:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:44.999 00:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:44.999 00:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:44.999 00:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:45.258 BaseBdev2 00:19:45.258 00:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:45.258 00:14:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:45.258 00:14:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:45.258 00:14:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:45.258 00:14:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:45.258 00:14:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:45.258 00:14:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:45.562 00:14:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:45.820 [ 00:19:45.820 { 00:19:45.820 "name": "BaseBdev2", 00:19:45.820 "aliases": 
[ 00:19:45.820 "04e5b3ed-26ea-4004-8fe0-7d6d64433168" 00:19:45.820 ], 00:19:45.820 "product_name": "Malloc disk", 00:19:45.820 "block_size": 512, 00:19:45.820 "num_blocks": 65536, 00:19:45.820 "uuid": "04e5b3ed-26ea-4004-8fe0-7d6d64433168", 00:19:45.820 "assigned_rate_limits": { 00:19:45.820 "rw_ios_per_sec": 0, 00:19:45.820 "rw_mbytes_per_sec": 0, 00:19:45.820 "r_mbytes_per_sec": 0, 00:19:45.820 "w_mbytes_per_sec": 0 00:19:45.820 }, 00:19:45.820 "claimed": false, 00:19:45.820 "zoned": false, 00:19:45.820 "supported_io_types": { 00:19:45.820 "read": true, 00:19:45.820 "write": true, 00:19:45.820 "unmap": true, 00:19:45.820 "flush": true, 00:19:45.820 "reset": true, 00:19:45.820 "nvme_admin": false, 00:19:45.820 "nvme_io": false, 00:19:45.820 "nvme_io_md": false, 00:19:45.820 "write_zeroes": true, 00:19:45.820 "zcopy": true, 00:19:45.820 "get_zone_info": false, 00:19:45.820 "zone_management": false, 00:19:45.820 "zone_append": false, 00:19:45.820 "compare": false, 00:19:45.820 "compare_and_write": false, 00:19:45.820 "abort": true, 00:19:45.820 "seek_hole": false, 00:19:45.820 "seek_data": false, 00:19:45.820 "copy": true, 00:19:45.820 "nvme_iov_md": false 00:19:45.820 }, 00:19:45.820 "memory_domains": [ 00:19:45.820 { 00:19:45.820 "dma_device_id": "system", 00:19:45.820 "dma_device_type": 1 00:19:45.820 }, 00:19:45.820 { 00:19:45.820 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:45.820 "dma_device_type": 2 00:19:45.820 } 00:19:45.820 ], 00:19:45.820 "driver_specific": {} 00:19:45.820 } 00:19:45.820 ] 00:19:45.820 00:14:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:45.820 00:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:45.820 00:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:45.820 00:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:46.079 BaseBdev3 00:19:46.079 00:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:46.079 00:14:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:46.079 00:14:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:46.079 00:14:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:46.079 00:14:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:46.079 00:14:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:46.079 00:14:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:46.338 00:14:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:46.596 [ 00:19:46.596 { 00:19:46.596 "name": "BaseBdev3", 00:19:46.596 "aliases": [ 00:19:46.596 "b7910928-4a7f-459b-aa3a-3b8a9126a034" 00:19:46.596 ], 00:19:46.596 "product_name": "Malloc disk", 00:19:46.596 "block_size": 512, 00:19:46.596 "num_blocks": 65536, 00:19:46.596 "uuid": "b7910928-4a7f-459b-aa3a-3b8a9126a034", 00:19:46.596 "assigned_rate_limits": { 00:19:46.596 "rw_ios_per_sec": 0, 00:19:46.596 "rw_mbytes_per_sec": 0, 00:19:46.596 "r_mbytes_per_sec": 0, 00:19:46.596 "w_mbytes_per_sec": 0 00:19:46.596 }, 00:19:46.596 "claimed": false, 00:19:46.596 "zoned": false, 00:19:46.596 "supported_io_types": { 00:19:46.596 "read": true, 00:19:46.596 "write": true, 00:19:46.596 "unmap": true, 00:19:46.596 "flush": true, 00:19:46.596 "reset": true, 00:19:46.596 "nvme_admin": false, 00:19:46.596 
"nvme_io": false, 00:19:46.596 "nvme_io_md": false, 00:19:46.596 "write_zeroes": true, 00:19:46.596 "zcopy": true, 00:19:46.596 "get_zone_info": false, 00:19:46.596 "zone_management": false, 00:19:46.596 "zone_append": false, 00:19:46.596 "compare": false, 00:19:46.596 "compare_and_write": false, 00:19:46.596 "abort": true, 00:19:46.596 "seek_hole": false, 00:19:46.596 "seek_data": false, 00:19:46.596 "copy": true, 00:19:46.596 "nvme_iov_md": false 00:19:46.596 }, 00:19:46.596 "memory_domains": [ 00:19:46.596 { 00:19:46.596 "dma_device_id": "system", 00:19:46.596 "dma_device_type": 1 00:19:46.596 }, 00:19:46.596 { 00:19:46.596 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:46.596 "dma_device_type": 2 00:19:46.596 } 00:19:46.596 ], 00:19:46.596 "driver_specific": {} 00:19:46.596 } 00:19:46.596 ] 00:19:46.596 00:14:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:46.596 00:14:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:46.596 00:14:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:46.596 00:14:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:46.854 BaseBdev4 00:19:46.854 00:14:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:46.854 00:14:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:46.854 00:14:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:46.854 00:14:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:46.854 00:14:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:46.854 00:14:33 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:46.854 00:14:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:47.112 00:14:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:47.112 [ 00:19:47.112 { 00:19:47.112 "name": "BaseBdev4", 00:19:47.112 "aliases": [ 00:19:47.112 "1423ac75-33d8-4681-a56b-e119b4984698" 00:19:47.112 ], 00:19:47.112 "product_name": "Malloc disk", 00:19:47.112 "block_size": 512, 00:19:47.112 "num_blocks": 65536, 00:19:47.112 "uuid": "1423ac75-33d8-4681-a56b-e119b4984698", 00:19:47.112 "assigned_rate_limits": { 00:19:47.112 "rw_ios_per_sec": 0, 00:19:47.112 "rw_mbytes_per_sec": 0, 00:19:47.112 "r_mbytes_per_sec": 0, 00:19:47.112 "w_mbytes_per_sec": 0 00:19:47.112 }, 00:19:47.112 "claimed": false, 00:19:47.112 "zoned": false, 00:19:47.112 "supported_io_types": { 00:19:47.112 "read": true, 00:19:47.112 "write": true, 00:19:47.112 "unmap": true, 00:19:47.112 "flush": true, 00:19:47.112 "reset": true, 00:19:47.113 "nvme_admin": false, 00:19:47.113 "nvme_io": false, 00:19:47.113 "nvme_io_md": false, 00:19:47.113 "write_zeroes": true, 00:19:47.113 "zcopy": true, 00:19:47.113 "get_zone_info": false, 00:19:47.113 "zone_management": false, 00:19:47.113 "zone_append": false, 00:19:47.113 "compare": false, 00:19:47.113 "compare_and_write": false, 00:19:47.113 "abort": true, 00:19:47.113 "seek_hole": false, 00:19:47.113 "seek_data": false, 00:19:47.113 "copy": true, 00:19:47.113 "nvme_iov_md": false 00:19:47.113 }, 00:19:47.113 "memory_domains": [ 00:19:47.113 { 00:19:47.113 "dma_device_id": "system", 00:19:47.113 "dma_device_type": 1 00:19:47.113 }, 00:19:47.113 { 00:19:47.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:47.113 "dma_device_type": 
2 00:19:47.113 } 00:19:47.113 ], 00:19:47.113 "driver_specific": {} 00:19:47.113 } 00:19:47.113 ] 00:19:47.371 00:14:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:47.371 00:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:47.371 00:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:47.371 00:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:47.629 [2024-07-16 00:14:34.563112] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:47.629 [2024-07-16 00:14:34.563158] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:47.629 [2024-07-16 00:14:34.563178] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:47.629 [2024-07-16 00:14:34.564546] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:47.629 [2024-07-16 00:14:34.564586] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:47.886 00:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:47.886 00:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:47.886 00:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:47.886 00:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:47.886 00:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:47.886 00:14:34 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:47.886 00:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:47.886 00:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:47.886 00:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:47.886 00:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:47.886 00:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:47.886 00:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:48.143 00:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:48.143 "name": "Existed_Raid", 00:19:48.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:48.143 "strip_size_kb": 64, 00:19:48.143 "state": "configuring", 00:19:48.143 "raid_level": "concat", 00:19:48.143 "superblock": false, 00:19:48.143 "num_base_bdevs": 4, 00:19:48.143 "num_base_bdevs_discovered": 3, 00:19:48.143 "num_base_bdevs_operational": 4, 00:19:48.143 "base_bdevs_list": [ 00:19:48.143 { 00:19:48.143 "name": "BaseBdev1", 00:19:48.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:48.143 "is_configured": false, 00:19:48.143 "data_offset": 0, 00:19:48.143 "data_size": 0 00:19:48.143 }, 00:19:48.143 { 00:19:48.143 "name": "BaseBdev2", 00:19:48.144 "uuid": "04e5b3ed-26ea-4004-8fe0-7d6d64433168", 00:19:48.144 "is_configured": true, 00:19:48.144 "data_offset": 0, 00:19:48.144 "data_size": 65536 00:19:48.144 }, 00:19:48.144 { 00:19:48.144 "name": "BaseBdev3", 00:19:48.144 "uuid": "b7910928-4a7f-459b-aa3a-3b8a9126a034", 00:19:48.144 "is_configured": true, 00:19:48.144 "data_offset": 0, 00:19:48.144 "data_size": 65536 
00:19:48.144 }, 00:19:48.144 { 00:19:48.144 "name": "BaseBdev4", 00:19:48.144 "uuid": "1423ac75-33d8-4681-a56b-e119b4984698", 00:19:48.144 "is_configured": true, 00:19:48.144 "data_offset": 0, 00:19:48.144 "data_size": 65536 00:19:48.144 } 00:19:48.144 ] 00:19:48.144 }' 00:19:48.144 00:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:48.144 00:14:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:48.709 00:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:48.966 [2024-07-16 00:14:35.686037] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:48.966 00:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:48.966 00:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:48.966 00:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:48.966 00:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:48.966 00:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:48.966 00:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:48.966 00:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:48.966 00:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:48.966 00:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:48.966 00:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:48.966 00:14:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.966 00:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:49.224 00:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:49.224 "name": "Existed_Raid", 00:19:49.224 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:49.224 "strip_size_kb": 64, 00:19:49.224 "state": "configuring", 00:19:49.224 "raid_level": "concat", 00:19:49.224 "superblock": false, 00:19:49.224 "num_base_bdevs": 4, 00:19:49.224 "num_base_bdevs_discovered": 2, 00:19:49.224 "num_base_bdevs_operational": 4, 00:19:49.224 "base_bdevs_list": [ 00:19:49.224 { 00:19:49.224 "name": "BaseBdev1", 00:19:49.224 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:49.224 "is_configured": false, 00:19:49.224 "data_offset": 0, 00:19:49.224 "data_size": 0 00:19:49.224 }, 00:19:49.224 { 00:19:49.224 "name": null, 00:19:49.224 "uuid": "04e5b3ed-26ea-4004-8fe0-7d6d64433168", 00:19:49.224 "is_configured": false, 00:19:49.224 "data_offset": 0, 00:19:49.224 "data_size": 65536 00:19:49.224 }, 00:19:49.224 { 00:19:49.224 "name": "BaseBdev3", 00:19:49.224 "uuid": "b7910928-4a7f-459b-aa3a-3b8a9126a034", 00:19:49.224 "is_configured": true, 00:19:49.224 "data_offset": 0, 00:19:49.224 "data_size": 65536 00:19:49.224 }, 00:19:49.224 { 00:19:49.224 "name": "BaseBdev4", 00:19:49.224 "uuid": "1423ac75-33d8-4681-a56b-e119b4984698", 00:19:49.224 "is_configured": true, 00:19:49.224 "data_offset": 0, 00:19:49.224 "data_size": 65536 00:19:49.224 } 00:19:49.224 ] 00:19:49.224 }' 00:19:49.224 00:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:49.224 00:14:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:49.789 00:14:36 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:49.789 00:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:50.047 00:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:50.047 00:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:50.047 [2024-07-16 00:14:36.984979] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:50.047 BaseBdev1 00:19:50.306 00:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:50.306 00:14:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:50.306 00:14:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:50.306 00:14:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:50.306 00:14:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:50.306 00:14:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:50.306 00:14:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:50.306 00:14:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:50.564 [ 00:19:50.564 { 00:19:50.564 "name": "BaseBdev1", 00:19:50.564 "aliases": [ 00:19:50.564 "fb616bda-bf58-4281-b63d-213250559186" 00:19:50.564 
], 00:19:50.564 "product_name": "Malloc disk", 00:19:50.564 "block_size": 512, 00:19:50.564 "num_blocks": 65536, 00:19:50.564 "uuid": "fb616bda-bf58-4281-b63d-213250559186", 00:19:50.564 "assigned_rate_limits": { 00:19:50.564 "rw_ios_per_sec": 0, 00:19:50.564 "rw_mbytes_per_sec": 0, 00:19:50.564 "r_mbytes_per_sec": 0, 00:19:50.564 "w_mbytes_per_sec": 0 00:19:50.564 }, 00:19:50.564 "claimed": true, 00:19:50.564 "claim_type": "exclusive_write", 00:19:50.564 "zoned": false, 00:19:50.564 "supported_io_types": { 00:19:50.564 "read": true, 00:19:50.564 "write": true, 00:19:50.564 "unmap": true, 00:19:50.564 "flush": true, 00:19:50.564 "reset": true, 00:19:50.564 "nvme_admin": false, 00:19:50.564 "nvme_io": false, 00:19:50.564 "nvme_io_md": false, 00:19:50.564 "write_zeroes": true, 00:19:50.564 "zcopy": true, 00:19:50.564 "get_zone_info": false, 00:19:50.564 "zone_management": false, 00:19:50.564 "zone_append": false, 00:19:50.564 "compare": false, 00:19:50.564 "compare_and_write": false, 00:19:50.564 "abort": true, 00:19:50.564 "seek_hole": false, 00:19:50.564 "seek_data": false, 00:19:50.564 "copy": true, 00:19:50.564 "nvme_iov_md": false 00:19:50.564 }, 00:19:50.564 "memory_domains": [ 00:19:50.564 { 00:19:50.564 "dma_device_id": "system", 00:19:50.564 "dma_device_type": 1 00:19:50.564 }, 00:19:50.564 { 00:19:50.564 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:50.564 "dma_device_type": 2 00:19:50.564 } 00:19:50.564 ], 00:19:50.564 "driver_specific": {} 00:19:50.564 } 00:19:50.564 ] 00:19:50.564 00:14:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:50.564 00:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:50.564 00:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:50.564 00:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:19:50.564 00:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:50.564 00:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:50.564 00:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:50.564 00:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:50.564 00:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:50.564 00:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:50.564 00:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:50.565 00:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:50.565 00:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:50.823 00:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:50.823 "name": "Existed_Raid", 00:19:50.823 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:50.823 "strip_size_kb": 64, 00:19:50.823 "state": "configuring", 00:19:50.823 "raid_level": "concat", 00:19:50.823 "superblock": false, 00:19:50.823 "num_base_bdevs": 4, 00:19:50.823 "num_base_bdevs_discovered": 3, 00:19:50.823 "num_base_bdevs_operational": 4, 00:19:50.823 "base_bdevs_list": [ 00:19:50.823 { 00:19:50.823 "name": "BaseBdev1", 00:19:50.823 "uuid": "fb616bda-bf58-4281-b63d-213250559186", 00:19:50.823 "is_configured": true, 00:19:50.823 "data_offset": 0, 00:19:50.823 "data_size": 65536 00:19:50.823 }, 00:19:50.823 { 00:19:50.823 "name": null, 00:19:50.823 "uuid": "04e5b3ed-26ea-4004-8fe0-7d6d64433168", 00:19:50.823 "is_configured": false, 00:19:50.823 
"data_offset": 0, 00:19:50.824 "data_size": 65536 00:19:50.824 }, 00:19:50.824 { 00:19:50.824 "name": "BaseBdev3", 00:19:50.824 "uuid": "b7910928-4a7f-459b-aa3a-3b8a9126a034", 00:19:50.824 "is_configured": true, 00:19:50.824 "data_offset": 0, 00:19:50.824 "data_size": 65536 00:19:50.824 }, 00:19:50.824 { 00:19:50.824 "name": "BaseBdev4", 00:19:50.824 "uuid": "1423ac75-33d8-4681-a56b-e119b4984698", 00:19:50.824 "is_configured": true, 00:19:50.824 "data_offset": 0, 00:19:50.824 "data_size": 65536 00:19:50.824 } 00:19:50.824 ] 00:19:50.824 }' 00:19:50.824 00:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:50.824 00:14:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:51.391 00:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.391 00:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:51.650 00:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:51.650 00:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:51.909 [2024-07-16 00:14:38.745700] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:51.909 00:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:51.909 00:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:51.910 00:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:51.910 00:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:19:51.910 00:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:51.910 00:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:51.910 00:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:51.910 00:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:51.910 00:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:51.910 00:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:51.910 00:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.910 00:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:52.169 00:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:52.169 "name": "Existed_Raid", 00:19:52.169 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:52.169 "strip_size_kb": 64, 00:19:52.169 "state": "configuring", 00:19:52.169 "raid_level": "concat", 00:19:52.169 "superblock": false, 00:19:52.169 "num_base_bdevs": 4, 00:19:52.169 "num_base_bdevs_discovered": 2, 00:19:52.169 "num_base_bdevs_operational": 4, 00:19:52.169 "base_bdevs_list": [ 00:19:52.169 { 00:19:52.169 "name": "BaseBdev1", 00:19:52.169 "uuid": "fb616bda-bf58-4281-b63d-213250559186", 00:19:52.169 "is_configured": true, 00:19:52.169 "data_offset": 0, 00:19:52.169 "data_size": 65536 00:19:52.169 }, 00:19:52.169 { 00:19:52.169 "name": null, 00:19:52.169 "uuid": "04e5b3ed-26ea-4004-8fe0-7d6d64433168", 00:19:52.169 "is_configured": false, 00:19:52.169 "data_offset": 0, 00:19:52.169 "data_size": 65536 00:19:52.169 }, 00:19:52.169 { 00:19:52.169 "name": 
null, 00:19:52.169 "uuid": "b7910928-4a7f-459b-aa3a-3b8a9126a034", 00:19:52.169 "is_configured": false, 00:19:52.169 "data_offset": 0, 00:19:52.169 "data_size": 65536 00:19:52.169 }, 00:19:52.169 { 00:19:52.169 "name": "BaseBdev4", 00:19:52.169 "uuid": "1423ac75-33d8-4681-a56b-e119b4984698", 00:19:52.169 "is_configured": true, 00:19:52.169 "data_offset": 0, 00:19:52.169 "data_size": 65536 00:19:52.169 } 00:19:52.169 ] 00:19:52.169 }' 00:19:52.169 00:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:52.169 00:14:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:52.737 00:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:52.737 00:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:52.996 00:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:52.996 00:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:53.255 [2024-07-16 00:14:40.121361] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:53.255 00:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:53.255 00:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:53.255 00:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:53.255 00:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:53.255 00:14:40 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:53.255 00:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:53.255 00:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:53.255 00:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:53.255 00:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:53.255 00:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:53.255 00:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.255 00:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:53.822 00:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:53.822 "name": "Existed_Raid", 00:19:53.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:53.822 "strip_size_kb": 64, 00:19:53.822 "state": "configuring", 00:19:53.822 "raid_level": "concat", 00:19:53.822 "superblock": false, 00:19:53.822 "num_base_bdevs": 4, 00:19:53.822 "num_base_bdevs_discovered": 3, 00:19:53.822 "num_base_bdevs_operational": 4, 00:19:53.822 "base_bdevs_list": [ 00:19:53.822 { 00:19:53.822 "name": "BaseBdev1", 00:19:53.822 "uuid": "fb616bda-bf58-4281-b63d-213250559186", 00:19:53.822 "is_configured": true, 00:19:53.823 "data_offset": 0, 00:19:53.823 "data_size": 65536 00:19:53.823 }, 00:19:53.823 { 00:19:53.823 "name": null, 00:19:53.823 "uuid": "04e5b3ed-26ea-4004-8fe0-7d6d64433168", 00:19:53.823 "is_configured": false, 00:19:53.823 "data_offset": 0, 00:19:53.823 "data_size": 65536 00:19:53.823 }, 00:19:53.823 { 00:19:53.823 "name": "BaseBdev3", 00:19:53.823 "uuid": "b7910928-4a7f-459b-aa3a-3b8a9126a034", 
00:19:53.823 "is_configured": true, 00:19:53.823 "data_offset": 0, 00:19:53.823 "data_size": 65536 00:19:53.823 }, 00:19:53.823 { 00:19:53.823 "name": "BaseBdev4", 00:19:53.823 "uuid": "1423ac75-33d8-4681-a56b-e119b4984698", 00:19:53.823 "is_configured": true, 00:19:53.823 "data_offset": 0, 00:19:53.823 "data_size": 65536 00:19:53.823 } 00:19:53.823 ] 00:19:53.823 }' 00:19:53.823 00:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:53.823 00:14:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:54.391 00:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:54.391 00:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:54.650 00:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:54.650 00:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:54.909 [2024-07-16 00:14:41.745674] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:54.909 00:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:54.909 00:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:54.909 00:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:54.909 00:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:54.909 00:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:54.909 00:14:41 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:54.909 00:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:54.909 00:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:54.909 00:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:54.909 00:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:54.909 00:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:54.909 00:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:55.168 00:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:55.168 "name": "Existed_Raid", 00:19:55.168 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:55.168 "strip_size_kb": 64, 00:19:55.168 "state": "configuring", 00:19:55.168 "raid_level": "concat", 00:19:55.168 "superblock": false, 00:19:55.168 "num_base_bdevs": 4, 00:19:55.168 "num_base_bdevs_discovered": 2, 00:19:55.168 "num_base_bdevs_operational": 4, 00:19:55.168 "base_bdevs_list": [ 00:19:55.168 { 00:19:55.168 "name": null, 00:19:55.168 "uuid": "fb616bda-bf58-4281-b63d-213250559186", 00:19:55.168 "is_configured": false, 00:19:55.168 "data_offset": 0, 00:19:55.168 "data_size": 65536 00:19:55.168 }, 00:19:55.168 { 00:19:55.168 "name": null, 00:19:55.168 "uuid": "04e5b3ed-26ea-4004-8fe0-7d6d64433168", 00:19:55.168 "is_configured": false, 00:19:55.168 "data_offset": 0, 00:19:55.168 "data_size": 65536 00:19:55.168 }, 00:19:55.168 { 00:19:55.168 "name": "BaseBdev3", 00:19:55.168 "uuid": "b7910928-4a7f-459b-aa3a-3b8a9126a034", 00:19:55.169 "is_configured": true, 00:19:55.169 "data_offset": 0, 00:19:55.169 "data_size": 65536 00:19:55.169 }, 
00:19:55.169 { 00:19:55.169 "name": "BaseBdev4", 00:19:55.169 "uuid": "1423ac75-33d8-4681-a56b-e119b4984698", 00:19:55.169 "is_configured": true, 00:19:55.169 "data_offset": 0, 00:19:55.169 "data_size": 65536 00:19:55.169 } 00:19:55.169 ] 00:19:55.169 }' 00:19:55.169 00:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:55.169 00:14:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:55.735 00:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:55.735 00:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:55.995 00:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:55.995 00:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:56.255 [2024-07-16 00:14:43.027734] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:56.255 00:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:56.255 00:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:56.255 00:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:56.255 00:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:56.255 00:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:56.255 00:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:56.255 
00:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:56.255 00:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:56.255 00:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:56.255 00:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:56.255 00:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.255 00:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:56.513 00:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:56.513 "name": "Existed_Raid", 00:19:56.513 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:56.513 "strip_size_kb": 64, 00:19:56.513 "state": "configuring", 00:19:56.513 "raid_level": "concat", 00:19:56.513 "superblock": false, 00:19:56.513 "num_base_bdevs": 4, 00:19:56.513 "num_base_bdevs_discovered": 3, 00:19:56.513 "num_base_bdevs_operational": 4, 00:19:56.513 "base_bdevs_list": [ 00:19:56.513 { 00:19:56.513 "name": null, 00:19:56.513 "uuid": "fb616bda-bf58-4281-b63d-213250559186", 00:19:56.513 "is_configured": false, 00:19:56.513 "data_offset": 0, 00:19:56.513 "data_size": 65536 00:19:56.513 }, 00:19:56.513 { 00:19:56.513 "name": "BaseBdev2", 00:19:56.513 "uuid": "04e5b3ed-26ea-4004-8fe0-7d6d64433168", 00:19:56.513 "is_configured": true, 00:19:56.513 "data_offset": 0, 00:19:56.513 "data_size": 65536 00:19:56.513 }, 00:19:56.513 { 00:19:56.513 "name": "BaseBdev3", 00:19:56.513 "uuid": "b7910928-4a7f-459b-aa3a-3b8a9126a034", 00:19:56.513 "is_configured": true, 00:19:56.513 "data_offset": 0, 00:19:56.513 "data_size": 65536 00:19:56.513 }, 00:19:56.513 { 00:19:56.513 "name": "BaseBdev4", 00:19:56.513 "uuid": 
"1423ac75-33d8-4681-a56b-e119b4984698", 00:19:56.513 "is_configured": true, 00:19:56.513 "data_offset": 0, 00:19:56.513 "data_size": 65536 00:19:56.513 } 00:19:56.513 ] 00:19:56.513 }' 00:19:56.513 00:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:56.513 00:14:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:57.080 00:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:57.080 00:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:57.338 00:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:57.339 00:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:57.339 00:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:57.597 00:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u fb616bda-bf58-4281-b63d-213250559186 00:19:57.855 [2024-07-16 00:14:44.663383] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:57.855 [2024-07-16 00:14:44.663422] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9ba040 00:19:57.855 [2024-07-16 00:14:44.663430] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:57.855 [2024-07-16 00:14:44.663625] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9b5a70 00:19:57.855 [2024-07-16 00:14:44.663742] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: 
raid bdev generic 0x9ba040 00:19:57.855 [2024-07-16 00:14:44.663752] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x9ba040 00:19:57.855 [2024-07-16 00:14:44.663911] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:57.855 NewBaseBdev 00:19:57.856 00:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:57.856 00:14:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:19:57.856 00:14:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:57.856 00:14:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:57.856 00:14:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:57.856 00:14:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:57.856 00:14:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:58.114 00:14:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:58.373 [ 00:19:58.373 { 00:19:58.373 "name": "NewBaseBdev", 00:19:58.373 "aliases": [ 00:19:58.373 "fb616bda-bf58-4281-b63d-213250559186" 00:19:58.373 ], 00:19:58.373 "product_name": "Malloc disk", 00:19:58.373 "block_size": 512, 00:19:58.373 "num_blocks": 65536, 00:19:58.373 "uuid": "fb616bda-bf58-4281-b63d-213250559186", 00:19:58.373 "assigned_rate_limits": { 00:19:58.373 "rw_ios_per_sec": 0, 00:19:58.373 "rw_mbytes_per_sec": 0, 00:19:58.373 "r_mbytes_per_sec": 0, 00:19:58.373 "w_mbytes_per_sec": 0 00:19:58.373 }, 00:19:58.373 "claimed": true, 00:19:58.373 
"claim_type": "exclusive_write", 00:19:58.373 "zoned": false, 00:19:58.373 "supported_io_types": { 00:19:58.373 "read": true, 00:19:58.373 "write": true, 00:19:58.373 "unmap": true, 00:19:58.373 "flush": true, 00:19:58.373 "reset": true, 00:19:58.373 "nvme_admin": false, 00:19:58.373 "nvme_io": false, 00:19:58.373 "nvme_io_md": false, 00:19:58.373 "write_zeroes": true, 00:19:58.373 "zcopy": true, 00:19:58.373 "get_zone_info": false, 00:19:58.373 "zone_management": false, 00:19:58.373 "zone_append": false, 00:19:58.373 "compare": false, 00:19:58.373 "compare_and_write": false, 00:19:58.373 "abort": true, 00:19:58.373 "seek_hole": false, 00:19:58.373 "seek_data": false, 00:19:58.373 "copy": true, 00:19:58.373 "nvme_iov_md": false 00:19:58.373 }, 00:19:58.373 "memory_domains": [ 00:19:58.373 { 00:19:58.373 "dma_device_id": "system", 00:19:58.373 "dma_device_type": 1 00:19:58.373 }, 00:19:58.373 { 00:19:58.373 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:58.373 "dma_device_type": 2 00:19:58.373 } 00:19:58.373 ], 00:19:58.373 "driver_specific": {} 00:19:58.373 } 00:19:58.373 ] 00:19:58.373 00:14:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:58.373 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:58.373 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:58.373 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:58.373 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:58.373 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:58.373 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:58.373 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- 
# local raid_bdev_info 00:19:58.373 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:58.373 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:58.373 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:58.373 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:58.373 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:58.632 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:58.632 "name": "Existed_Raid", 00:19:58.632 "uuid": "f0372062-34b5-459d-91bb-3898d7291e64", 00:19:58.632 "strip_size_kb": 64, 00:19:58.632 "state": "online", 00:19:58.632 "raid_level": "concat", 00:19:58.632 "superblock": false, 00:19:58.632 "num_base_bdevs": 4, 00:19:58.632 "num_base_bdevs_discovered": 4, 00:19:58.632 "num_base_bdevs_operational": 4, 00:19:58.632 "base_bdevs_list": [ 00:19:58.632 { 00:19:58.632 "name": "NewBaseBdev", 00:19:58.632 "uuid": "fb616bda-bf58-4281-b63d-213250559186", 00:19:58.632 "is_configured": true, 00:19:58.632 "data_offset": 0, 00:19:58.632 "data_size": 65536 00:19:58.632 }, 00:19:58.632 { 00:19:58.632 "name": "BaseBdev2", 00:19:58.632 "uuid": "04e5b3ed-26ea-4004-8fe0-7d6d64433168", 00:19:58.632 "is_configured": true, 00:19:58.632 "data_offset": 0, 00:19:58.632 "data_size": 65536 00:19:58.632 }, 00:19:58.632 { 00:19:58.632 "name": "BaseBdev3", 00:19:58.632 "uuid": "b7910928-4a7f-459b-aa3a-3b8a9126a034", 00:19:58.632 "is_configured": true, 00:19:58.632 "data_offset": 0, 00:19:58.632 "data_size": 65536 00:19:58.632 }, 00:19:58.632 { 00:19:58.632 "name": "BaseBdev4", 00:19:58.632 "uuid": "1423ac75-33d8-4681-a56b-e119b4984698", 00:19:58.632 "is_configured": 
true, 00:19:58.632 "data_offset": 0, 00:19:58.632 "data_size": 65536 00:19:58.632 } 00:19:58.632 ] 00:19:58.632 }' 00:19:58.632 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:58.632 00:14:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:59.200 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:59.200 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:59.200 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:59.200 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:59.200 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:59.200 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:59.200 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:59.200 00:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:59.460 [2024-07-16 00:14:46.227877] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:59.460 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:59.460 "name": "Existed_Raid", 00:19:59.460 "aliases": [ 00:19:59.460 "f0372062-34b5-459d-91bb-3898d7291e64" 00:19:59.460 ], 00:19:59.460 "product_name": "Raid Volume", 00:19:59.460 "block_size": 512, 00:19:59.460 "num_blocks": 262144, 00:19:59.460 "uuid": "f0372062-34b5-459d-91bb-3898d7291e64", 00:19:59.460 "assigned_rate_limits": { 00:19:59.460 "rw_ios_per_sec": 0, 00:19:59.460 "rw_mbytes_per_sec": 0, 00:19:59.460 "r_mbytes_per_sec": 0, 00:19:59.460 
"w_mbytes_per_sec": 0 00:19:59.460 }, 00:19:59.460 "claimed": false, 00:19:59.460 "zoned": false, 00:19:59.460 "supported_io_types": { 00:19:59.460 "read": true, 00:19:59.460 "write": true, 00:19:59.460 "unmap": true, 00:19:59.460 "flush": true, 00:19:59.460 "reset": true, 00:19:59.460 "nvme_admin": false, 00:19:59.460 "nvme_io": false, 00:19:59.460 "nvme_io_md": false, 00:19:59.460 "write_zeroes": true, 00:19:59.460 "zcopy": false, 00:19:59.460 "get_zone_info": false, 00:19:59.460 "zone_management": false, 00:19:59.460 "zone_append": false, 00:19:59.460 "compare": false, 00:19:59.460 "compare_and_write": false, 00:19:59.460 "abort": false, 00:19:59.460 "seek_hole": false, 00:19:59.460 "seek_data": false, 00:19:59.460 "copy": false, 00:19:59.460 "nvme_iov_md": false 00:19:59.460 }, 00:19:59.460 "memory_domains": [ 00:19:59.460 { 00:19:59.460 "dma_device_id": "system", 00:19:59.460 "dma_device_type": 1 00:19:59.460 }, 00:19:59.460 { 00:19:59.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:59.460 "dma_device_type": 2 00:19:59.460 }, 00:19:59.460 { 00:19:59.460 "dma_device_id": "system", 00:19:59.460 "dma_device_type": 1 00:19:59.460 }, 00:19:59.460 { 00:19:59.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:59.460 "dma_device_type": 2 00:19:59.460 }, 00:19:59.460 { 00:19:59.460 "dma_device_id": "system", 00:19:59.460 "dma_device_type": 1 00:19:59.460 }, 00:19:59.460 { 00:19:59.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:59.460 "dma_device_type": 2 00:19:59.460 }, 00:19:59.460 { 00:19:59.460 "dma_device_id": "system", 00:19:59.460 "dma_device_type": 1 00:19:59.460 }, 00:19:59.460 { 00:19:59.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:59.460 "dma_device_type": 2 00:19:59.460 } 00:19:59.460 ], 00:19:59.460 "driver_specific": { 00:19:59.460 "raid": { 00:19:59.460 "uuid": "f0372062-34b5-459d-91bb-3898d7291e64", 00:19:59.460 "strip_size_kb": 64, 00:19:59.460 "state": "online", 00:19:59.460 "raid_level": "concat", 00:19:59.460 "superblock": false, 
00:19:59.460 "num_base_bdevs": 4, 00:19:59.460 "num_base_bdevs_discovered": 4, 00:19:59.460 "num_base_bdevs_operational": 4, 00:19:59.460 "base_bdevs_list": [ 00:19:59.460 { 00:19:59.460 "name": "NewBaseBdev", 00:19:59.460 "uuid": "fb616bda-bf58-4281-b63d-213250559186", 00:19:59.460 "is_configured": true, 00:19:59.460 "data_offset": 0, 00:19:59.460 "data_size": 65536 00:19:59.460 }, 00:19:59.460 { 00:19:59.460 "name": "BaseBdev2", 00:19:59.460 "uuid": "04e5b3ed-26ea-4004-8fe0-7d6d64433168", 00:19:59.460 "is_configured": true, 00:19:59.460 "data_offset": 0, 00:19:59.460 "data_size": 65536 00:19:59.460 }, 00:19:59.460 { 00:19:59.460 "name": "BaseBdev3", 00:19:59.460 "uuid": "b7910928-4a7f-459b-aa3a-3b8a9126a034", 00:19:59.460 "is_configured": true, 00:19:59.460 "data_offset": 0, 00:19:59.460 "data_size": 65536 00:19:59.460 }, 00:19:59.460 { 00:19:59.460 "name": "BaseBdev4", 00:19:59.460 "uuid": "1423ac75-33d8-4681-a56b-e119b4984698", 00:19:59.460 "is_configured": true, 00:19:59.460 "data_offset": 0, 00:19:59.460 "data_size": 65536 00:19:59.460 } 00:19:59.460 ] 00:19:59.460 } 00:19:59.460 } 00:19:59.460 }' 00:19:59.460 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:59.460 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:59.460 BaseBdev2 00:19:59.460 BaseBdev3 00:19:59.460 BaseBdev4' 00:19:59.460 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:59.460 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:59.460 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:59.809 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
base_bdev_info='{ 00:19:59.809 "name": "NewBaseBdev", 00:19:59.809 "aliases": [ 00:19:59.809 "fb616bda-bf58-4281-b63d-213250559186" 00:19:59.809 ], 00:19:59.809 "product_name": "Malloc disk", 00:19:59.809 "block_size": 512, 00:19:59.809 "num_blocks": 65536, 00:19:59.809 "uuid": "fb616bda-bf58-4281-b63d-213250559186", 00:19:59.809 "assigned_rate_limits": { 00:19:59.809 "rw_ios_per_sec": 0, 00:19:59.809 "rw_mbytes_per_sec": 0, 00:19:59.809 "r_mbytes_per_sec": 0, 00:19:59.809 "w_mbytes_per_sec": 0 00:19:59.809 }, 00:19:59.809 "claimed": true, 00:19:59.809 "claim_type": "exclusive_write", 00:19:59.809 "zoned": false, 00:19:59.809 "supported_io_types": { 00:19:59.809 "read": true, 00:19:59.809 "write": true, 00:19:59.809 "unmap": true, 00:19:59.809 "flush": true, 00:19:59.809 "reset": true, 00:19:59.809 "nvme_admin": false, 00:19:59.809 "nvme_io": false, 00:19:59.809 "nvme_io_md": false, 00:19:59.809 "write_zeroes": true, 00:19:59.809 "zcopy": true, 00:19:59.809 "get_zone_info": false, 00:19:59.809 "zone_management": false, 00:19:59.809 "zone_append": false, 00:19:59.809 "compare": false, 00:19:59.809 "compare_and_write": false, 00:19:59.809 "abort": true, 00:19:59.809 "seek_hole": false, 00:19:59.809 "seek_data": false, 00:19:59.809 "copy": true, 00:19:59.809 "nvme_iov_md": false 00:19:59.809 }, 00:19:59.809 "memory_domains": [ 00:19:59.809 { 00:19:59.809 "dma_device_id": "system", 00:19:59.809 "dma_device_type": 1 00:19:59.809 }, 00:19:59.809 { 00:19:59.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:59.809 "dma_device_type": 2 00:19:59.809 } 00:19:59.809 ], 00:19:59.809 "driver_specific": {} 00:19:59.809 }' 00:19:59.809 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:59.809 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:59.809 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:59.809 00:14:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:59.809 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:00.074 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:00.074 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:00.074 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:00.074 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:00.074 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:00.074 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:00.074 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:00.074 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:00.074 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:00.074 00:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:00.331 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:00.331 "name": "BaseBdev2", 00:20:00.331 "aliases": [ 00:20:00.331 "04e5b3ed-26ea-4004-8fe0-7d6d64433168" 00:20:00.331 ], 00:20:00.331 "product_name": "Malloc disk", 00:20:00.331 "block_size": 512, 00:20:00.331 "num_blocks": 65536, 00:20:00.331 "uuid": "04e5b3ed-26ea-4004-8fe0-7d6d64433168", 00:20:00.331 "assigned_rate_limits": { 00:20:00.331 "rw_ios_per_sec": 0, 00:20:00.331 "rw_mbytes_per_sec": 0, 00:20:00.331 "r_mbytes_per_sec": 0, 00:20:00.331 "w_mbytes_per_sec": 0 00:20:00.331 }, 00:20:00.332 "claimed": true, 00:20:00.332 "claim_type": "exclusive_write", 
00:20:00.332 "zoned": false, 00:20:00.332 "supported_io_types": { 00:20:00.332 "read": true, 00:20:00.332 "write": true, 00:20:00.332 "unmap": true, 00:20:00.332 "flush": true, 00:20:00.332 "reset": true, 00:20:00.332 "nvme_admin": false, 00:20:00.332 "nvme_io": false, 00:20:00.332 "nvme_io_md": false, 00:20:00.332 "write_zeroes": true, 00:20:00.332 "zcopy": true, 00:20:00.332 "get_zone_info": false, 00:20:00.332 "zone_management": false, 00:20:00.332 "zone_append": false, 00:20:00.332 "compare": false, 00:20:00.332 "compare_and_write": false, 00:20:00.332 "abort": true, 00:20:00.332 "seek_hole": false, 00:20:00.332 "seek_data": false, 00:20:00.332 "copy": true, 00:20:00.332 "nvme_iov_md": false 00:20:00.332 }, 00:20:00.332 "memory_domains": [ 00:20:00.332 { 00:20:00.332 "dma_device_id": "system", 00:20:00.332 "dma_device_type": 1 00:20:00.332 }, 00:20:00.332 { 00:20:00.332 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:00.332 "dma_device_type": 2 00:20:00.332 } 00:20:00.332 ], 00:20:00.332 "driver_specific": {} 00:20:00.332 }' 00:20:00.332 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:00.332 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:00.332 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:00.332 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:00.332 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:00.590 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:00.590 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:00.590 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:00.590 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:00.590 00:14:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:00.590 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:00.590 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:00.590 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:00.590 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:00.590 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:00.849 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:00.849 "name": "BaseBdev3", 00:20:00.849 "aliases": [ 00:20:00.849 "b7910928-4a7f-459b-aa3a-3b8a9126a034" 00:20:00.849 ], 00:20:00.849 "product_name": "Malloc disk", 00:20:00.849 "block_size": 512, 00:20:00.849 "num_blocks": 65536, 00:20:00.849 "uuid": "b7910928-4a7f-459b-aa3a-3b8a9126a034", 00:20:00.849 "assigned_rate_limits": { 00:20:00.849 "rw_ios_per_sec": 0, 00:20:00.849 "rw_mbytes_per_sec": 0, 00:20:00.849 "r_mbytes_per_sec": 0, 00:20:00.849 "w_mbytes_per_sec": 0 00:20:00.849 }, 00:20:00.849 "claimed": true, 00:20:00.849 "claim_type": "exclusive_write", 00:20:00.849 "zoned": false, 00:20:00.849 "supported_io_types": { 00:20:00.849 "read": true, 00:20:00.849 "write": true, 00:20:00.849 "unmap": true, 00:20:00.849 "flush": true, 00:20:00.849 "reset": true, 00:20:00.849 "nvme_admin": false, 00:20:00.849 "nvme_io": false, 00:20:00.849 "nvme_io_md": false, 00:20:00.849 "write_zeroes": true, 00:20:00.849 "zcopy": true, 00:20:00.849 "get_zone_info": false, 00:20:00.849 "zone_management": false, 00:20:00.849 "zone_append": false, 00:20:00.849 "compare": false, 00:20:00.849 "compare_and_write": false, 00:20:00.849 "abort": true, 00:20:00.849 "seek_hole": false, 
00:20:00.849 "seek_data": false, 00:20:00.849 "copy": true, 00:20:00.849 "nvme_iov_md": false 00:20:00.849 }, 00:20:00.849 "memory_domains": [ 00:20:00.849 { 00:20:00.849 "dma_device_id": "system", 00:20:00.849 "dma_device_type": 1 00:20:00.849 }, 00:20:00.849 { 00:20:00.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:00.849 "dma_device_type": 2 00:20:00.849 } 00:20:00.849 ], 00:20:00.849 "driver_specific": {} 00:20:00.849 }' 00:20:00.849 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:00.849 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:01.108 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:01.108 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:01.108 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:01.108 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:01.108 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:01.108 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:01.108 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:01.108 00:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:01.108 00:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:01.366 00:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:01.366 00:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:01.366 00:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 
00:20:01.366 00:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:01.625 00:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:01.625 "name": "BaseBdev4", 00:20:01.625 "aliases": [ 00:20:01.625 "1423ac75-33d8-4681-a56b-e119b4984698" 00:20:01.625 ], 00:20:01.625 "product_name": "Malloc disk", 00:20:01.625 "block_size": 512, 00:20:01.625 "num_blocks": 65536, 00:20:01.625 "uuid": "1423ac75-33d8-4681-a56b-e119b4984698", 00:20:01.625 "assigned_rate_limits": { 00:20:01.625 "rw_ios_per_sec": 0, 00:20:01.625 "rw_mbytes_per_sec": 0, 00:20:01.625 "r_mbytes_per_sec": 0, 00:20:01.625 "w_mbytes_per_sec": 0 00:20:01.625 }, 00:20:01.625 "claimed": true, 00:20:01.625 "claim_type": "exclusive_write", 00:20:01.625 "zoned": false, 00:20:01.625 "supported_io_types": { 00:20:01.625 "read": true, 00:20:01.625 "write": true, 00:20:01.625 "unmap": true, 00:20:01.625 "flush": true, 00:20:01.625 "reset": true, 00:20:01.625 "nvme_admin": false, 00:20:01.625 "nvme_io": false, 00:20:01.625 "nvme_io_md": false, 00:20:01.625 "write_zeroes": true, 00:20:01.625 "zcopy": true, 00:20:01.625 "get_zone_info": false, 00:20:01.625 "zone_management": false, 00:20:01.625 "zone_append": false, 00:20:01.625 "compare": false, 00:20:01.625 "compare_and_write": false, 00:20:01.625 "abort": true, 00:20:01.625 "seek_hole": false, 00:20:01.625 "seek_data": false, 00:20:01.625 "copy": true, 00:20:01.625 "nvme_iov_md": false 00:20:01.625 }, 00:20:01.625 "memory_domains": [ 00:20:01.625 { 00:20:01.625 "dma_device_id": "system", 00:20:01.625 "dma_device_type": 1 00:20:01.625 }, 00:20:01.625 { 00:20:01.625 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:01.625 "dma_device_type": 2 00:20:01.625 } 00:20:01.625 ], 00:20:01.625 "driver_specific": {} 00:20:01.625 }' 00:20:01.625 00:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:01.625 00:14:48 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:01.625 00:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:01.625 00:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:01.625 00:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:01.625 00:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:01.625 00:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:01.625 00:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:01.884 00:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:01.884 00:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:01.884 00:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:01.884 00:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:01.884 00:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:02.143 [2024-07-16 00:14:48.890635] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:02.143 [2024-07-16 00:14:48.890666] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:02.143 [2024-07-16 00:14:48.890718] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:02.143 [2024-07-16 00:14:48.890776] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:02.143 [2024-07-16 00:14:48.890788] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9ba040 name Existed_Raid, state offline 00:20:02.143 00:14:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 3565942 00:20:02.143 00:14:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 3565942 ']' 00:20:02.143 00:14:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 3565942 00:20:02.143 00:14:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:20:02.143 00:14:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:02.143 00:14:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3565942 00:20:02.143 00:14:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:02.143 00:14:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:02.143 00:14:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3565942' 00:20:02.143 killing process with pid 3565942 00:20:02.143 00:14:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 3565942 00:20:02.143 [2024-07-16 00:14:48.956219] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:02.143 00:14:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 3565942 00:20:02.143 [2024-07-16 00:14:48.991965] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:02.402 00:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:20:02.402 00:20:02.402 real 0m33.951s 00:20:02.402 user 1m2.359s 00:20:02.402 sys 0m5.983s 00:20:02.402 00:14:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:02.402 00:14:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:02.402 ************************************ 00:20:02.402 END TEST raid_state_function_test 
00:20:02.402 ************************************ 00:20:02.402 00:14:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:02.402 00:14:49 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:20:02.402 00:14:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:02.402 00:14:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:02.402 00:14:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:02.402 ************************************ 00:20:02.402 START TEST raid_state_function_test_sb 00:20:02.402 ************************************ 00:20:02.402 00:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:20:02.402 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:20:02.402 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:02.402 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:20:02.402 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 
00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=3570975 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3570975' 00:20:02.403 Process raid pid: 3570975 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 3570975 /var/tmp/spdk-raid.sock 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3570975 ']' 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:02.403 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:02.403 00:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:02.403 [2024-07-16 00:14:49.351554] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:20:02.403 [2024-07-16 00:14:49.351623] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:02.662 [2024-07-16 00:14:49.483308] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:02.662 [2024-07-16 00:14:49.589421] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:02.921 [2024-07-16 00:14:49.654892] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:02.921 [2024-07-16 00:14:49.654938] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:03.489 00:14:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:03.490 00:14:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:20:03.490 00:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:03.749 [2024-07-16 00:14:50.526178] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:03.749 [2024-07-16 00:14:50.526221] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:03.749 [2024-07-16 00:14:50.526233] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:03.749 [2024-07-16 00:14:50.526245] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:03.749 [2024-07-16 00:14:50.526253] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:03.749 [2024-07-16 00:14:50.526264] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist 
now 00:20:03.749 [2024-07-16 00:14:50.526273] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:03.749 [2024-07-16 00:14:50.526284] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:03.749 00:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:03.749 00:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:03.749 00:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:03.749 00:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:03.749 00:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:03.749 00:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:03.749 00:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:03.749 00:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:03.749 00:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:03.749 00:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:03.749 00:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:03.749 00:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.007 00:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:04.007 "name": "Existed_Raid", 00:20:04.007 "uuid": 
"b90543b7-f638-4112-bd7c-aabe4d921d36", 00:20:04.007 "strip_size_kb": 64, 00:20:04.007 "state": "configuring", 00:20:04.007 "raid_level": "concat", 00:20:04.007 "superblock": true, 00:20:04.007 "num_base_bdevs": 4, 00:20:04.007 "num_base_bdevs_discovered": 0, 00:20:04.007 "num_base_bdevs_operational": 4, 00:20:04.007 "base_bdevs_list": [ 00:20:04.007 { 00:20:04.007 "name": "BaseBdev1", 00:20:04.007 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:04.007 "is_configured": false, 00:20:04.007 "data_offset": 0, 00:20:04.007 "data_size": 0 00:20:04.007 }, 00:20:04.007 { 00:20:04.007 "name": "BaseBdev2", 00:20:04.007 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:04.007 "is_configured": false, 00:20:04.008 "data_offset": 0, 00:20:04.008 "data_size": 0 00:20:04.008 }, 00:20:04.008 { 00:20:04.008 "name": "BaseBdev3", 00:20:04.008 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:04.008 "is_configured": false, 00:20:04.008 "data_offset": 0, 00:20:04.008 "data_size": 0 00:20:04.008 }, 00:20:04.008 { 00:20:04.008 "name": "BaseBdev4", 00:20:04.008 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:04.008 "is_configured": false, 00:20:04.008 "data_offset": 0, 00:20:04.008 "data_size": 0 00:20:04.008 } 00:20:04.008 ] 00:20:04.008 }' 00:20:04.008 00:14:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:04.008 00:14:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:04.573 00:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:04.831 [2024-07-16 00:14:51.576819] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:04.831 [2024-07-16 00:14:51.576849] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13c8aa0 name Existed_Raid, state configuring 00:20:04.831 00:14:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:05.089 [2024-07-16 00:14:51.825503] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:05.089 [2024-07-16 00:14:51.825528] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:05.089 [2024-07-16 00:14:51.825538] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:05.089 [2024-07-16 00:14:51.825550] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:05.089 [2024-07-16 00:14:51.825559] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:05.090 [2024-07-16 00:14:51.825570] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:05.090 [2024-07-16 00:14:51.825578] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:05.090 [2024-07-16 00:14:51.825589] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:05.090 00:14:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:05.348 [2024-07-16 00:14:52.084070] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:05.348 BaseBdev1 00:20:05.348 00:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:05.348 00:14:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:05.348 00:14:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 
-- # local bdev_timeout= 00:20:05.348 00:14:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:05.348 00:14:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:05.348 00:14:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:05.348 00:14:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:05.605 00:14:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:05.862 [ 00:20:05.862 { 00:20:05.862 "name": "BaseBdev1", 00:20:05.862 "aliases": [ 00:20:05.862 "398790e6-8047-4c5a-a66e-899e9dfc5a38" 00:20:05.862 ], 00:20:05.862 "product_name": "Malloc disk", 00:20:05.862 "block_size": 512, 00:20:05.862 "num_blocks": 65536, 00:20:05.862 "uuid": "398790e6-8047-4c5a-a66e-899e9dfc5a38", 00:20:05.862 "assigned_rate_limits": { 00:20:05.862 "rw_ios_per_sec": 0, 00:20:05.862 "rw_mbytes_per_sec": 0, 00:20:05.862 "r_mbytes_per_sec": 0, 00:20:05.862 "w_mbytes_per_sec": 0 00:20:05.862 }, 00:20:05.862 "claimed": true, 00:20:05.862 "claim_type": "exclusive_write", 00:20:05.862 "zoned": false, 00:20:05.862 "supported_io_types": { 00:20:05.862 "read": true, 00:20:05.862 "write": true, 00:20:05.862 "unmap": true, 00:20:05.862 "flush": true, 00:20:05.862 "reset": true, 00:20:05.862 "nvme_admin": false, 00:20:05.862 "nvme_io": false, 00:20:05.862 "nvme_io_md": false, 00:20:05.862 "write_zeroes": true, 00:20:05.863 "zcopy": true, 00:20:05.863 "get_zone_info": false, 00:20:05.863 "zone_management": false, 00:20:05.863 "zone_append": false, 00:20:05.863 "compare": false, 00:20:05.863 "compare_and_write": false, 00:20:05.863 "abort": true, 00:20:05.863 "seek_hole": 
false, 00:20:05.863 "seek_data": false, 00:20:05.863 "copy": true, 00:20:05.863 "nvme_iov_md": false 00:20:05.863 }, 00:20:05.863 "memory_domains": [ 00:20:05.863 { 00:20:05.863 "dma_device_id": "system", 00:20:05.863 "dma_device_type": 1 00:20:05.863 }, 00:20:05.863 { 00:20:05.863 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:05.863 "dma_device_type": 2 00:20:05.863 } 00:20:05.863 ], 00:20:05.863 "driver_specific": {} 00:20:05.863 } 00:20:05.863 ] 00:20:05.863 00:14:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:05.863 00:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:05.863 00:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:05.863 00:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:05.863 00:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:05.863 00:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:05.863 00:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:05.863 00:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:05.863 00:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:05.863 00:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:05.863 00:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:05.863 00:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.863 00:14:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:06.121 00:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:06.121 "name": "Existed_Raid", 00:20:06.121 "uuid": "4de256fa-f5f2-43b2-a638-2b7d01ebe097", 00:20:06.121 "strip_size_kb": 64, 00:20:06.121 "state": "configuring", 00:20:06.121 "raid_level": "concat", 00:20:06.121 "superblock": true, 00:20:06.121 "num_base_bdevs": 4, 00:20:06.121 "num_base_bdevs_discovered": 1, 00:20:06.121 "num_base_bdevs_operational": 4, 00:20:06.121 "base_bdevs_list": [ 00:20:06.121 { 00:20:06.121 "name": "BaseBdev1", 00:20:06.121 "uuid": "398790e6-8047-4c5a-a66e-899e9dfc5a38", 00:20:06.121 "is_configured": true, 00:20:06.121 "data_offset": 2048, 00:20:06.121 "data_size": 63488 00:20:06.121 }, 00:20:06.121 { 00:20:06.121 "name": "BaseBdev2", 00:20:06.121 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:06.121 "is_configured": false, 00:20:06.121 "data_offset": 0, 00:20:06.121 "data_size": 0 00:20:06.121 }, 00:20:06.121 { 00:20:06.121 "name": "BaseBdev3", 00:20:06.121 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:06.121 "is_configured": false, 00:20:06.121 "data_offset": 0, 00:20:06.121 "data_size": 0 00:20:06.121 }, 00:20:06.121 { 00:20:06.121 "name": "BaseBdev4", 00:20:06.121 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:06.121 "is_configured": false, 00:20:06.121 "data_offset": 0, 00:20:06.121 "data_size": 0 00:20:06.121 } 00:20:06.121 ] 00:20:06.121 }' 00:20:06.121 00:14:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:06.121 00:14:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:06.693 00:14:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:06.954 [2024-07-16 
00:14:53.664257] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:06.954 [2024-07-16 00:14:53.664292] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13c8310 name Existed_Raid, state configuring 00:20:06.954 00:14:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:07.211 [2024-07-16 00:14:53.912965] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:07.211 [2024-07-16 00:14:53.914387] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:07.211 [2024-07-16 00:14:53.914419] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:07.211 [2024-07-16 00:14:53.914429] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:07.211 [2024-07-16 00:14:53.914441] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:07.211 [2024-07-16 00:14:53.914450] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:07.211 [2024-07-16 00:14:53.914461] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:07.211 00:14:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:07.211 00:14:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:07.211 00:14:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:07.211 00:14:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:07.211 00:14:53 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:07.211 00:14:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:07.211 00:14:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:07.211 00:14:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:07.211 00:14:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:07.211 00:14:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:07.211 00:14:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:07.211 00:14:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:07.211 00:14:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:07.211 00:14:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:07.777 00:14:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:07.777 "name": "Existed_Raid", 00:20:07.777 "uuid": "966a7f99-2203-482a-98d8-50ee181f39f0", 00:20:07.777 "strip_size_kb": 64, 00:20:07.777 "state": "configuring", 00:20:07.777 "raid_level": "concat", 00:20:07.777 "superblock": true, 00:20:07.777 "num_base_bdevs": 4, 00:20:07.777 "num_base_bdevs_discovered": 1, 00:20:07.777 "num_base_bdevs_operational": 4, 00:20:07.777 "base_bdevs_list": [ 00:20:07.777 { 00:20:07.777 "name": "BaseBdev1", 00:20:07.777 "uuid": "398790e6-8047-4c5a-a66e-899e9dfc5a38", 00:20:07.777 "is_configured": true, 00:20:07.777 "data_offset": 2048, 00:20:07.777 "data_size": 63488 00:20:07.777 }, 00:20:07.777 { 00:20:07.777 "name": "BaseBdev2", 00:20:07.777 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:20:07.777 "is_configured": false, 00:20:07.777 "data_offset": 0, 00:20:07.777 "data_size": 0 00:20:07.777 }, 00:20:07.777 { 00:20:07.777 "name": "BaseBdev3", 00:20:07.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:07.777 "is_configured": false, 00:20:07.777 "data_offset": 0, 00:20:07.777 "data_size": 0 00:20:07.777 }, 00:20:07.777 { 00:20:07.777 "name": "BaseBdev4", 00:20:07.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:07.777 "is_configured": false, 00:20:07.777 "data_offset": 0, 00:20:07.777 "data_size": 0 00:20:07.777 } 00:20:07.777 ] 00:20:07.777 }' 00:20:07.777 00:14:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:07.777 00:14:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:08.343 00:14:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:08.911 [2024-07-16 00:14:55.553850] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:08.911 BaseBdev2 00:20:08.911 00:14:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:08.911 00:14:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:08.911 00:14:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:08.911 00:14:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:08.911 00:14:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:08.911 00:14:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:08.911 00:14:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:08.911 00:14:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:09.478 [ 00:20:09.478 { 00:20:09.478 "name": "BaseBdev2", 00:20:09.478 "aliases": [ 00:20:09.478 "b2a5408d-b1b5-46d9-9bb2-258fde0dcb37" 00:20:09.478 ], 00:20:09.478 "product_name": "Malloc disk", 00:20:09.478 "block_size": 512, 00:20:09.478 "num_blocks": 65536, 00:20:09.478 "uuid": "b2a5408d-b1b5-46d9-9bb2-258fde0dcb37", 00:20:09.478 "assigned_rate_limits": { 00:20:09.478 "rw_ios_per_sec": 0, 00:20:09.478 "rw_mbytes_per_sec": 0, 00:20:09.478 "r_mbytes_per_sec": 0, 00:20:09.478 "w_mbytes_per_sec": 0 00:20:09.478 }, 00:20:09.478 "claimed": true, 00:20:09.478 "claim_type": "exclusive_write", 00:20:09.478 "zoned": false, 00:20:09.478 "supported_io_types": { 00:20:09.478 "read": true, 00:20:09.478 "write": true, 00:20:09.478 "unmap": true, 00:20:09.478 "flush": true, 00:20:09.478 "reset": true, 00:20:09.478 "nvme_admin": false, 00:20:09.478 "nvme_io": false, 00:20:09.478 "nvme_io_md": false, 00:20:09.478 "write_zeroes": true, 00:20:09.478 "zcopy": true, 00:20:09.478 "get_zone_info": false, 00:20:09.478 "zone_management": false, 00:20:09.478 "zone_append": false, 00:20:09.478 "compare": false, 00:20:09.478 "compare_and_write": false, 00:20:09.478 "abort": true, 00:20:09.478 "seek_hole": false, 00:20:09.478 "seek_data": false, 00:20:09.478 "copy": true, 00:20:09.478 "nvme_iov_md": false 00:20:09.478 }, 00:20:09.478 "memory_domains": [ 00:20:09.478 { 00:20:09.478 "dma_device_id": "system", 00:20:09.478 "dma_device_type": 1 00:20:09.478 }, 00:20:09.478 { 00:20:09.478 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:09.478 "dma_device_type": 2 00:20:09.478 } 00:20:09.478 ], 00:20:09.478 "driver_specific": {} 00:20:09.478 } 00:20:09.478 ] 
00:20:09.478 00:14:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:09.478 00:14:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:09.478 00:14:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:09.478 00:14:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:09.478 00:14:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:09.478 00:14:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:09.478 00:14:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:09.478 00:14:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:09.478 00:14:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:09.478 00:14:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:09.478 00:14:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:09.478 00:14:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:09.478 00:14:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:09.478 00:14:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:09.478 00:14:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:09.737 00:14:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:09.737 "name": "Existed_Raid", 
00:20:09.737 "uuid": "966a7f99-2203-482a-98d8-50ee181f39f0", 00:20:09.737 "strip_size_kb": 64, 00:20:09.737 "state": "configuring", 00:20:09.737 "raid_level": "concat", 00:20:09.737 "superblock": true, 00:20:09.737 "num_base_bdevs": 4, 00:20:09.737 "num_base_bdevs_discovered": 2, 00:20:09.737 "num_base_bdevs_operational": 4, 00:20:09.737 "base_bdevs_list": [ 00:20:09.737 { 00:20:09.737 "name": "BaseBdev1", 00:20:09.737 "uuid": "398790e6-8047-4c5a-a66e-899e9dfc5a38", 00:20:09.737 "is_configured": true, 00:20:09.737 "data_offset": 2048, 00:20:09.737 "data_size": 63488 00:20:09.737 }, 00:20:09.737 { 00:20:09.737 "name": "BaseBdev2", 00:20:09.737 "uuid": "b2a5408d-b1b5-46d9-9bb2-258fde0dcb37", 00:20:09.737 "is_configured": true, 00:20:09.737 "data_offset": 2048, 00:20:09.737 "data_size": 63488 00:20:09.737 }, 00:20:09.737 { 00:20:09.737 "name": "BaseBdev3", 00:20:09.737 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:09.737 "is_configured": false, 00:20:09.737 "data_offset": 0, 00:20:09.737 "data_size": 0 00:20:09.737 }, 00:20:09.737 { 00:20:09.737 "name": "BaseBdev4", 00:20:09.737 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:09.737 "is_configured": false, 00:20:09.737 "data_offset": 0, 00:20:09.737 "data_size": 0 00:20:09.737 } 00:20:09.737 ] 00:20:09.737 }' 00:20:09.737 00:14:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:09.737 00:14:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:10.303 00:14:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:10.561 [2024-07-16 00:14:57.410139] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:10.561 BaseBdev3 00:20:10.561 00:14:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:10.561 
00:14:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:10.561 00:14:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:10.561 00:14:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:10.561 00:14:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:10.561 00:14:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:10.561 00:14:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:10.820 00:14:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:11.078 [ 00:20:11.078 { 00:20:11.078 "name": "BaseBdev3", 00:20:11.078 "aliases": [ 00:20:11.078 "0adc11aa-37db-4dff-80b5-b9ec10cfc525" 00:20:11.078 ], 00:20:11.078 "product_name": "Malloc disk", 00:20:11.078 "block_size": 512, 00:20:11.078 "num_blocks": 65536, 00:20:11.078 "uuid": "0adc11aa-37db-4dff-80b5-b9ec10cfc525", 00:20:11.078 "assigned_rate_limits": { 00:20:11.078 "rw_ios_per_sec": 0, 00:20:11.078 "rw_mbytes_per_sec": 0, 00:20:11.078 "r_mbytes_per_sec": 0, 00:20:11.078 "w_mbytes_per_sec": 0 00:20:11.078 }, 00:20:11.078 "claimed": true, 00:20:11.078 "claim_type": "exclusive_write", 00:20:11.078 "zoned": false, 00:20:11.078 "supported_io_types": { 00:20:11.078 "read": true, 00:20:11.078 "write": true, 00:20:11.078 "unmap": true, 00:20:11.078 "flush": true, 00:20:11.078 "reset": true, 00:20:11.078 "nvme_admin": false, 00:20:11.078 "nvme_io": false, 00:20:11.078 "nvme_io_md": false, 00:20:11.078 "write_zeroes": true, 00:20:11.078 "zcopy": true, 00:20:11.078 "get_zone_info": 
false, 00:20:11.078 "zone_management": false, 00:20:11.078 "zone_append": false, 00:20:11.078 "compare": false, 00:20:11.078 "compare_and_write": false, 00:20:11.078 "abort": true, 00:20:11.078 "seek_hole": false, 00:20:11.079 "seek_data": false, 00:20:11.079 "copy": true, 00:20:11.079 "nvme_iov_md": false 00:20:11.079 }, 00:20:11.079 "memory_domains": [ 00:20:11.079 { 00:20:11.079 "dma_device_id": "system", 00:20:11.079 "dma_device_type": 1 00:20:11.079 }, 00:20:11.079 { 00:20:11.079 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:11.079 "dma_device_type": 2 00:20:11.079 } 00:20:11.079 ], 00:20:11.079 "driver_specific": {} 00:20:11.079 } 00:20:11.079 ] 00:20:11.079 00:14:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:11.079 00:14:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:11.079 00:14:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:11.079 00:14:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:11.079 00:14:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:11.079 00:14:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:11.079 00:14:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:11.079 00:14:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:11.079 00:14:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:11.079 00:14:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:11.079 00:14:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:11.079 00:14:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:11.079 00:14:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:11.079 00:14:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.079 00:14:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:11.337 00:14:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:11.337 "name": "Existed_Raid", 00:20:11.337 "uuid": "966a7f99-2203-482a-98d8-50ee181f39f0", 00:20:11.337 "strip_size_kb": 64, 00:20:11.337 "state": "configuring", 00:20:11.337 "raid_level": "concat", 00:20:11.337 "superblock": true, 00:20:11.337 "num_base_bdevs": 4, 00:20:11.337 "num_base_bdevs_discovered": 3, 00:20:11.337 "num_base_bdevs_operational": 4, 00:20:11.337 "base_bdevs_list": [ 00:20:11.337 { 00:20:11.337 "name": "BaseBdev1", 00:20:11.337 "uuid": "398790e6-8047-4c5a-a66e-899e9dfc5a38", 00:20:11.337 "is_configured": true, 00:20:11.337 "data_offset": 2048, 00:20:11.337 "data_size": 63488 00:20:11.337 }, 00:20:11.337 { 00:20:11.337 "name": "BaseBdev2", 00:20:11.337 "uuid": "b2a5408d-b1b5-46d9-9bb2-258fde0dcb37", 00:20:11.337 "is_configured": true, 00:20:11.337 "data_offset": 2048, 00:20:11.337 "data_size": 63488 00:20:11.337 }, 00:20:11.337 { 00:20:11.337 "name": "BaseBdev3", 00:20:11.337 "uuid": "0adc11aa-37db-4dff-80b5-b9ec10cfc525", 00:20:11.337 "is_configured": true, 00:20:11.337 "data_offset": 2048, 00:20:11.337 "data_size": 63488 00:20:11.337 }, 00:20:11.337 { 00:20:11.337 "name": "BaseBdev4", 00:20:11.337 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:11.337 "is_configured": false, 00:20:11.337 "data_offset": 0, 00:20:11.337 "data_size": 0 00:20:11.337 } 00:20:11.337 ] 00:20:11.337 }' 00:20:11.337 
00:14:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:11.337 00:14:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:11.903 00:14:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:12.162 [2024-07-16 00:14:58.973681] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:12.162 [2024-07-16 00:14:58.973849] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13c9350 00:20:12.162 [2024-07-16 00:14:58.973864] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:12.162 [2024-07-16 00:14:58.974047] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13c9020 00:20:12.162 [2024-07-16 00:14:58.974171] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13c9350 00:20:12.162 [2024-07-16 00:14:58.974182] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13c9350 00:20:12.162 [2024-07-16 00:14:58.974269] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:12.162 BaseBdev4 00:20:12.162 00:14:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:12.162 00:14:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:12.162 00:14:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:12.162 00:14:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:12.162 00:14:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:12.162 00:14:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:20:12.162 00:14:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:12.420 00:14:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:12.678 [ 00:20:12.678 { 00:20:12.678 "name": "BaseBdev4", 00:20:12.678 "aliases": [ 00:20:12.678 "a4c9377e-9bf0-4af6-a1e7-4d6930de1164" 00:20:12.678 ], 00:20:12.678 "product_name": "Malloc disk", 00:20:12.678 "block_size": 512, 00:20:12.678 "num_blocks": 65536, 00:20:12.678 "uuid": "a4c9377e-9bf0-4af6-a1e7-4d6930de1164", 00:20:12.678 "assigned_rate_limits": { 00:20:12.678 "rw_ios_per_sec": 0, 00:20:12.678 "rw_mbytes_per_sec": 0, 00:20:12.678 "r_mbytes_per_sec": 0, 00:20:12.678 "w_mbytes_per_sec": 0 00:20:12.678 }, 00:20:12.678 "claimed": true, 00:20:12.678 "claim_type": "exclusive_write", 00:20:12.678 "zoned": false, 00:20:12.678 "supported_io_types": { 00:20:12.678 "read": true, 00:20:12.678 "write": true, 00:20:12.678 "unmap": true, 00:20:12.678 "flush": true, 00:20:12.678 "reset": true, 00:20:12.678 "nvme_admin": false, 00:20:12.678 "nvme_io": false, 00:20:12.678 "nvme_io_md": false, 00:20:12.678 "write_zeroes": true, 00:20:12.678 "zcopy": true, 00:20:12.678 "get_zone_info": false, 00:20:12.678 "zone_management": false, 00:20:12.678 "zone_append": false, 00:20:12.678 "compare": false, 00:20:12.678 "compare_and_write": false, 00:20:12.678 "abort": true, 00:20:12.678 "seek_hole": false, 00:20:12.678 "seek_data": false, 00:20:12.678 "copy": true, 00:20:12.678 "nvme_iov_md": false 00:20:12.678 }, 00:20:12.678 "memory_domains": [ 00:20:12.678 { 00:20:12.678 "dma_device_id": "system", 00:20:12.678 "dma_device_type": 1 00:20:12.678 }, 00:20:12.678 { 00:20:12.678 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:12.678 
"dma_device_type": 2 00:20:12.678 } 00:20:12.678 ], 00:20:12.678 "driver_specific": {} 00:20:12.678 } 00:20:12.678 ] 00:20:12.678 00:14:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:12.678 00:14:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:12.678 00:14:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:12.678 00:14:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:12.678 00:14:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:12.678 00:14:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:12.678 00:14:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:12.678 00:14:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:12.678 00:14:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:12.678 00:14:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:12.678 00:14:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:12.678 00:14:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:12.679 00:14:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:12.679 00:14:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.679 00:14:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:12.937 00:14:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:12.937 "name": "Existed_Raid", 00:20:12.937 "uuid": "966a7f99-2203-482a-98d8-50ee181f39f0", 00:20:12.937 "strip_size_kb": 64, 00:20:12.937 "state": "online", 00:20:12.937 "raid_level": "concat", 00:20:12.937 "superblock": true, 00:20:12.937 "num_base_bdevs": 4, 00:20:12.937 "num_base_bdevs_discovered": 4, 00:20:12.937 "num_base_bdevs_operational": 4, 00:20:12.937 "base_bdevs_list": [ 00:20:12.937 { 00:20:12.937 "name": "BaseBdev1", 00:20:12.937 "uuid": "398790e6-8047-4c5a-a66e-899e9dfc5a38", 00:20:12.937 "is_configured": true, 00:20:12.937 "data_offset": 2048, 00:20:12.937 "data_size": 63488 00:20:12.937 }, 00:20:12.937 { 00:20:12.937 "name": "BaseBdev2", 00:20:12.937 "uuid": "b2a5408d-b1b5-46d9-9bb2-258fde0dcb37", 00:20:12.937 "is_configured": true, 00:20:12.937 "data_offset": 2048, 00:20:12.937 "data_size": 63488 00:20:12.937 }, 00:20:12.937 { 00:20:12.937 "name": "BaseBdev3", 00:20:12.937 "uuid": "0adc11aa-37db-4dff-80b5-b9ec10cfc525", 00:20:12.937 "is_configured": true, 00:20:12.937 "data_offset": 2048, 00:20:12.937 "data_size": 63488 00:20:12.937 }, 00:20:12.937 { 00:20:12.937 "name": "BaseBdev4", 00:20:12.937 "uuid": "a4c9377e-9bf0-4af6-a1e7-4d6930de1164", 00:20:12.937 "is_configured": true, 00:20:12.937 "data_offset": 2048, 00:20:12.937 "data_size": 63488 00:20:12.937 } 00:20:12.937 ] 00:20:12.937 }' 00:20:12.937 00:14:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:12.937 00:14:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:13.504 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:13.504 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:13.504 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 
00:20:13.504 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:13.504 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:13.504 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:13.504 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:13.504 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:13.763 [2024-07-16 00:15:00.526122] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:13.763 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:13.763 "name": "Existed_Raid", 00:20:13.763 "aliases": [ 00:20:13.763 "966a7f99-2203-482a-98d8-50ee181f39f0" 00:20:13.763 ], 00:20:13.763 "product_name": "Raid Volume", 00:20:13.763 "block_size": 512, 00:20:13.763 "num_blocks": 253952, 00:20:13.763 "uuid": "966a7f99-2203-482a-98d8-50ee181f39f0", 00:20:13.763 "assigned_rate_limits": { 00:20:13.763 "rw_ios_per_sec": 0, 00:20:13.763 "rw_mbytes_per_sec": 0, 00:20:13.763 "r_mbytes_per_sec": 0, 00:20:13.763 "w_mbytes_per_sec": 0 00:20:13.763 }, 00:20:13.763 "claimed": false, 00:20:13.763 "zoned": false, 00:20:13.763 "supported_io_types": { 00:20:13.763 "read": true, 00:20:13.763 "write": true, 00:20:13.763 "unmap": true, 00:20:13.763 "flush": true, 00:20:13.763 "reset": true, 00:20:13.763 "nvme_admin": false, 00:20:13.763 "nvme_io": false, 00:20:13.763 "nvme_io_md": false, 00:20:13.763 "write_zeroes": true, 00:20:13.763 "zcopy": false, 00:20:13.763 "get_zone_info": false, 00:20:13.763 "zone_management": false, 00:20:13.763 "zone_append": false, 00:20:13.763 "compare": false, 00:20:13.763 "compare_and_write": false, 00:20:13.763 "abort": false, 00:20:13.763 "seek_hole": 
false, 00:20:13.763 "seek_data": false, 00:20:13.763 "copy": false, 00:20:13.763 "nvme_iov_md": false 00:20:13.763 }, 00:20:13.763 "memory_domains": [ 00:20:13.763 { 00:20:13.763 "dma_device_id": "system", 00:20:13.763 "dma_device_type": 1 00:20:13.763 }, 00:20:13.763 { 00:20:13.763 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.763 "dma_device_type": 2 00:20:13.763 }, 00:20:13.763 { 00:20:13.763 "dma_device_id": "system", 00:20:13.763 "dma_device_type": 1 00:20:13.763 }, 00:20:13.763 { 00:20:13.763 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.763 "dma_device_type": 2 00:20:13.763 }, 00:20:13.763 { 00:20:13.763 "dma_device_id": "system", 00:20:13.763 "dma_device_type": 1 00:20:13.763 }, 00:20:13.763 { 00:20:13.763 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.763 "dma_device_type": 2 00:20:13.763 }, 00:20:13.763 { 00:20:13.763 "dma_device_id": "system", 00:20:13.763 "dma_device_type": 1 00:20:13.763 }, 00:20:13.763 { 00:20:13.763 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.763 "dma_device_type": 2 00:20:13.763 } 00:20:13.763 ], 00:20:13.763 "driver_specific": { 00:20:13.763 "raid": { 00:20:13.763 "uuid": "966a7f99-2203-482a-98d8-50ee181f39f0", 00:20:13.763 "strip_size_kb": 64, 00:20:13.763 "state": "online", 00:20:13.763 "raid_level": "concat", 00:20:13.763 "superblock": true, 00:20:13.763 "num_base_bdevs": 4, 00:20:13.763 "num_base_bdevs_discovered": 4, 00:20:13.763 "num_base_bdevs_operational": 4, 00:20:13.763 "base_bdevs_list": [ 00:20:13.763 { 00:20:13.763 "name": "BaseBdev1", 00:20:13.763 "uuid": "398790e6-8047-4c5a-a66e-899e9dfc5a38", 00:20:13.763 "is_configured": true, 00:20:13.763 "data_offset": 2048, 00:20:13.763 "data_size": 63488 00:20:13.763 }, 00:20:13.763 { 00:20:13.763 "name": "BaseBdev2", 00:20:13.763 "uuid": "b2a5408d-b1b5-46d9-9bb2-258fde0dcb37", 00:20:13.763 "is_configured": true, 00:20:13.763 "data_offset": 2048, 00:20:13.763 "data_size": 63488 00:20:13.763 }, 00:20:13.763 { 00:20:13.763 "name": "BaseBdev3", 00:20:13.763 
"uuid": "0adc11aa-37db-4dff-80b5-b9ec10cfc525", 00:20:13.763 "is_configured": true, 00:20:13.763 "data_offset": 2048, 00:20:13.763 "data_size": 63488 00:20:13.763 }, 00:20:13.763 { 00:20:13.763 "name": "BaseBdev4", 00:20:13.763 "uuid": "a4c9377e-9bf0-4af6-a1e7-4d6930de1164", 00:20:13.763 "is_configured": true, 00:20:13.763 "data_offset": 2048, 00:20:13.763 "data_size": 63488 00:20:13.763 } 00:20:13.763 ] 00:20:13.763 } 00:20:13.763 } 00:20:13.763 }' 00:20:13.763 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:13.763 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:13.763 BaseBdev2 00:20:13.763 BaseBdev3 00:20:13.763 BaseBdev4' 00:20:13.763 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:13.763 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:13.763 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:14.042 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:14.042 "name": "BaseBdev1", 00:20:14.042 "aliases": [ 00:20:14.042 "398790e6-8047-4c5a-a66e-899e9dfc5a38" 00:20:14.042 ], 00:20:14.042 "product_name": "Malloc disk", 00:20:14.042 "block_size": 512, 00:20:14.042 "num_blocks": 65536, 00:20:14.042 "uuid": "398790e6-8047-4c5a-a66e-899e9dfc5a38", 00:20:14.042 "assigned_rate_limits": { 00:20:14.042 "rw_ios_per_sec": 0, 00:20:14.043 "rw_mbytes_per_sec": 0, 00:20:14.043 "r_mbytes_per_sec": 0, 00:20:14.043 "w_mbytes_per_sec": 0 00:20:14.043 }, 00:20:14.043 "claimed": true, 00:20:14.043 "claim_type": "exclusive_write", 00:20:14.043 "zoned": false, 00:20:14.043 "supported_io_types": { 
00:20:14.043 "read": true, 00:20:14.043 "write": true, 00:20:14.043 "unmap": true, 00:20:14.043 "flush": true, 00:20:14.043 "reset": true, 00:20:14.043 "nvme_admin": false, 00:20:14.043 "nvme_io": false, 00:20:14.043 "nvme_io_md": false, 00:20:14.043 "write_zeroes": true, 00:20:14.043 "zcopy": true, 00:20:14.043 "get_zone_info": false, 00:20:14.043 "zone_management": false, 00:20:14.043 "zone_append": false, 00:20:14.043 "compare": false, 00:20:14.043 "compare_and_write": false, 00:20:14.043 "abort": true, 00:20:14.043 "seek_hole": false, 00:20:14.043 "seek_data": false, 00:20:14.043 "copy": true, 00:20:14.043 "nvme_iov_md": false 00:20:14.043 }, 00:20:14.043 "memory_domains": [ 00:20:14.043 { 00:20:14.043 "dma_device_id": "system", 00:20:14.043 "dma_device_type": 1 00:20:14.043 }, 00:20:14.043 { 00:20:14.043 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:14.043 "dma_device_type": 2 00:20:14.043 } 00:20:14.043 ], 00:20:14.043 "driver_specific": {} 00:20:14.043 }' 00:20:14.043 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:14.043 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:14.043 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:14.043 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:14.043 00:15:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:14.300 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:14.300 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:14.300 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:14.300 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:14.300 00:15:01 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:14.300 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:14.300 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:14.300 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:14.300 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:14.300 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:14.559 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:14.559 "name": "BaseBdev2", 00:20:14.559 "aliases": [ 00:20:14.559 "b2a5408d-b1b5-46d9-9bb2-258fde0dcb37" 00:20:14.559 ], 00:20:14.559 "product_name": "Malloc disk", 00:20:14.559 "block_size": 512, 00:20:14.559 "num_blocks": 65536, 00:20:14.559 "uuid": "b2a5408d-b1b5-46d9-9bb2-258fde0dcb37", 00:20:14.559 "assigned_rate_limits": { 00:20:14.559 "rw_ios_per_sec": 0, 00:20:14.559 "rw_mbytes_per_sec": 0, 00:20:14.559 "r_mbytes_per_sec": 0, 00:20:14.559 "w_mbytes_per_sec": 0 00:20:14.559 }, 00:20:14.559 "claimed": true, 00:20:14.559 "claim_type": "exclusive_write", 00:20:14.559 "zoned": false, 00:20:14.559 "supported_io_types": { 00:20:14.559 "read": true, 00:20:14.559 "write": true, 00:20:14.559 "unmap": true, 00:20:14.559 "flush": true, 00:20:14.559 "reset": true, 00:20:14.559 "nvme_admin": false, 00:20:14.559 "nvme_io": false, 00:20:14.559 "nvme_io_md": false, 00:20:14.559 "write_zeroes": true, 00:20:14.559 "zcopy": true, 00:20:14.559 "get_zone_info": false, 00:20:14.559 "zone_management": false, 00:20:14.559 "zone_append": false, 00:20:14.559 "compare": false, 00:20:14.559 "compare_and_write": false, 00:20:14.559 "abort": true, 00:20:14.559 "seek_hole": false, 00:20:14.559 "seek_data": 
false, 00:20:14.559 "copy": true, 00:20:14.559 "nvme_iov_md": false 00:20:14.559 }, 00:20:14.559 "memory_domains": [ 00:20:14.559 { 00:20:14.559 "dma_device_id": "system", 00:20:14.559 "dma_device_type": 1 00:20:14.559 }, 00:20:14.559 { 00:20:14.559 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:14.559 "dma_device_type": 2 00:20:14.559 } 00:20:14.559 ], 00:20:14.559 "driver_specific": {} 00:20:14.559 }' 00:20:14.559 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:14.559 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:14.817 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:14.817 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:14.817 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:14.817 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:14.817 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:14.817 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:14.817 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:14.817 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:14.817 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:15.075 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:15.075 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:15.075 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:15.075 00:15:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:15.333 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:15.334 "name": "BaseBdev3", 00:20:15.334 "aliases": [ 00:20:15.334 "0adc11aa-37db-4dff-80b5-b9ec10cfc525" 00:20:15.334 ], 00:20:15.334 "product_name": "Malloc disk", 00:20:15.334 "block_size": 512, 00:20:15.334 "num_blocks": 65536, 00:20:15.334 "uuid": "0adc11aa-37db-4dff-80b5-b9ec10cfc525", 00:20:15.334 "assigned_rate_limits": { 00:20:15.334 "rw_ios_per_sec": 0, 00:20:15.334 "rw_mbytes_per_sec": 0, 00:20:15.334 "r_mbytes_per_sec": 0, 00:20:15.334 "w_mbytes_per_sec": 0 00:20:15.334 }, 00:20:15.334 "claimed": true, 00:20:15.334 "claim_type": "exclusive_write", 00:20:15.334 "zoned": false, 00:20:15.334 "supported_io_types": { 00:20:15.334 "read": true, 00:20:15.334 "write": true, 00:20:15.334 "unmap": true, 00:20:15.334 "flush": true, 00:20:15.334 "reset": true, 00:20:15.334 "nvme_admin": false, 00:20:15.334 "nvme_io": false, 00:20:15.334 "nvme_io_md": false, 00:20:15.334 "write_zeroes": true, 00:20:15.334 "zcopy": true, 00:20:15.334 "get_zone_info": false, 00:20:15.334 "zone_management": false, 00:20:15.334 "zone_append": false, 00:20:15.334 "compare": false, 00:20:15.334 "compare_and_write": false, 00:20:15.334 "abort": true, 00:20:15.334 "seek_hole": false, 00:20:15.334 "seek_data": false, 00:20:15.334 "copy": true, 00:20:15.334 "nvme_iov_md": false 00:20:15.334 }, 00:20:15.334 "memory_domains": [ 00:20:15.334 { 00:20:15.334 "dma_device_id": "system", 00:20:15.334 "dma_device_type": 1 00:20:15.334 }, 00:20:15.334 { 00:20:15.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:15.334 "dma_device_type": 2 00:20:15.334 } 00:20:15.334 ], 00:20:15.334 "driver_specific": {} 00:20:15.334 }' 00:20:15.334 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.334 00:15:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.334 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:15.334 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:15.334 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:15.334 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:15.334 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:15.334 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:15.593 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:15.593 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:15.593 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:15.593 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:15.593 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:15.593 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:15.593 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:15.852 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:15.852 "name": "BaseBdev4", 00:20:15.852 "aliases": [ 00:20:15.852 "a4c9377e-9bf0-4af6-a1e7-4d6930de1164" 00:20:15.852 ], 00:20:15.852 "product_name": "Malloc disk", 00:20:15.852 "block_size": 512, 00:20:15.852 "num_blocks": 65536, 00:20:15.852 "uuid": "a4c9377e-9bf0-4af6-a1e7-4d6930de1164", 00:20:15.852 
"assigned_rate_limits": { 00:20:15.852 "rw_ios_per_sec": 0, 00:20:15.852 "rw_mbytes_per_sec": 0, 00:20:15.852 "r_mbytes_per_sec": 0, 00:20:15.852 "w_mbytes_per_sec": 0 00:20:15.852 }, 00:20:15.852 "claimed": true, 00:20:15.852 "claim_type": "exclusive_write", 00:20:15.852 "zoned": false, 00:20:15.852 "supported_io_types": { 00:20:15.852 "read": true, 00:20:15.852 "write": true, 00:20:15.852 "unmap": true, 00:20:15.852 "flush": true, 00:20:15.852 "reset": true, 00:20:15.852 "nvme_admin": false, 00:20:15.852 "nvme_io": false, 00:20:15.852 "nvme_io_md": false, 00:20:15.852 "write_zeroes": true, 00:20:15.852 "zcopy": true, 00:20:15.852 "get_zone_info": false, 00:20:15.852 "zone_management": false, 00:20:15.852 "zone_append": false, 00:20:15.852 "compare": false, 00:20:15.852 "compare_and_write": false, 00:20:15.852 "abort": true, 00:20:15.852 "seek_hole": false, 00:20:15.852 "seek_data": false, 00:20:15.852 "copy": true, 00:20:15.852 "nvme_iov_md": false 00:20:15.852 }, 00:20:15.852 "memory_domains": [ 00:20:15.852 { 00:20:15.852 "dma_device_id": "system", 00:20:15.852 "dma_device_type": 1 00:20:15.852 }, 00:20:15.852 { 00:20:15.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:15.852 "dma_device_type": 2 00:20:15.852 } 00:20:15.852 ], 00:20:15.852 "driver_specific": {} 00:20:15.852 }' 00:20:15.852 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.852 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.852 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:15.852 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:15.852 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:15.852 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:15.852 00:15:02 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:15.852 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:16.111 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:16.111 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:16.111 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:16.111 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:16.111 00:15:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:16.369 [2024-07-16 00:15:03.148858] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:16.369 [2024-07-16 00:15:03.148882] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:16.369 [2024-07-16 00:15:03.148933] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:16.369 00:15:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:16.369 00:15:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:20:16.369 00:15:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:16.369 00:15:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:20:16.369 00:15:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:20:16.369 00:15:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:20:16.369 00:15:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:16.369 00:15:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:20:16.369 00:15:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:16.369 00:15:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:16.369 00:15:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:16.369 00:15:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:16.369 00:15:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:16.369 00:15:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:16.369 00:15:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:16.369 00:15:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:16.369 00:15:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:16.627 00:15:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:16.627 "name": "Existed_Raid", 00:20:16.627 "uuid": "966a7f99-2203-482a-98d8-50ee181f39f0", 00:20:16.627 "strip_size_kb": 64, 00:20:16.627 "state": "offline", 00:20:16.627 "raid_level": "concat", 00:20:16.627 "superblock": true, 00:20:16.627 "num_base_bdevs": 4, 00:20:16.627 "num_base_bdevs_discovered": 3, 00:20:16.627 "num_base_bdevs_operational": 3, 00:20:16.627 "base_bdevs_list": [ 00:20:16.627 { 00:20:16.627 "name": null, 00:20:16.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:16.627 "is_configured": false, 00:20:16.627 "data_offset": 2048, 00:20:16.627 "data_size": 63488 00:20:16.627 }, 00:20:16.627 { 00:20:16.627 "name": 
"BaseBdev2", 00:20:16.627 "uuid": "b2a5408d-b1b5-46d9-9bb2-258fde0dcb37", 00:20:16.627 "is_configured": true, 00:20:16.627 "data_offset": 2048, 00:20:16.627 "data_size": 63488 00:20:16.627 }, 00:20:16.627 { 00:20:16.627 "name": "BaseBdev3", 00:20:16.627 "uuid": "0adc11aa-37db-4dff-80b5-b9ec10cfc525", 00:20:16.627 "is_configured": true, 00:20:16.627 "data_offset": 2048, 00:20:16.627 "data_size": 63488 00:20:16.627 }, 00:20:16.627 { 00:20:16.627 "name": "BaseBdev4", 00:20:16.627 "uuid": "a4c9377e-9bf0-4af6-a1e7-4d6930de1164", 00:20:16.627 "is_configured": true, 00:20:16.627 "data_offset": 2048, 00:20:16.627 "data_size": 63488 00:20:16.627 } 00:20:16.627 ] 00:20:16.627 }' 00:20:16.627 00:15:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:16.627 00:15:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:17.192 00:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:17.192 00:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:17.192 00:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:17.192 00:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.450 00:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:17.450 00:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:17.450 00:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:17.708 [2024-07-16 00:15:04.533604] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:17.708 00:15:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:17.708 00:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:17.708 00:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.708 00:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:17.966 00:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:17.966 00:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:17.966 00:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:18.533 [2024-07-16 00:15:05.315807] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:18.533 00:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:18.533 00:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:18.533 00:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:18.533 00:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:18.792 00:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:18.792 00:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:18.792 00:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:19.051 [2024-07-16 00:15:05.837642] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:19.051 [2024-07-16 00:15:05.837682] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13c9350 name Existed_Raid, state offline 00:20:19.051 00:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:19.051 00:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:19.051 00:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.051 00:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:19.309 00:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:19.309 00:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:19.309 00:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:19.309 00:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:19.309 00:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:19.309 00:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:19.568 BaseBdev2 00:20:19.568 00:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:19.568 00:15:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:19.568 00:15:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:19.568 00:15:06 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:19.568 00:15:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:19.568 00:15:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:19.568 00:15:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:19.827 00:15:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:20.086 [ 00:20:20.086 { 00:20:20.086 "name": "BaseBdev2", 00:20:20.086 "aliases": [ 00:20:20.086 "14477e15-5480-40cd-aecf-1922edf9b77a" 00:20:20.086 ], 00:20:20.086 "product_name": "Malloc disk", 00:20:20.086 "block_size": 512, 00:20:20.086 "num_blocks": 65536, 00:20:20.086 "uuid": "14477e15-5480-40cd-aecf-1922edf9b77a", 00:20:20.086 "assigned_rate_limits": { 00:20:20.086 "rw_ios_per_sec": 0, 00:20:20.086 "rw_mbytes_per_sec": 0, 00:20:20.086 "r_mbytes_per_sec": 0, 00:20:20.086 "w_mbytes_per_sec": 0 00:20:20.086 }, 00:20:20.086 "claimed": false, 00:20:20.086 "zoned": false, 00:20:20.086 "supported_io_types": { 00:20:20.086 "read": true, 00:20:20.086 "write": true, 00:20:20.086 "unmap": true, 00:20:20.086 "flush": true, 00:20:20.086 "reset": true, 00:20:20.086 "nvme_admin": false, 00:20:20.086 "nvme_io": false, 00:20:20.086 "nvme_io_md": false, 00:20:20.086 "write_zeroes": true, 00:20:20.086 "zcopy": true, 00:20:20.086 "get_zone_info": false, 00:20:20.086 "zone_management": false, 00:20:20.086 "zone_append": false, 00:20:20.086 "compare": false, 00:20:20.086 "compare_and_write": false, 00:20:20.086 "abort": true, 00:20:20.086 "seek_hole": false, 00:20:20.086 "seek_data": false, 00:20:20.086 "copy": true, 00:20:20.086 "nvme_iov_md": 
false 00:20:20.086 }, 00:20:20.086 "memory_domains": [ 00:20:20.086 { 00:20:20.086 "dma_device_id": "system", 00:20:20.086 "dma_device_type": 1 00:20:20.086 }, 00:20:20.086 { 00:20:20.086 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:20.086 "dma_device_type": 2 00:20:20.086 } 00:20:20.086 ], 00:20:20.086 "driver_specific": {} 00:20:20.086 } 00:20:20.086 ] 00:20:20.086 00:15:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:20.086 00:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:20.086 00:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:20.086 00:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:20.344 BaseBdev3 00:20:20.344 00:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:20.344 00:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:20.344 00:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:20.344 00:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:20.344 00:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:20.344 00:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:20.344 00:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:20.602 00:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:20.861 [ 00:20:20.861 { 00:20:20.861 "name": "BaseBdev3", 00:20:20.861 "aliases": [ 00:20:20.861 "c5d113f2-6826-4915-9ce2-97a82e3a50bf" 00:20:20.861 ], 00:20:20.861 "product_name": "Malloc disk", 00:20:20.861 "block_size": 512, 00:20:20.861 "num_blocks": 65536, 00:20:20.861 "uuid": "c5d113f2-6826-4915-9ce2-97a82e3a50bf", 00:20:20.861 "assigned_rate_limits": { 00:20:20.861 "rw_ios_per_sec": 0, 00:20:20.861 "rw_mbytes_per_sec": 0, 00:20:20.861 "r_mbytes_per_sec": 0, 00:20:20.861 "w_mbytes_per_sec": 0 00:20:20.861 }, 00:20:20.861 "claimed": false, 00:20:20.861 "zoned": false, 00:20:20.861 "supported_io_types": { 00:20:20.861 "read": true, 00:20:20.861 "write": true, 00:20:20.861 "unmap": true, 00:20:20.861 "flush": true, 00:20:20.861 "reset": true, 00:20:20.861 "nvme_admin": false, 00:20:20.861 "nvme_io": false, 00:20:20.861 "nvme_io_md": false, 00:20:20.861 "write_zeroes": true, 00:20:20.861 "zcopy": true, 00:20:20.861 "get_zone_info": false, 00:20:20.861 "zone_management": false, 00:20:20.862 "zone_append": false, 00:20:20.862 "compare": false, 00:20:20.862 "compare_and_write": false, 00:20:20.862 "abort": true, 00:20:20.862 "seek_hole": false, 00:20:20.862 "seek_data": false, 00:20:20.862 "copy": true, 00:20:20.862 "nvme_iov_md": false 00:20:20.862 }, 00:20:20.862 "memory_domains": [ 00:20:20.862 { 00:20:20.862 "dma_device_id": "system", 00:20:20.862 "dma_device_type": 1 00:20:20.862 }, 00:20:20.862 { 00:20:20.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:20.862 "dma_device_type": 2 00:20:20.862 } 00:20:20.862 ], 00:20:20.862 "driver_specific": {} 00:20:20.862 } 00:20:20.862 ] 00:20:20.862 00:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:20.862 00:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:20.862 00:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 
00:20:20.862 00:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:21.120 BaseBdev4 00:20:21.120 00:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:21.120 00:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:21.120 00:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:21.120 00:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:21.120 00:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:21.120 00:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:21.120 00:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:21.379 00:15:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:21.379 [ 00:20:21.379 { 00:20:21.379 "name": "BaseBdev4", 00:20:21.379 "aliases": [ 00:20:21.379 "903bb458-5d6b-4597-8159-8395f10151b3" 00:20:21.379 ], 00:20:21.379 "product_name": "Malloc disk", 00:20:21.379 "block_size": 512, 00:20:21.379 "num_blocks": 65536, 00:20:21.379 "uuid": "903bb458-5d6b-4597-8159-8395f10151b3", 00:20:21.379 "assigned_rate_limits": { 00:20:21.379 "rw_ios_per_sec": 0, 00:20:21.379 "rw_mbytes_per_sec": 0, 00:20:21.379 "r_mbytes_per_sec": 0, 00:20:21.379 "w_mbytes_per_sec": 0 00:20:21.379 }, 00:20:21.379 "claimed": false, 00:20:21.379 "zoned": false, 00:20:21.379 "supported_io_types": { 00:20:21.379 
"read": true, 00:20:21.379 "write": true, 00:20:21.379 "unmap": true, 00:20:21.379 "flush": true, 00:20:21.379 "reset": true, 00:20:21.379 "nvme_admin": false, 00:20:21.379 "nvme_io": false, 00:20:21.379 "nvme_io_md": false, 00:20:21.379 "write_zeroes": true, 00:20:21.379 "zcopy": true, 00:20:21.379 "get_zone_info": false, 00:20:21.379 "zone_management": false, 00:20:21.379 "zone_append": false, 00:20:21.379 "compare": false, 00:20:21.379 "compare_and_write": false, 00:20:21.379 "abort": true, 00:20:21.379 "seek_hole": false, 00:20:21.379 "seek_data": false, 00:20:21.379 "copy": true, 00:20:21.379 "nvme_iov_md": false 00:20:21.379 }, 00:20:21.379 "memory_domains": [ 00:20:21.379 { 00:20:21.379 "dma_device_id": "system", 00:20:21.379 "dma_device_type": 1 00:20:21.379 }, 00:20:21.379 { 00:20:21.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.379 "dma_device_type": 2 00:20:21.379 } 00:20:21.379 ], 00:20:21.379 "driver_specific": {} 00:20:21.379 } 00:20:21.379 ] 00:20:21.379 00:15:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:21.379 00:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:21.379 00:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:21.637 00:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:21.637 [2024-07-16 00:15:08.498890] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:21.637 [2024-07-16 00:15:08.498936] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:21.637 [2024-07-16 00:15:08.498955] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:21.637 [2024-07-16 
00:15:08.500260] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:21.637 [2024-07-16 00:15:08.500303] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:21.637 00:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:21.638 00:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:21.638 00:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:21.638 00:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:21.638 00:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:21.638 00:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:21.638 00:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:21.638 00:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:21.638 00:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:21.638 00:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:21.638 00:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:21.638 00:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.896 00:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:21.896 "name": "Existed_Raid", 00:20:21.896 "uuid": "9f0af3e0-929c-4b35-bfb6-8492844a5ca6", 00:20:21.896 "strip_size_kb": 64, 
00:20:21.896 "state": "configuring", 00:20:21.896 "raid_level": "concat", 00:20:21.896 "superblock": true, 00:20:21.896 "num_base_bdevs": 4, 00:20:21.896 "num_base_bdevs_discovered": 3, 00:20:21.896 "num_base_bdevs_operational": 4, 00:20:21.896 "base_bdevs_list": [ 00:20:21.896 { 00:20:21.896 "name": "BaseBdev1", 00:20:21.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:21.896 "is_configured": false, 00:20:21.896 "data_offset": 0, 00:20:21.896 "data_size": 0 00:20:21.896 }, 00:20:21.896 { 00:20:21.896 "name": "BaseBdev2", 00:20:21.896 "uuid": "14477e15-5480-40cd-aecf-1922edf9b77a", 00:20:21.896 "is_configured": true, 00:20:21.896 "data_offset": 2048, 00:20:21.896 "data_size": 63488 00:20:21.896 }, 00:20:21.896 { 00:20:21.896 "name": "BaseBdev3", 00:20:21.896 "uuid": "c5d113f2-6826-4915-9ce2-97a82e3a50bf", 00:20:21.896 "is_configured": true, 00:20:21.896 "data_offset": 2048, 00:20:21.896 "data_size": 63488 00:20:21.896 }, 00:20:21.896 { 00:20:21.896 "name": "BaseBdev4", 00:20:21.896 "uuid": "903bb458-5d6b-4597-8159-8395f10151b3", 00:20:21.896 "is_configured": true, 00:20:21.896 "data_offset": 2048, 00:20:21.896 "data_size": 63488 00:20:21.896 } 00:20:21.896 ] 00:20:21.896 }' 00:20:21.896 00:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:21.896 00:15:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:22.463 00:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:22.722 [2024-07-16 00:15:09.481449] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:22.722 00:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:22.722 00:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:20:22.722 00:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:22.722 00:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:22.722 00:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:22.722 00:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:22.722 00:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:22.722 00:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:22.722 00:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:22.722 00:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:22.722 00:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.722 00:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:22.980 00:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:22.980 "name": "Existed_Raid", 00:20:22.980 "uuid": "9f0af3e0-929c-4b35-bfb6-8492844a5ca6", 00:20:22.980 "strip_size_kb": 64, 00:20:22.980 "state": "configuring", 00:20:22.980 "raid_level": "concat", 00:20:22.980 "superblock": true, 00:20:22.980 "num_base_bdevs": 4, 00:20:22.980 "num_base_bdevs_discovered": 2, 00:20:22.980 "num_base_bdevs_operational": 4, 00:20:22.980 "base_bdevs_list": [ 00:20:22.980 { 00:20:22.980 "name": "BaseBdev1", 00:20:22.980 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.980 "is_configured": false, 00:20:22.980 "data_offset": 0, 00:20:22.980 "data_size": 0 
00:20:22.980 }, 00:20:22.980 { 00:20:22.980 "name": null, 00:20:22.980 "uuid": "14477e15-5480-40cd-aecf-1922edf9b77a", 00:20:22.980 "is_configured": false, 00:20:22.980 "data_offset": 2048, 00:20:22.980 "data_size": 63488 00:20:22.980 }, 00:20:22.980 { 00:20:22.980 "name": "BaseBdev3", 00:20:22.980 "uuid": "c5d113f2-6826-4915-9ce2-97a82e3a50bf", 00:20:22.980 "is_configured": true, 00:20:22.980 "data_offset": 2048, 00:20:22.980 "data_size": 63488 00:20:22.980 }, 00:20:22.980 { 00:20:22.980 "name": "BaseBdev4", 00:20:22.980 "uuid": "903bb458-5d6b-4597-8159-8395f10151b3", 00:20:22.980 "is_configured": true, 00:20:22.980 "data_offset": 2048, 00:20:22.980 "data_size": 63488 00:20:22.980 } 00:20:22.980 ] 00:20:22.980 }' 00:20:22.980 00:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:22.980 00:15:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:23.546 00:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:23.546 00:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:23.804 00:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:23.804 00:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:24.062 [2024-07-16 00:15:10.829558] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:24.062 BaseBdev1 00:20:24.062 00:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:24.062 00:15:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 
00:20:24.062 00:15:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:24.062 00:15:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:24.062 00:15:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:24.062 00:15:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:24.062 00:15:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:24.321 00:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:24.581 [ 00:20:24.581 { 00:20:24.581 "name": "BaseBdev1", 00:20:24.581 "aliases": [ 00:20:24.581 "ac1a40b7-ab55-449e-a62b-647d4be51819" 00:20:24.581 ], 00:20:24.581 "product_name": "Malloc disk", 00:20:24.581 "block_size": 512, 00:20:24.581 "num_blocks": 65536, 00:20:24.581 "uuid": "ac1a40b7-ab55-449e-a62b-647d4be51819", 00:20:24.581 "assigned_rate_limits": { 00:20:24.581 "rw_ios_per_sec": 0, 00:20:24.581 "rw_mbytes_per_sec": 0, 00:20:24.581 "r_mbytes_per_sec": 0, 00:20:24.581 "w_mbytes_per_sec": 0 00:20:24.581 }, 00:20:24.581 "claimed": true, 00:20:24.581 "claim_type": "exclusive_write", 00:20:24.581 "zoned": false, 00:20:24.581 "supported_io_types": { 00:20:24.581 "read": true, 00:20:24.581 "write": true, 00:20:24.581 "unmap": true, 00:20:24.581 "flush": true, 00:20:24.581 "reset": true, 00:20:24.581 "nvme_admin": false, 00:20:24.581 "nvme_io": false, 00:20:24.581 "nvme_io_md": false, 00:20:24.581 "write_zeroes": true, 00:20:24.581 "zcopy": true, 00:20:24.581 "get_zone_info": false, 00:20:24.581 "zone_management": false, 00:20:24.581 "zone_append": false, 00:20:24.581 "compare": false, 
00:20:24.581 "compare_and_write": false, 00:20:24.581 "abort": true, 00:20:24.581 "seek_hole": false, 00:20:24.581 "seek_data": false, 00:20:24.581 "copy": true, 00:20:24.581 "nvme_iov_md": false 00:20:24.581 }, 00:20:24.581 "memory_domains": [ 00:20:24.581 { 00:20:24.581 "dma_device_id": "system", 00:20:24.581 "dma_device_type": 1 00:20:24.581 }, 00:20:24.581 { 00:20:24.581 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:24.581 "dma_device_type": 2 00:20:24.581 } 00:20:24.581 ], 00:20:24.581 "driver_specific": {} 00:20:24.581 } 00:20:24.581 ] 00:20:24.581 00:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:24.581 00:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:24.581 00:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:24.581 00:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:24.581 00:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:24.581 00:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:24.581 00:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:24.581 00:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:24.581 00:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:24.581 00:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:24.581 00:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:24.581 00:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:24.581 
00:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:24.841 00:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:24.841 "name": "Existed_Raid", 00:20:24.841 "uuid": "9f0af3e0-929c-4b35-bfb6-8492844a5ca6", 00:20:24.841 "strip_size_kb": 64, 00:20:24.841 "state": "configuring", 00:20:24.841 "raid_level": "concat", 00:20:24.841 "superblock": true, 00:20:24.841 "num_base_bdevs": 4, 00:20:24.841 "num_base_bdevs_discovered": 3, 00:20:24.841 "num_base_bdevs_operational": 4, 00:20:24.841 "base_bdevs_list": [ 00:20:24.841 { 00:20:24.841 "name": "BaseBdev1", 00:20:24.841 "uuid": "ac1a40b7-ab55-449e-a62b-647d4be51819", 00:20:24.841 "is_configured": true, 00:20:24.841 "data_offset": 2048, 00:20:24.841 "data_size": 63488 00:20:24.841 }, 00:20:24.841 { 00:20:24.841 "name": null, 00:20:24.841 "uuid": "14477e15-5480-40cd-aecf-1922edf9b77a", 00:20:24.841 "is_configured": false, 00:20:24.841 "data_offset": 2048, 00:20:24.841 "data_size": 63488 00:20:24.841 }, 00:20:24.841 { 00:20:24.841 "name": "BaseBdev3", 00:20:24.841 "uuid": "c5d113f2-6826-4915-9ce2-97a82e3a50bf", 00:20:24.841 "is_configured": true, 00:20:24.841 "data_offset": 2048, 00:20:24.841 "data_size": 63488 00:20:24.841 }, 00:20:24.841 { 00:20:24.841 "name": "BaseBdev4", 00:20:24.841 "uuid": "903bb458-5d6b-4597-8159-8395f10151b3", 00:20:24.841 "is_configured": true, 00:20:24.841 "data_offset": 2048, 00:20:24.841 "data_size": 63488 00:20:24.841 } 00:20:24.841 ] 00:20:24.841 }' 00:20:24.841 00:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:24.841 00:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:25.408 00:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.408 00:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:25.666 00:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:25.666 00:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:25.923 [2024-07-16 00:15:12.698574] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:25.923 00:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:25.923 00:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:25.923 00:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:25.923 00:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:25.923 00:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:25.923 00:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:25.923 00:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:25.923 00:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:25.923 00:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:25.923 00:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:25.923 00:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:20:25.923 00:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:26.181 00:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:26.181 "name": "Existed_Raid", 00:20:26.181 "uuid": "9f0af3e0-929c-4b35-bfb6-8492844a5ca6", 00:20:26.181 "strip_size_kb": 64, 00:20:26.181 "state": "configuring", 00:20:26.181 "raid_level": "concat", 00:20:26.181 "superblock": true, 00:20:26.181 "num_base_bdevs": 4, 00:20:26.181 "num_base_bdevs_discovered": 2, 00:20:26.181 "num_base_bdevs_operational": 4, 00:20:26.181 "base_bdevs_list": [ 00:20:26.181 { 00:20:26.181 "name": "BaseBdev1", 00:20:26.181 "uuid": "ac1a40b7-ab55-449e-a62b-647d4be51819", 00:20:26.181 "is_configured": true, 00:20:26.181 "data_offset": 2048, 00:20:26.181 "data_size": 63488 00:20:26.181 }, 00:20:26.181 { 00:20:26.181 "name": null, 00:20:26.181 "uuid": "14477e15-5480-40cd-aecf-1922edf9b77a", 00:20:26.181 "is_configured": false, 00:20:26.181 "data_offset": 2048, 00:20:26.181 "data_size": 63488 00:20:26.181 }, 00:20:26.181 { 00:20:26.181 "name": null, 00:20:26.181 "uuid": "c5d113f2-6826-4915-9ce2-97a82e3a50bf", 00:20:26.181 "is_configured": false, 00:20:26.181 "data_offset": 2048, 00:20:26.181 "data_size": 63488 00:20:26.181 }, 00:20:26.181 { 00:20:26.181 "name": "BaseBdev4", 00:20:26.181 "uuid": "903bb458-5d6b-4597-8159-8395f10151b3", 00:20:26.181 "is_configured": true, 00:20:26.181 "data_offset": 2048, 00:20:26.181 "data_size": 63488 00:20:26.181 } 00:20:26.181 ] 00:20:26.181 }' 00:20:26.181 00:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:26.181 00:15:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:26.745 00:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:20:26.745 00:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:27.002 00:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:27.002 00:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:27.260 [2024-07-16 00:15:14.094306] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:27.260 00:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:27.260 00:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:27.260 00:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:27.260 00:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:27.260 00:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:27.260 00:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:27.260 00:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:27.260 00:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:27.260 00:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:27.260 00:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:27.260 00:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:20:27.260 00:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:27.517 00:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:27.517 "name": "Existed_Raid", 00:20:27.517 "uuid": "9f0af3e0-929c-4b35-bfb6-8492844a5ca6", 00:20:27.517 "strip_size_kb": 64, 00:20:27.517 "state": "configuring", 00:20:27.517 "raid_level": "concat", 00:20:27.517 "superblock": true, 00:20:27.517 "num_base_bdevs": 4, 00:20:27.517 "num_base_bdevs_discovered": 3, 00:20:27.517 "num_base_bdevs_operational": 4, 00:20:27.517 "base_bdevs_list": [ 00:20:27.517 { 00:20:27.517 "name": "BaseBdev1", 00:20:27.517 "uuid": "ac1a40b7-ab55-449e-a62b-647d4be51819", 00:20:27.517 "is_configured": true, 00:20:27.517 "data_offset": 2048, 00:20:27.517 "data_size": 63488 00:20:27.517 }, 00:20:27.517 { 00:20:27.517 "name": null, 00:20:27.517 "uuid": "14477e15-5480-40cd-aecf-1922edf9b77a", 00:20:27.517 "is_configured": false, 00:20:27.517 "data_offset": 2048, 00:20:27.517 "data_size": 63488 00:20:27.517 }, 00:20:27.517 { 00:20:27.517 "name": "BaseBdev3", 00:20:27.517 "uuid": "c5d113f2-6826-4915-9ce2-97a82e3a50bf", 00:20:27.517 "is_configured": true, 00:20:27.517 "data_offset": 2048, 00:20:27.517 "data_size": 63488 00:20:27.517 }, 00:20:27.517 { 00:20:27.517 "name": "BaseBdev4", 00:20:27.517 "uuid": "903bb458-5d6b-4597-8159-8395f10151b3", 00:20:27.517 "is_configured": true, 00:20:27.517 "data_offset": 2048, 00:20:27.517 "data_size": 63488 00:20:27.517 } 00:20:27.517 ] 00:20:27.517 }' 00:20:27.517 00:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:27.517 00:15:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:28.116 00:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:20:28.116 00:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:28.393 00:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:28.393 00:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:28.958 [2024-07-16 00:15:15.690551] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:28.958 00:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:28.958 00:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:28.958 00:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:28.958 00:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:28.958 00:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:28.958 00:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:28.958 00:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:28.958 00:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:28.958 00:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:28.958 00:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:28.958 00:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.958 
00:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:29.525 00:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:29.525 "name": "Existed_Raid", 00:20:29.525 "uuid": "9f0af3e0-929c-4b35-bfb6-8492844a5ca6", 00:20:29.525 "strip_size_kb": 64, 00:20:29.525 "state": "configuring", 00:20:29.525 "raid_level": "concat", 00:20:29.525 "superblock": true, 00:20:29.525 "num_base_bdevs": 4, 00:20:29.525 "num_base_bdevs_discovered": 2, 00:20:29.525 "num_base_bdevs_operational": 4, 00:20:29.525 "base_bdevs_list": [ 00:20:29.525 { 00:20:29.525 "name": null, 00:20:29.525 "uuid": "ac1a40b7-ab55-449e-a62b-647d4be51819", 00:20:29.525 "is_configured": false, 00:20:29.525 "data_offset": 2048, 00:20:29.525 "data_size": 63488 00:20:29.525 }, 00:20:29.525 { 00:20:29.525 "name": null, 00:20:29.525 "uuid": "14477e15-5480-40cd-aecf-1922edf9b77a", 00:20:29.525 "is_configured": false, 00:20:29.525 "data_offset": 2048, 00:20:29.525 "data_size": 63488 00:20:29.525 }, 00:20:29.525 { 00:20:29.525 "name": "BaseBdev3", 00:20:29.525 "uuid": "c5d113f2-6826-4915-9ce2-97a82e3a50bf", 00:20:29.525 "is_configured": true, 00:20:29.525 "data_offset": 2048, 00:20:29.525 "data_size": 63488 00:20:29.525 }, 00:20:29.525 { 00:20:29.525 "name": "BaseBdev4", 00:20:29.525 "uuid": "903bb458-5d6b-4597-8159-8395f10151b3", 00:20:29.525 "is_configured": true, 00:20:29.525 "data_offset": 2048, 00:20:29.525 "data_size": 63488 00:20:29.525 } 00:20:29.525 ] 00:20:29.525 }' 00:20:29.525 00:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:29.525 00:15:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:30.458 00:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.458 00:15:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:30.458 00:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:30.458 00:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:31.025 [2024-07-16 00:15:17.756145] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:31.025 00:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:31.025 00:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:31.025 00:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:31.025 00:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:31.025 00:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:31.025 00:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:31.025 00:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:31.025 00:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:31.025 00:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:31.026 00:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:31.026 00:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.026 00:15:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:31.593 00:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:31.593 "name": "Existed_Raid", 00:20:31.593 "uuid": "9f0af3e0-929c-4b35-bfb6-8492844a5ca6", 00:20:31.593 "strip_size_kb": 64, 00:20:31.593 "state": "configuring", 00:20:31.593 "raid_level": "concat", 00:20:31.593 "superblock": true, 00:20:31.593 "num_base_bdevs": 4, 00:20:31.593 "num_base_bdevs_discovered": 3, 00:20:31.593 "num_base_bdevs_operational": 4, 00:20:31.593 "base_bdevs_list": [ 00:20:31.593 { 00:20:31.593 "name": null, 00:20:31.593 "uuid": "ac1a40b7-ab55-449e-a62b-647d4be51819", 00:20:31.593 "is_configured": false, 00:20:31.593 "data_offset": 2048, 00:20:31.593 "data_size": 63488 00:20:31.593 }, 00:20:31.593 { 00:20:31.593 "name": "BaseBdev2", 00:20:31.593 "uuid": "14477e15-5480-40cd-aecf-1922edf9b77a", 00:20:31.593 "is_configured": true, 00:20:31.593 "data_offset": 2048, 00:20:31.593 "data_size": 63488 00:20:31.593 }, 00:20:31.593 { 00:20:31.593 "name": "BaseBdev3", 00:20:31.593 "uuid": "c5d113f2-6826-4915-9ce2-97a82e3a50bf", 00:20:31.593 "is_configured": true, 00:20:31.593 "data_offset": 2048, 00:20:31.593 "data_size": 63488 00:20:31.593 }, 00:20:31.593 { 00:20:31.593 "name": "BaseBdev4", 00:20:31.593 "uuid": "903bb458-5d6b-4597-8159-8395f10151b3", 00:20:31.593 "is_configured": true, 00:20:31.593 "data_offset": 2048, 00:20:31.593 "data_size": 63488 00:20:31.593 } 00:20:31.593 ] 00:20:31.593 }' 00:20:31.593 00:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:31.593 00:15:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:32.530 00:15:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:32.530 00:15:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:32.530 00:15:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:32.530 00:15:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:32.530 00:15:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:33.098 00:15:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u ac1a40b7-ab55-449e-a62b-647d4be51819 00:20:33.358 [2024-07-16 00:15:20.073762] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:33.358 [2024-07-16 00:15:20.073916] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13cb850 00:20:33.358 [2024-07-16 00:15:20.073941] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:33.358 [2024-07-16 00:15:20.074115] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13c1d80 00:20:33.358 [2024-07-16 00:15:20.074230] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13cb850 00:20:33.358 [2024-07-16 00:15:20.074240] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13cb850 00:20:33.358 [2024-07-16 00:15:20.074326] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:33.358 NewBaseBdev 00:20:33.358 00:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:33.358 00:15:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:20:33.358 00:15:20 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:33.358 00:15:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:33.358 00:15:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:33.358 00:15:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:33.358 00:15:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:33.616 00:15:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:33.876 [ 00:20:33.876 { 00:20:33.876 "name": "NewBaseBdev", 00:20:33.876 "aliases": [ 00:20:33.876 "ac1a40b7-ab55-449e-a62b-647d4be51819" 00:20:33.876 ], 00:20:33.876 "product_name": "Malloc disk", 00:20:33.876 "block_size": 512, 00:20:33.876 "num_blocks": 65536, 00:20:33.876 "uuid": "ac1a40b7-ab55-449e-a62b-647d4be51819", 00:20:33.876 "assigned_rate_limits": { 00:20:33.876 "rw_ios_per_sec": 0, 00:20:33.876 "rw_mbytes_per_sec": 0, 00:20:33.876 "r_mbytes_per_sec": 0, 00:20:33.876 "w_mbytes_per_sec": 0 00:20:33.876 }, 00:20:33.876 "claimed": true, 00:20:33.876 "claim_type": "exclusive_write", 00:20:33.876 "zoned": false, 00:20:33.876 "supported_io_types": { 00:20:33.876 "read": true, 00:20:33.876 "write": true, 00:20:33.876 "unmap": true, 00:20:33.876 "flush": true, 00:20:33.876 "reset": true, 00:20:33.876 "nvme_admin": false, 00:20:33.876 "nvme_io": false, 00:20:33.876 "nvme_io_md": false, 00:20:33.876 "write_zeroes": true, 00:20:33.876 "zcopy": true, 00:20:33.876 "get_zone_info": false, 00:20:33.876 "zone_management": false, 00:20:33.876 "zone_append": false, 00:20:33.876 "compare": false, 00:20:33.876 
"compare_and_write": false, 00:20:33.876 "abort": true, 00:20:33.876 "seek_hole": false, 00:20:33.876 "seek_data": false, 00:20:33.876 "copy": true, 00:20:33.876 "nvme_iov_md": false 00:20:33.876 }, 00:20:33.876 "memory_domains": [ 00:20:33.876 { 00:20:33.876 "dma_device_id": "system", 00:20:33.876 "dma_device_type": 1 00:20:33.876 }, 00:20:33.876 { 00:20:33.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:33.876 "dma_device_type": 2 00:20:33.876 } 00:20:33.876 ], 00:20:33.876 "driver_specific": {} 00:20:33.876 } 00:20:33.876 ] 00:20:33.876 00:15:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:33.876 00:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:33.876 00:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:33.876 00:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:33.876 00:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:33.876 00:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:33.876 00:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:33.876 00:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:33.876 00:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:33.876 00:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:33.876 00:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:33.876 00:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:33.876 00:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:34.444 00:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:34.444 "name": "Existed_Raid", 00:20:34.444 "uuid": "9f0af3e0-929c-4b35-bfb6-8492844a5ca6", 00:20:34.444 "strip_size_kb": 64, 00:20:34.444 "state": "online", 00:20:34.444 "raid_level": "concat", 00:20:34.444 "superblock": true, 00:20:34.444 "num_base_bdevs": 4, 00:20:34.444 "num_base_bdevs_discovered": 4, 00:20:34.444 "num_base_bdevs_operational": 4, 00:20:34.444 "base_bdevs_list": [ 00:20:34.444 { 00:20:34.444 "name": "NewBaseBdev", 00:20:34.444 "uuid": "ac1a40b7-ab55-449e-a62b-647d4be51819", 00:20:34.444 "is_configured": true, 00:20:34.444 "data_offset": 2048, 00:20:34.444 "data_size": 63488 00:20:34.444 }, 00:20:34.444 { 00:20:34.444 "name": "BaseBdev2", 00:20:34.444 "uuid": "14477e15-5480-40cd-aecf-1922edf9b77a", 00:20:34.444 "is_configured": true, 00:20:34.444 "data_offset": 2048, 00:20:34.444 "data_size": 63488 00:20:34.444 }, 00:20:34.444 { 00:20:34.444 "name": "BaseBdev3", 00:20:34.444 "uuid": "c5d113f2-6826-4915-9ce2-97a82e3a50bf", 00:20:34.444 "is_configured": true, 00:20:34.444 "data_offset": 2048, 00:20:34.444 "data_size": 63488 00:20:34.444 }, 00:20:34.444 { 00:20:34.444 "name": "BaseBdev4", 00:20:34.444 "uuid": "903bb458-5d6b-4597-8159-8395f10151b3", 00:20:34.444 "is_configured": true, 00:20:34.444 "data_offset": 2048, 00:20:34.444 "data_size": 63488 00:20:34.444 } 00:20:34.444 ] 00:20:34.444 }' 00:20:34.444 00:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:34.444 00:15:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:35.012 00:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:35.012 00:15:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:35.012 00:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:35.012 00:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:35.012 00:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:35.012 00:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:35.012 00:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:35.012 00:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:35.012 [2024-07-16 00:15:21.955085] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:35.271 00:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:35.271 "name": "Existed_Raid", 00:20:35.271 "aliases": [ 00:20:35.271 "9f0af3e0-929c-4b35-bfb6-8492844a5ca6" 00:20:35.271 ], 00:20:35.271 "product_name": "Raid Volume", 00:20:35.271 "block_size": 512, 00:20:35.271 "num_blocks": 253952, 00:20:35.271 "uuid": "9f0af3e0-929c-4b35-bfb6-8492844a5ca6", 00:20:35.271 "assigned_rate_limits": { 00:20:35.271 "rw_ios_per_sec": 0, 00:20:35.271 "rw_mbytes_per_sec": 0, 00:20:35.271 "r_mbytes_per_sec": 0, 00:20:35.271 "w_mbytes_per_sec": 0 00:20:35.271 }, 00:20:35.271 "claimed": false, 00:20:35.271 "zoned": false, 00:20:35.271 "supported_io_types": { 00:20:35.271 "read": true, 00:20:35.271 "write": true, 00:20:35.271 "unmap": true, 00:20:35.271 "flush": true, 00:20:35.271 "reset": true, 00:20:35.271 "nvme_admin": false, 00:20:35.271 "nvme_io": false, 00:20:35.271 "nvme_io_md": false, 00:20:35.271 "write_zeroes": true, 00:20:35.271 "zcopy": false, 00:20:35.271 
"get_zone_info": false, 00:20:35.271 "zone_management": false, 00:20:35.271 "zone_append": false, 00:20:35.271 "compare": false, 00:20:35.271 "compare_and_write": false, 00:20:35.271 "abort": false, 00:20:35.271 "seek_hole": false, 00:20:35.271 "seek_data": false, 00:20:35.271 "copy": false, 00:20:35.271 "nvme_iov_md": false 00:20:35.271 }, 00:20:35.271 "memory_domains": [ 00:20:35.271 { 00:20:35.271 "dma_device_id": "system", 00:20:35.271 "dma_device_type": 1 00:20:35.271 }, 00:20:35.271 { 00:20:35.271 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:35.271 "dma_device_type": 2 00:20:35.271 }, 00:20:35.271 { 00:20:35.271 "dma_device_id": "system", 00:20:35.271 "dma_device_type": 1 00:20:35.271 }, 00:20:35.271 { 00:20:35.271 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:35.271 "dma_device_type": 2 00:20:35.271 }, 00:20:35.271 { 00:20:35.271 "dma_device_id": "system", 00:20:35.271 "dma_device_type": 1 00:20:35.271 }, 00:20:35.271 { 00:20:35.271 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:35.271 "dma_device_type": 2 00:20:35.271 }, 00:20:35.271 { 00:20:35.271 "dma_device_id": "system", 00:20:35.271 "dma_device_type": 1 00:20:35.271 }, 00:20:35.271 { 00:20:35.271 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:35.271 "dma_device_type": 2 00:20:35.271 } 00:20:35.271 ], 00:20:35.271 "driver_specific": { 00:20:35.271 "raid": { 00:20:35.271 "uuid": "9f0af3e0-929c-4b35-bfb6-8492844a5ca6", 00:20:35.271 "strip_size_kb": 64, 00:20:35.271 "state": "online", 00:20:35.271 "raid_level": "concat", 00:20:35.271 "superblock": true, 00:20:35.271 "num_base_bdevs": 4, 00:20:35.271 "num_base_bdevs_discovered": 4, 00:20:35.271 "num_base_bdevs_operational": 4, 00:20:35.271 "base_bdevs_list": [ 00:20:35.271 { 00:20:35.271 "name": "NewBaseBdev", 00:20:35.271 "uuid": "ac1a40b7-ab55-449e-a62b-647d4be51819", 00:20:35.271 "is_configured": true, 00:20:35.271 "data_offset": 2048, 00:20:35.271 "data_size": 63488 00:20:35.271 }, 00:20:35.271 { 00:20:35.271 "name": "BaseBdev2", 00:20:35.271 
"uuid": "14477e15-5480-40cd-aecf-1922edf9b77a", 00:20:35.271 "is_configured": true, 00:20:35.271 "data_offset": 2048, 00:20:35.271 "data_size": 63488 00:20:35.271 }, 00:20:35.271 { 00:20:35.271 "name": "BaseBdev3", 00:20:35.271 "uuid": "c5d113f2-6826-4915-9ce2-97a82e3a50bf", 00:20:35.271 "is_configured": true, 00:20:35.271 "data_offset": 2048, 00:20:35.271 "data_size": 63488 00:20:35.271 }, 00:20:35.271 { 00:20:35.271 "name": "BaseBdev4", 00:20:35.271 "uuid": "903bb458-5d6b-4597-8159-8395f10151b3", 00:20:35.271 "is_configured": true, 00:20:35.271 "data_offset": 2048, 00:20:35.271 "data_size": 63488 00:20:35.271 } 00:20:35.271 ] 00:20:35.271 } 00:20:35.271 } 00:20:35.271 }' 00:20:35.271 00:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:35.271 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:35.271 BaseBdev2 00:20:35.271 BaseBdev3 00:20:35.271 BaseBdev4' 00:20:35.271 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:35.271 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:35.271 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:35.529 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:35.529 "name": "NewBaseBdev", 00:20:35.529 "aliases": [ 00:20:35.529 "ac1a40b7-ab55-449e-a62b-647d4be51819" 00:20:35.529 ], 00:20:35.529 "product_name": "Malloc disk", 00:20:35.529 "block_size": 512, 00:20:35.529 "num_blocks": 65536, 00:20:35.529 "uuid": "ac1a40b7-ab55-449e-a62b-647d4be51819", 00:20:35.529 "assigned_rate_limits": { 00:20:35.529 "rw_ios_per_sec": 0, 00:20:35.529 "rw_mbytes_per_sec": 0, 
00:20:35.529 "r_mbytes_per_sec": 0, 00:20:35.529 "w_mbytes_per_sec": 0 00:20:35.529 }, 00:20:35.529 "claimed": true, 00:20:35.529 "claim_type": "exclusive_write", 00:20:35.529 "zoned": false, 00:20:35.529 "supported_io_types": { 00:20:35.529 "read": true, 00:20:35.529 "write": true, 00:20:35.529 "unmap": true, 00:20:35.529 "flush": true, 00:20:35.529 "reset": true, 00:20:35.529 "nvme_admin": false, 00:20:35.529 "nvme_io": false, 00:20:35.529 "nvme_io_md": false, 00:20:35.529 "write_zeroes": true, 00:20:35.529 "zcopy": true, 00:20:35.529 "get_zone_info": false, 00:20:35.529 "zone_management": false, 00:20:35.529 "zone_append": false, 00:20:35.529 "compare": false, 00:20:35.529 "compare_and_write": false, 00:20:35.529 "abort": true, 00:20:35.529 "seek_hole": false, 00:20:35.529 "seek_data": false, 00:20:35.529 "copy": true, 00:20:35.529 "nvme_iov_md": false 00:20:35.529 }, 00:20:35.529 "memory_domains": [ 00:20:35.529 { 00:20:35.529 "dma_device_id": "system", 00:20:35.529 "dma_device_type": 1 00:20:35.529 }, 00:20:35.529 { 00:20:35.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:35.529 "dma_device_type": 2 00:20:35.529 } 00:20:35.529 ], 00:20:35.529 "driver_specific": {} 00:20:35.529 }' 00:20:35.529 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:35.529 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:35.529 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:35.529 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:35.529 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:35.787 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:35.787 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:35.787 00:15:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:35.787 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:35.787 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:35.787 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:35.787 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:35.787 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:35.787 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:35.787 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:36.046 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:36.046 "name": "BaseBdev2", 00:20:36.046 "aliases": [ 00:20:36.046 "14477e15-5480-40cd-aecf-1922edf9b77a" 00:20:36.046 ], 00:20:36.046 "product_name": "Malloc disk", 00:20:36.046 "block_size": 512, 00:20:36.046 "num_blocks": 65536, 00:20:36.047 "uuid": "14477e15-5480-40cd-aecf-1922edf9b77a", 00:20:36.047 "assigned_rate_limits": { 00:20:36.047 "rw_ios_per_sec": 0, 00:20:36.047 "rw_mbytes_per_sec": 0, 00:20:36.047 "r_mbytes_per_sec": 0, 00:20:36.047 "w_mbytes_per_sec": 0 00:20:36.047 }, 00:20:36.047 "claimed": true, 00:20:36.047 "claim_type": "exclusive_write", 00:20:36.047 "zoned": false, 00:20:36.047 "supported_io_types": { 00:20:36.047 "read": true, 00:20:36.047 "write": true, 00:20:36.047 "unmap": true, 00:20:36.047 "flush": true, 00:20:36.047 "reset": true, 00:20:36.047 "nvme_admin": false, 00:20:36.047 "nvme_io": false, 00:20:36.047 "nvme_io_md": false, 00:20:36.047 "write_zeroes": true, 00:20:36.047 "zcopy": true, 00:20:36.047 
"get_zone_info": false, 00:20:36.047 "zone_management": false, 00:20:36.047 "zone_append": false, 00:20:36.047 "compare": false, 00:20:36.047 "compare_and_write": false, 00:20:36.047 "abort": true, 00:20:36.047 "seek_hole": false, 00:20:36.047 "seek_data": false, 00:20:36.047 "copy": true, 00:20:36.047 "nvme_iov_md": false 00:20:36.047 }, 00:20:36.047 "memory_domains": [ 00:20:36.047 { 00:20:36.047 "dma_device_id": "system", 00:20:36.047 "dma_device_type": 1 00:20:36.047 }, 00:20:36.047 { 00:20:36.047 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:36.047 "dma_device_type": 2 00:20:36.047 } 00:20:36.047 ], 00:20:36.047 "driver_specific": {} 00:20:36.047 }' 00:20:36.047 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:36.047 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:36.047 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:36.047 00:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:36.306 00:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:36.306 00:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:36.306 00:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:36.306 00:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:36.306 00:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:36.306 00:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:36.306 00:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:36.565 00:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:36.565 00:15:23 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:36.565 00:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:36.565 00:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:37.132 00:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:37.132 "name": "BaseBdev3", 00:20:37.132 "aliases": [ 00:20:37.132 "c5d113f2-6826-4915-9ce2-97a82e3a50bf" 00:20:37.132 ], 00:20:37.132 "product_name": "Malloc disk", 00:20:37.132 "block_size": 512, 00:20:37.132 "num_blocks": 65536, 00:20:37.132 "uuid": "c5d113f2-6826-4915-9ce2-97a82e3a50bf", 00:20:37.132 "assigned_rate_limits": { 00:20:37.132 "rw_ios_per_sec": 0, 00:20:37.132 "rw_mbytes_per_sec": 0, 00:20:37.132 "r_mbytes_per_sec": 0, 00:20:37.132 "w_mbytes_per_sec": 0 00:20:37.132 }, 00:20:37.132 "claimed": true, 00:20:37.132 "claim_type": "exclusive_write", 00:20:37.132 "zoned": false, 00:20:37.132 "supported_io_types": { 00:20:37.132 "read": true, 00:20:37.132 "write": true, 00:20:37.132 "unmap": true, 00:20:37.132 "flush": true, 00:20:37.132 "reset": true, 00:20:37.132 "nvme_admin": false, 00:20:37.132 "nvme_io": false, 00:20:37.132 "nvme_io_md": false, 00:20:37.132 "write_zeroes": true, 00:20:37.132 "zcopy": true, 00:20:37.132 "get_zone_info": false, 00:20:37.132 "zone_management": false, 00:20:37.132 "zone_append": false, 00:20:37.132 "compare": false, 00:20:37.132 "compare_and_write": false, 00:20:37.132 "abort": true, 00:20:37.132 "seek_hole": false, 00:20:37.132 "seek_data": false, 00:20:37.132 "copy": true, 00:20:37.132 "nvme_iov_md": false 00:20:37.132 }, 00:20:37.132 "memory_domains": [ 00:20:37.132 { 00:20:37.132 "dma_device_id": "system", 00:20:37.132 "dma_device_type": 1 00:20:37.132 }, 00:20:37.132 { 00:20:37.132 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:37.132 
"dma_device_type": 2 00:20:37.132 } 00:20:37.132 ], 00:20:37.132 "driver_specific": {} 00:20:37.133 }' 00:20:37.133 00:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:37.133 00:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:37.133 00:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:37.133 00:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:37.133 00:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:37.391 00:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:37.391 00:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:37.391 00:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:37.391 00:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:37.391 00:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:37.391 00:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:37.650 00:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:37.650 00:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:37.650 00:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:37.650 00:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:38.216 00:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:38.216 "name": "BaseBdev4", 00:20:38.216 "aliases": [ 00:20:38.216 
"903bb458-5d6b-4597-8159-8395f10151b3" 00:20:38.216 ], 00:20:38.216 "product_name": "Malloc disk", 00:20:38.216 "block_size": 512, 00:20:38.216 "num_blocks": 65536, 00:20:38.216 "uuid": "903bb458-5d6b-4597-8159-8395f10151b3", 00:20:38.216 "assigned_rate_limits": { 00:20:38.216 "rw_ios_per_sec": 0, 00:20:38.216 "rw_mbytes_per_sec": 0, 00:20:38.216 "r_mbytes_per_sec": 0, 00:20:38.216 "w_mbytes_per_sec": 0 00:20:38.216 }, 00:20:38.216 "claimed": true, 00:20:38.216 "claim_type": "exclusive_write", 00:20:38.216 "zoned": false, 00:20:38.216 "supported_io_types": { 00:20:38.216 "read": true, 00:20:38.216 "write": true, 00:20:38.216 "unmap": true, 00:20:38.216 "flush": true, 00:20:38.216 "reset": true, 00:20:38.216 "nvme_admin": false, 00:20:38.216 "nvme_io": false, 00:20:38.216 "nvme_io_md": false, 00:20:38.216 "write_zeroes": true, 00:20:38.216 "zcopy": true, 00:20:38.216 "get_zone_info": false, 00:20:38.216 "zone_management": false, 00:20:38.216 "zone_append": false, 00:20:38.216 "compare": false, 00:20:38.216 "compare_and_write": false, 00:20:38.216 "abort": true, 00:20:38.216 "seek_hole": false, 00:20:38.216 "seek_data": false, 00:20:38.216 "copy": true, 00:20:38.216 "nvme_iov_md": false 00:20:38.216 }, 00:20:38.216 "memory_domains": [ 00:20:38.216 { 00:20:38.216 "dma_device_id": "system", 00:20:38.216 "dma_device_type": 1 00:20:38.216 }, 00:20:38.216 { 00:20:38.216 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:38.216 "dma_device_type": 2 00:20:38.216 } 00:20:38.216 ], 00:20:38.216 "driver_specific": {} 00:20:38.216 }' 00:20:38.216 00:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:38.216 00:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:38.216 00:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:38.216 00:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:38.216 00:15:25 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:38.216 00:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:38.216 00:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:38.216 00:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:38.474 00:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:38.474 00:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:38.474 00:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:38.474 00:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:38.474 00:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:38.733 [2024-07-16 00:15:25.560342] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:38.733 [2024-07-16 00:15:25.560367] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:38.733 [2024-07-16 00:15:25.560414] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:38.733 [2024-07-16 00:15:25.560473] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:38.733 [2024-07-16 00:15:25.560485] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13cb850 name Existed_Raid, state offline 00:20:38.733 00:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 3570975 00:20:38.733 00:15:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3570975 ']' 00:20:38.733 00:15:25 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@952 -- # kill -0 3570975 00:20:38.733 00:15:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:20:38.733 00:15:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:38.733 00:15:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3570975 00:20:38.733 00:15:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:38.733 00:15:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:38.733 00:15:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3570975' 00:20:38.733 killing process with pid 3570975 00:20:38.733 00:15:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 3570975 00:20:38.733 [2024-07-16 00:15:25.648457] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:38.733 00:15:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 3570975 00:20:38.991 [2024-07-16 00:15:25.690983] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:38.991 00:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:20:38.991 00:20:38.991 real 0m36.637s 00:20:38.991 user 1m7.415s 00:20:38.991 sys 0m6.362s 00:20:38.991 00:15:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:38.991 00:15:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:38.991 ************************************ 00:20:38.991 END TEST raid_state_function_test_sb 00:20:38.991 ************************************ 00:20:39.249 00:15:25 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:39.249 00:15:25 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test 
raid_superblock_test raid_superblock_test concat 4 00:20:39.249 00:15:25 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:20:39.250 00:15:25 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:39.250 00:15:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:39.250 ************************************ 00:20:39.250 START TEST raid_superblock_test 00:20:39.250 ************************************ 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=3576865 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 3576865 /var/tmp/spdk-raid.sock 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 3576865 ']' 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:39.250 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:39.250 00:15:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:39.250 [2024-07-16 00:15:26.069183] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:20:39.250 [2024-07-16 00:15:26.069254] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3576865 ] 00:20:39.508 [2024-07-16 00:15:26.202281] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:39.508 [2024-07-16 00:15:26.305453] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:39.508 [2024-07-16 00:15:26.372111] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:39.508 [2024-07-16 00:15:26.372151] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:40.075 00:15:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:40.075 00:15:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:20:40.075 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:20:40.075 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:40.075 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:20:40.075 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:20:40.076 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:20:40.076 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:40.076 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:40.076 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:40.076 00:15:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:20:40.335 malloc1 00:20:40.335 00:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:40.594 [2024-07-16 00:15:27.406511] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:40.594 [2024-07-16 00:15:27.406561] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:40.594 [2024-07-16 00:15:27.406581] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2609570 00:20:40.594 [2024-07-16 00:15:27.406594] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:40.594 [2024-07-16 00:15:27.408149] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:40.594 [2024-07-16 00:15:27.408193] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:40.594 pt1 00:20:40.594 00:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:40.594 00:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:40.594 00:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:20:40.594 00:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:20:40.594 00:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:20:40.594 00:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:40.594 00:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:40.594 00:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:40.594 00:15:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:20:40.853 malloc2 00:20:40.853 00:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:41.112 [2024-07-16 00:15:27.908664] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:41.112 [2024-07-16 00:15:27.908707] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:41.112 [2024-07-16 00:15:27.908725] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x260a970 00:20:41.112 [2024-07-16 00:15:27.908737] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:41.112 [2024-07-16 00:15:27.910219] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:41.112 [2024-07-16 00:15:27.910248] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:41.112 pt2 00:20:41.112 00:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:41.112 00:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:41.112 00:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:20:41.112 00:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:20:41.112 00:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:20:41.112 00:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:41.112 00:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:41.112 00:15:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:41.112 00:15:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:20:41.371 malloc3 00:20:41.371 00:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:41.630 [2024-07-16 00:15:28.410514] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:41.630 [2024-07-16 00:15:28.410559] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:41.630 [2024-07-16 00:15:28.410577] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27a1340 00:20:41.630 [2024-07-16 00:15:28.410590] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:41.630 [2024-07-16 00:15:28.412028] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:41.630 [2024-07-16 00:15:28.412055] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:41.630 pt3 00:20:41.630 00:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:41.630 00:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:41.630 00:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:20:41.630 00:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:20:41.630 00:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:20:41.630 00:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:41.630 
00:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:41.630 00:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:41.630 00:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:20:41.889 malloc4 00:20:41.889 00:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:42.149 [2024-07-16 00:15:28.896453] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:42.149 [2024-07-16 00:15:28.896501] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:42.149 [2024-07-16 00:15:28.896520] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27a3c60 00:20:42.149 [2024-07-16 00:15:28.896537] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:42.149 [2024-07-16 00:15:28.897936] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:42.149 [2024-07-16 00:15:28.897970] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:42.149 pt4 00:20:42.149 00:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:42.149 00:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:42.149 00:15:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:20:42.149 [2024-07-16 00:15:29.088998] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:20:42.149 [2024-07-16 00:15:29.090196] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:42.149 [2024-07-16 00:15:29.090251] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:42.149 [2024-07-16 00:15:29.090295] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:42.149 [2024-07-16 00:15:29.090458] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2601530 00:20:42.149 [2024-07-16 00:15:29.090469] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:42.149 [2024-07-16 00:15:29.090658] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25ff770 00:20:42.149 [2024-07-16 00:15:29.090797] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2601530 00:20:42.149 [2024-07-16 00:15:29.090807] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2601530 00:20:42.149 [2024-07-16 00:15:29.090897] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:42.408 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:42.409 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:42.409 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:42.409 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:42.409 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:42.409 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:42.409 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:42.409 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:20:42.409 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:42.409 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:42.409 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:42.409 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.670 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:42.670 "name": "raid_bdev1", 00:20:42.670 "uuid": "381d4c37-0188-4d75-a10a-d6fa52444426", 00:20:42.670 "strip_size_kb": 64, 00:20:42.670 "state": "online", 00:20:42.670 "raid_level": "concat", 00:20:42.670 "superblock": true, 00:20:42.670 "num_base_bdevs": 4, 00:20:42.670 "num_base_bdevs_discovered": 4, 00:20:42.670 "num_base_bdevs_operational": 4, 00:20:42.670 "base_bdevs_list": [ 00:20:42.670 { 00:20:42.670 "name": "pt1", 00:20:42.670 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:42.670 "is_configured": true, 00:20:42.670 "data_offset": 2048, 00:20:42.670 "data_size": 63488 00:20:42.670 }, 00:20:42.670 { 00:20:42.670 "name": "pt2", 00:20:42.670 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:42.670 "is_configured": true, 00:20:42.670 "data_offset": 2048, 00:20:42.670 "data_size": 63488 00:20:42.670 }, 00:20:42.670 { 00:20:42.670 "name": "pt3", 00:20:42.670 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:42.670 "is_configured": true, 00:20:42.670 "data_offset": 2048, 00:20:42.670 "data_size": 63488 00:20:42.670 }, 00:20:42.670 { 00:20:42.670 "name": "pt4", 00:20:42.670 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:42.670 "is_configured": true, 00:20:42.670 "data_offset": 2048, 00:20:42.670 "data_size": 63488 00:20:42.670 } 00:20:42.670 ] 00:20:42.670 }' 00:20:42.670 00:15:29 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:42.670 00:15:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:43.304 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:20:43.304 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:43.304 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:43.304 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:43.304 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:43.304 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:43.304 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:43.304 00:15:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:43.304 [2024-07-16 00:15:30.204245] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:43.304 00:15:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:43.304 "name": "raid_bdev1", 00:20:43.304 "aliases": [ 00:20:43.304 "381d4c37-0188-4d75-a10a-d6fa52444426" 00:20:43.304 ], 00:20:43.304 "product_name": "Raid Volume", 00:20:43.304 "block_size": 512, 00:20:43.304 "num_blocks": 253952, 00:20:43.304 "uuid": "381d4c37-0188-4d75-a10a-d6fa52444426", 00:20:43.304 "assigned_rate_limits": { 00:20:43.304 "rw_ios_per_sec": 0, 00:20:43.304 "rw_mbytes_per_sec": 0, 00:20:43.304 "r_mbytes_per_sec": 0, 00:20:43.304 "w_mbytes_per_sec": 0 00:20:43.304 }, 00:20:43.304 "claimed": false, 00:20:43.304 "zoned": false, 00:20:43.304 "supported_io_types": { 00:20:43.304 "read": true, 00:20:43.304 "write": true, 00:20:43.304 
"unmap": true, 00:20:43.304 "flush": true, 00:20:43.304 "reset": true, 00:20:43.304 "nvme_admin": false, 00:20:43.304 "nvme_io": false, 00:20:43.304 "nvme_io_md": false, 00:20:43.304 "write_zeroes": true, 00:20:43.304 "zcopy": false, 00:20:43.304 "get_zone_info": false, 00:20:43.304 "zone_management": false, 00:20:43.304 "zone_append": false, 00:20:43.304 "compare": false, 00:20:43.304 "compare_and_write": false, 00:20:43.304 "abort": false, 00:20:43.304 "seek_hole": false, 00:20:43.304 "seek_data": false, 00:20:43.304 "copy": false, 00:20:43.304 "nvme_iov_md": false 00:20:43.304 }, 00:20:43.304 "memory_domains": [ 00:20:43.304 { 00:20:43.304 "dma_device_id": "system", 00:20:43.304 "dma_device_type": 1 00:20:43.304 }, 00:20:43.304 { 00:20:43.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:43.304 "dma_device_type": 2 00:20:43.304 }, 00:20:43.304 { 00:20:43.304 "dma_device_id": "system", 00:20:43.304 "dma_device_type": 1 00:20:43.304 }, 00:20:43.304 { 00:20:43.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:43.304 "dma_device_type": 2 00:20:43.304 }, 00:20:43.304 { 00:20:43.304 "dma_device_id": "system", 00:20:43.304 "dma_device_type": 1 00:20:43.304 }, 00:20:43.304 { 00:20:43.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:43.304 "dma_device_type": 2 00:20:43.304 }, 00:20:43.304 { 00:20:43.304 "dma_device_id": "system", 00:20:43.304 "dma_device_type": 1 00:20:43.304 }, 00:20:43.304 { 00:20:43.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:43.304 "dma_device_type": 2 00:20:43.304 } 00:20:43.304 ], 00:20:43.304 "driver_specific": { 00:20:43.304 "raid": { 00:20:43.304 "uuid": "381d4c37-0188-4d75-a10a-d6fa52444426", 00:20:43.304 "strip_size_kb": 64, 00:20:43.304 "state": "online", 00:20:43.304 "raid_level": "concat", 00:20:43.304 "superblock": true, 00:20:43.304 "num_base_bdevs": 4, 00:20:43.304 "num_base_bdevs_discovered": 4, 00:20:43.304 "num_base_bdevs_operational": 4, 00:20:43.304 "base_bdevs_list": [ 00:20:43.304 { 00:20:43.304 "name": "pt1", 
00:20:43.304 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:43.304 "is_configured": true, 00:20:43.304 "data_offset": 2048, 00:20:43.304 "data_size": 63488 00:20:43.305 }, 00:20:43.305 { 00:20:43.305 "name": "pt2", 00:20:43.305 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:43.305 "is_configured": true, 00:20:43.305 "data_offset": 2048, 00:20:43.305 "data_size": 63488 00:20:43.305 }, 00:20:43.305 { 00:20:43.305 "name": "pt3", 00:20:43.305 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:43.305 "is_configured": true, 00:20:43.305 "data_offset": 2048, 00:20:43.305 "data_size": 63488 00:20:43.305 }, 00:20:43.305 { 00:20:43.305 "name": "pt4", 00:20:43.305 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:43.305 "is_configured": true, 00:20:43.305 "data_offset": 2048, 00:20:43.305 "data_size": 63488 00:20:43.305 } 00:20:43.305 ] 00:20:43.305 } 00:20:43.305 } 00:20:43.305 }' 00:20:43.305 00:15:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:43.563 00:15:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:43.563 pt2 00:20:43.563 pt3 00:20:43.563 pt4' 00:20:43.563 00:15:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:43.563 00:15:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:43.563 00:15:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:43.822 00:15:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:43.822 "name": "pt1", 00:20:43.822 "aliases": [ 00:20:43.822 "00000000-0000-0000-0000-000000000001" 00:20:43.822 ], 00:20:43.822 "product_name": "passthru", 00:20:43.822 "block_size": 512, 00:20:43.822 "num_blocks": 65536, 00:20:43.822 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:20:43.822 "assigned_rate_limits": { 00:20:43.822 "rw_ios_per_sec": 0, 00:20:43.823 "rw_mbytes_per_sec": 0, 00:20:43.823 "r_mbytes_per_sec": 0, 00:20:43.823 "w_mbytes_per_sec": 0 00:20:43.823 }, 00:20:43.823 "claimed": true, 00:20:43.823 "claim_type": "exclusive_write", 00:20:43.823 "zoned": false, 00:20:43.823 "supported_io_types": { 00:20:43.823 "read": true, 00:20:43.823 "write": true, 00:20:43.823 "unmap": true, 00:20:43.823 "flush": true, 00:20:43.823 "reset": true, 00:20:43.823 "nvme_admin": false, 00:20:43.823 "nvme_io": false, 00:20:43.823 "nvme_io_md": false, 00:20:43.823 "write_zeroes": true, 00:20:43.823 "zcopy": true, 00:20:43.823 "get_zone_info": false, 00:20:43.823 "zone_management": false, 00:20:43.823 "zone_append": false, 00:20:43.823 "compare": false, 00:20:43.823 "compare_and_write": false, 00:20:43.823 "abort": true, 00:20:43.823 "seek_hole": false, 00:20:43.823 "seek_data": false, 00:20:43.823 "copy": true, 00:20:43.823 "nvme_iov_md": false 00:20:43.823 }, 00:20:43.823 "memory_domains": [ 00:20:43.823 { 00:20:43.823 "dma_device_id": "system", 00:20:43.823 "dma_device_type": 1 00:20:43.823 }, 00:20:43.823 { 00:20:43.823 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:43.823 "dma_device_type": 2 00:20:43.823 } 00:20:43.823 ], 00:20:43.823 "driver_specific": { 00:20:43.823 "passthru": { 00:20:43.823 "name": "pt1", 00:20:43.823 "base_bdev_name": "malloc1" 00:20:43.823 } 00:20:43.823 } 00:20:43.823 }' 00:20:43.823 00:15:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:43.823 00:15:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:43.823 00:15:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:43.823 00:15:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:43.823 00:15:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:44.082 00:15:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:44.082 00:15:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:44.082 00:15:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:44.082 00:15:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:44.082 00:15:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:44.082 00:15:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:44.341 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:44.341 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:44.341 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:44.341 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:44.910 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:44.910 "name": "pt2", 00:20:44.910 "aliases": [ 00:20:44.910 "00000000-0000-0000-0000-000000000002" 00:20:44.910 ], 00:20:44.910 "product_name": "passthru", 00:20:44.910 "block_size": 512, 00:20:44.910 "num_blocks": 65536, 00:20:44.910 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:44.910 "assigned_rate_limits": { 00:20:44.910 "rw_ios_per_sec": 0, 00:20:44.910 "rw_mbytes_per_sec": 0, 00:20:44.910 "r_mbytes_per_sec": 0, 00:20:44.910 "w_mbytes_per_sec": 0 00:20:44.910 }, 00:20:44.910 "claimed": true, 00:20:44.910 "claim_type": "exclusive_write", 00:20:44.910 "zoned": false, 00:20:44.910 "supported_io_types": { 00:20:44.910 "read": true, 00:20:44.910 "write": true, 00:20:44.910 "unmap": true, 00:20:44.910 "flush": true, 00:20:44.910 "reset": true, 00:20:44.910 "nvme_admin": false, 00:20:44.910 
"nvme_io": false, 00:20:44.910 "nvme_io_md": false, 00:20:44.910 "write_zeroes": true, 00:20:44.910 "zcopy": true, 00:20:44.910 "get_zone_info": false, 00:20:44.910 "zone_management": false, 00:20:44.910 "zone_append": false, 00:20:44.910 "compare": false, 00:20:44.910 "compare_and_write": false, 00:20:44.910 "abort": true, 00:20:44.910 "seek_hole": false, 00:20:44.910 "seek_data": false, 00:20:44.910 "copy": true, 00:20:44.910 "nvme_iov_md": false 00:20:44.910 }, 00:20:44.910 "memory_domains": [ 00:20:44.910 { 00:20:44.910 "dma_device_id": "system", 00:20:44.910 "dma_device_type": 1 00:20:44.910 }, 00:20:44.910 { 00:20:44.910 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:44.910 "dma_device_type": 2 00:20:44.910 } 00:20:44.910 ], 00:20:44.910 "driver_specific": { 00:20:44.910 "passthru": { 00:20:44.910 "name": "pt2", 00:20:44.910 "base_bdev_name": "malloc2" 00:20:44.910 } 00:20:44.910 } 00:20:44.910 }' 00:20:44.910 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:44.910 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:44.910 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:44.910 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:44.910 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:44.910 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:44.910 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:44.910 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:45.170 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:45.170 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:45.170 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:20:45.170 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:45.170 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:45.170 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:45.170 00:15:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:45.430 00:15:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:45.430 "name": "pt3", 00:20:45.430 "aliases": [ 00:20:45.430 "00000000-0000-0000-0000-000000000003" 00:20:45.430 ], 00:20:45.430 "product_name": "passthru", 00:20:45.430 "block_size": 512, 00:20:45.430 "num_blocks": 65536, 00:20:45.430 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:45.430 "assigned_rate_limits": { 00:20:45.430 "rw_ios_per_sec": 0, 00:20:45.430 "rw_mbytes_per_sec": 0, 00:20:45.430 "r_mbytes_per_sec": 0, 00:20:45.430 "w_mbytes_per_sec": 0 00:20:45.430 }, 00:20:45.430 "claimed": true, 00:20:45.430 "claim_type": "exclusive_write", 00:20:45.430 "zoned": false, 00:20:45.430 "supported_io_types": { 00:20:45.430 "read": true, 00:20:45.430 "write": true, 00:20:45.430 "unmap": true, 00:20:45.430 "flush": true, 00:20:45.430 "reset": true, 00:20:45.430 "nvme_admin": false, 00:20:45.430 "nvme_io": false, 00:20:45.430 "nvme_io_md": false, 00:20:45.430 "write_zeroes": true, 00:20:45.430 "zcopy": true, 00:20:45.430 "get_zone_info": false, 00:20:45.430 "zone_management": false, 00:20:45.430 "zone_append": false, 00:20:45.430 "compare": false, 00:20:45.430 "compare_and_write": false, 00:20:45.430 "abort": true, 00:20:45.430 "seek_hole": false, 00:20:45.430 "seek_data": false, 00:20:45.430 "copy": true, 00:20:45.430 "nvme_iov_md": false 00:20:45.430 }, 00:20:45.430 "memory_domains": [ 00:20:45.430 { 00:20:45.430 "dma_device_id": "system", 00:20:45.430 
"dma_device_type": 1 00:20:45.430 }, 00:20:45.430 { 00:20:45.430 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:45.430 "dma_device_type": 2 00:20:45.430 } 00:20:45.430 ], 00:20:45.430 "driver_specific": { 00:20:45.430 "passthru": { 00:20:45.430 "name": "pt3", 00:20:45.430 "base_bdev_name": "malloc3" 00:20:45.430 } 00:20:45.430 } 00:20:45.430 }' 00:20:45.430 00:15:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:45.430 00:15:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:45.430 00:15:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:45.430 00:15:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:45.689 00:15:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:45.689 00:15:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:45.689 00:15:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:45.689 00:15:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:45.689 00:15:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:45.689 00:15:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:45.689 00:15:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:45.949 00:15:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:45.949 00:15:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:45.949 00:15:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:45.949 00:15:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:45.949 00:15:32 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:45.949 "name": "pt4", 00:20:45.949 "aliases": [ 00:20:45.949 "00000000-0000-0000-0000-000000000004" 00:20:45.949 ], 00:20:45.949 "product_name": "passthru", 00:20:45.949 "block_size": 512, 00:20:45.949 "num_blocks": 65536, 00:20:45.949 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:45.949 "assigned_rate_limits": { 00:20:45.949 "rw_ios_per_sec": 0, 00:20:45.949 "rw_mbytes_per_sec": 0, 00:20:45.949 "r_mbytes_per_sec": 0, 00:20:45.949 "w_mbytes_per_sec": 0 00:20:45.949 }, 00:20:45.949 "claimed": true, 00:20:45.949 "claim_type": "exclusive_write", 00:20:45.949 "zoned": false, 00:20:45.949 "supported_io_types": { 00:20:45.949 "read": true, 00:20:45.949 "write": true, 00:20:45.949 "unmap": true, 00:20:45.949 "flush": true, 00:20:45.949 "reset": true, 00:20:45.949 "nvme_admin": false, 00:20:45.949 "nvme_io": false, 00:20:45.949 "nvme_io_md": false, 00:20:45.949 "write_zeroes": true, 00:20:45.949 "zcopy": true, 00:20:45.949 "get_zone_info": false, 00:20:45.949 "zone_management": false, 00:20:45.949 "zone_append": false, 00:20:45.949 "compare": false, 00:20:45.949 "compare_and_write": false, 00:20:45.949 "abort": true, 00:20:45.949 "seek_hole": false, 00:20:45.949 "seek_data": false, 00:20:45.949 "copy": true, 00:20:45.949 "nvme_iov_md": false 00:20:45.949 }, 00:20:45.949 "memory_domains": [ 00:20:45.949 { 00:20:45.949 "dma_device_id": "system", 00:20:45.949 "dma_device_type": 1 00:20:45.949 }, 00:20:45.949 { 00:20:45.949 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:45.949 "dma_device_type": 2 00:20:45.949 } 00:20:45.949 ], 00:20:45.949 "driver_specific": { 00:20:45.949 "passthru": { 00:20:45.949 "name": "pt4", 00:20:45.949 "base_bdev_name": "malloc4" 00:20:45.949 } 00:20:45.949 } 00:20:45.949 }' 00:20:45.949 00:15:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:46.209 00:15:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:46.209 00:15:32 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:46.209 00:15:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:46.209 00:15:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:46.209 00:15:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:46.209 00:15:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:46.209 00:15:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:46.470 00:15:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:46.470 00:15:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:46.470 00:15:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:46.470 00:15:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:46.470 00:15:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:46.470 00:15:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:20:46.470 [2024-07-16 00:15:33.416761] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:46.733 00:15:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=381d4c37-0188-4d75-a10a-d6fa52444426 00:20:46.733 00:15:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 381d4c37-0188-4d75-a10a-d6fa52444426 ']' 00:20:46.733 00:15:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:46.733 [2024-07-16 00:15:33.669107] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:46.733 
[2024-07-16 00:15:33.669137] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:46.733 [2024-07-16 00:15:33.669188] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:46.733 [2024-07-16 00:15:33.669250] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:46.733 [2024-07-16 00:15:33.669267] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2601530 name raid_bdev1, state offline 00:20:46.991 00:15:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.991 00:15:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:20:47.250 00:15:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:20:47.250 00:15:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:20:47.250 00:15:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:47.250 00:15:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:20:47.509 00:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:47.509 00:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:47.768 00:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:47.768 00:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:20:47.768 00:15:34 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:47.768 00:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:20:48.027 00:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:20:48.027 00:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:20:48.286 00:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:20:48.286 00:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:48.286 00:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:20:48.286 00:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:48.286 00:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:48.286 00:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:48.286 00:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:48.286 00:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:48.286 00:15:35 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:48.286 00:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:48.286 00:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:48.286 00:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:48.286 00:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:48.544 [2024-07-16 00:15:35.425697] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:20:48.544 [2024-07-16 00:15:35.427106] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:20:48.544 [2024-07-16 00:15:35.427151] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:20:48.544 [2024-07-16 00:15:35.427185] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:20:48.544 [2024-07-16 00:15:35.427233] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:20:48.544 [2024-07-16 00:15:35.427273] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:20:48.544 [2024-07-16 00:15:35.427296] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:20:48.544 [2024-07-16 00:15:35.427318] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:20:48.544 
[2024-07-16 00:15:35.427335] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:48.544 [2024-07-16 00:15:35.427346] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27acff0 name raid_bdev1, state configuring 00:20:48.544 request: 00:20:48.544 { 00:20:48.544 "name": "raid_bdev1", 00:20:48.544 "raid_level": "concat", 00:20:48.544 "base_bdevs": [ 00:20:48.544 "malloc1", 00:20:48.544 "malloc2", 00:20:48.544 "malloc3", 00:20:48.544 "malloc4" 00:20:48.544 ], 00:20:48.544 "strip_size_kb": 64, 00:20:48.544 "superblock": false, 00:20:48.544 "method": "bdev_raid_create", 00:20:48.544 "req_id": 1 00:20:48.544 } 00:20:48.544 Got JSON-RPC error response 00:20:48.544 response: 00:20:48.544 { 00:20:48.544 "code": -17, 00:20:48.544 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:20:48.544 } 00:20:48.544 00:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:20:48.544 00:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:20:48.544 00:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:20:48.544 00:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:20:48.544 00:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.544 00:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:20:48.802 00:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:20:48.802 00:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:20:48.802 00:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:20:49.368 [2024-07-16 00:15:36.179610] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:49.368 [2024-07-16 00:15:36.179661] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:49.368 [2024-07-16 00:15:36.179682] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26097a0 00:20:49.369 [2024-07-16 00:15:36.179695] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:49.369 [2024-07-16 00:15:36.181323] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:49.369 [2024-07-16 00:15:36.181353] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:49.369 [2024-07-16 00:15:36.181424] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:20:49.369 [2024-07-16 00:15:36.181452] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:49.369 pt1 00:20:49.369 00:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:20:49.369 00:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:49.369 00:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:49.369 00:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:49.369 00:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:49.369 00:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:49.369 00:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:49.369 00:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:49.369 00:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:20:49.369 00:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:49.369 00:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.369 00:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:49.627 00:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:49.627 "name": "raid_bdev1", 00:20:49.627 "uuid": "381d4c37-0188-4d75-a10a-d6fa52444426", 00:20:49.627 "strip_size_kb": 64, 00:20:49.627 "state": "configuring", 00:20:49.627 "raid_level": "concat", 00:20:49.627 "superblock": true, 00:20:49.627 "num_base_bdevs": 4, 00:20:49.627 "num_base_bdevs_discovered": 1, 00:20:49.627 "num_base_bdevs_operational": 4, 00:20:49.627 "base_bdevs_list": [ 00:20:49.627 { 00:20:49.627 "name": "pt1", 00:20:49.627 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:49.627 "is_configured": true, 00:20:49.627 "data_offset": 2048, 00:20:49.627 "data_size": 63488 00:20:49.627 }, 00:20:49.627 { 00:20:49.627 "name": null, 00:20:49.627 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:49.627 "is_configured": false, 00:20:49.627 "data_offset": 2048, 00:20:49.627 "data_size": 63488 00:20:49.627 }, 00:20:49.627 { 00:20:49.627 "name": null, 00:20:49.627 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:49.627 "is_configured": false, 00:20:49.627 "data_offset": 2048, 00:20:49.627 "data_size": 63488 00:20:49.627 }, 00:20:49.627 { 00:20:49.627 "name": null, 00:20:49.627 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:49.627 "is_configured": false, 00:20:49.627 "data_offset": 2048, 00:20:49.627 "data_size": 63488 00:20:49.627 } 00:20:49.627 ] 00:20:49.627 }' 00:20:49.627 00:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:49.627 00:15:36 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:50.193 00:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:20:50.193 00:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:50.451 [2024-07-16 00:15:37.282565] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:50.451 [2024-07-16 00:15:37.282614] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:50.451 [2024-07-16 00:15:37.282632] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2600ea0 00:20:50.451 [2024-07-16 00:15:37.282646] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:50.451 [2024-07-16 00:15:37.282999] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:50.451 [2024-07-16 00:15:37.283017] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:50.451 [2024-07-16 00:15:37.283078] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:50.451 [2024-07-16 00:15:37.283096] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:50.451 pt2 00:20:50.451 00:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:50.710 [2024-07-16 00:15:37.535248] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:20:50.710 00:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:20:50.710 00:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:50.710 00:15:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:50.710 00:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:50.710 00:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:50.710 00:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:50.710 00:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:50.710 00:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:50.710 00:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:50.710 00:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:50.710 00:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.710 00:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:50.968 00:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:50.968 "name": "raid_bdev1", 00:20:50.968 "uuid": "381d4c37-0188-4d75-a10a-d6fa52444426", 00:20:50.968 "strip_size_kb": 64, 00:20:50.968 "state": "configuring", 00:20:50.968 "raid_level": "concat", 00:20:50.968 "superblock": true, 00:20:50.968 "num_base_bdevs": 4, 00:20:50.968 "num_base_bdevs_discovered": 1, 00:20:50.968 "num_base_bdevs_operational": 4, 00:20:50.968 "base_bdevs_list": [ 00:20:50.968 { 00:20:50.968 "name": "pt1", 00:20:50.968 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:50.968 "is_configured": true, 00:20:50.968 "data_offset": 2048, 00:20:50.968 "data_size": 63488 00:20:50.968 }, 00:20:50.968 { 00:20:50.968 "name": null, 00:20:50.968 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:50.968 
"is_configured": false, 00:20:50.968 "data_offset": 2048, 00:20:50.968 "data_size": 63488 00:20:50.968 }, 00:20:50.968 { 00:20:50.968 "name": null, 00:20:50.968 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:50.968 "is_configured": false, 00:20:50.968 "data_offset": 2048, 00:20:50.968 "data_size": 63488 00:20:50.968 }, 00:20:50.968 { 00:20:50.968 "name": null, 00:20:50.968 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:50.968 "is_configured": false, 00:20:50.968 "data_offset": 2048, 00:20:50.968 "data_size": 63488 00:20:50.968 } 00:20:50.968 ] 00:20:50.968 }' 00:20:50.968 00:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:50.968 00:15:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:51.534 00:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:20:51.534 00:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:51.534 00:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:51.793 [2024-07-16 00:15:38.618125] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:51.793 [2024-07-16 00:15:38.618172] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:51.793 [2024-07-16 00:15:38.618191] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25ffec0 00:20:51.793 [2024-07-16 00:15:38.618203] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:51.793 [2024-07-16 00:15:38.618544] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:51.793 [2024-07-16 00:15:38.618561] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:51.793 [2024-07-16 00:15:38.618621] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:51.793 [2024-07-16 00:15:38.618639] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:51.793 pt2 00:20:51.793 00:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:51.793 00:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:51.793 00:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:52.052 [2024-07-16 00:15:38.870797] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:52.052 [2024-07-16 00:15:38.870827] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:52.052 [2024-07-16 00:15:38.870842] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26000f0 00:20:52.052 [2024-07-16 00:15:38.870854] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:52.052 [2024-07-16 00:15:38.871146] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:52.052 [2024-07-16 00:15:38.871171] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:52.053 [2024-07-16 00:15:38.871222] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:20:52.053 [2024-07-16 00:15:38.871240] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:52.053 pt3 00:20:52.053 00:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:52.053 00:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:52.053 00:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:52.311 [2024-07-16 00:15:39.119460] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:52.311 [2024-07-16 00:15:39.119498] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:52.311 [2024-07-16 00:15:39.119515] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2608af0 00:20:52.311 [2024-07-16 00:15:39.119529] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:52.311 [2024-07-16 00:15:39.119824] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:52.311 [2024-07-16 00:15:39.119841] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:52.311 [2024-07-16 00:15:39.119894] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:20:52.311 [2024-07-16 00:15:39.119912] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:52.311 [2024-07-16 00:15:39.120038] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26028f0 00:20:52.311 [2024-07-16 00:15:39.120049] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:52.311 [2024-07-16 00:15:39.120214] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2602150 00:20:52.311 [2024-07-16 00:15:39.120340] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26028f0 00:20:52.311 [2024-07-16 00:15:39.120350] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26028f0 00:20:52.311 [2024-07-16 00:15:39.120446] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:52.311 pt4 00:20:52.311 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 
00:20:52.311 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:52.311 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:52.311 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:52.311 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:52.311 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:52.311 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:52.311 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:52.311 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:52.311 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:52.311 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:52.311 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:52.311 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.311 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:52.571 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:52.571 "name": "raid_bdev1", 00:20:52.571 "uuid": "381d4c37-0188-4d75-a10a-d6fa52444426", 00:20:52.571 "strip_size_kb": 64, 00:20:52.571 "state": "online", 00:20:52.571 "raid_level": "concat", 00:20:52.571 "superblock": true, 00:20:52.571 "num_base_bdevs": 4, 00:20:52.571 "num_base_bdevs_discovered": 4, 00:20:52.571 "num_base_bdevs_operational": 4, 
00:20:52.571 "base_bdevs_list": [ 00:20:52.571 { 00:20:52.571 "name": "pt1", 00:20:52.571 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:52.571 "is_configured": true, 00:20:52.571 "data_offset": 2048, 00:20:52.571 "data_size": 63488 00:20:52.571 }, 00:20:52.571 { 00:20:52.571 "name": "pt2", 00:20:52.571 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:52.571 "is_configured": true, 00:20:52.571 "data_offset": 2048, 00:20:52.571 "data_size": 63488 00:20:52.571 }, 00:20:52.571 { 00:20:52.571 "name": "pt3", 00:20:52.571 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:52.571 "is_configured": true, 00:20:52.571 "data_offset": 2048, 00:20:52.571 "data_size": 63488 00:20:52.571 }, 00:20:52.571 { 00:20:52.571 "name": "pt4", 00:20:52.571 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:52.571 "is_configured": true, 00:20:52.571 "data_offset": 2048, 00:20:52.571 "data_size": 63488 00:20:52.571 } 00:20:52.571 ] 00:20:52.571 }' 00:20:52.571 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:52.571 00:15:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:53.138 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:20:53.138 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:53.138 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:53.138 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:53.138 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:53.138 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:53.138 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:53.138 00:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:53.397 [2024-07-16 00:15:40.166689] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:53.397 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:53.397 "name": "raid_bdev1", 00:20:53.397 "aliases": [ 00:20:53.397 "381d4c37-0188-4d75-a10a-d6fa52444426" 00:20:53.397 ], 00:20:53.397 "product_name": "Raid Volume", 00:20:53.397 "block_size": 512, 00:20:53.397 "num_blocks": 253952, 00:20:53.397 "uuid": "381d4c37-0188-4d75-a10a-d6fa52444426", 00:20:53.397 "assigned_rate_limits": { 00:20:53.397 "rw_ios_per_sec": 0, 00:20:53.397 "rw_mbytes_per_sec": 0, 00:20:53.397 "r_mbytes_per_sec": 0, 00:20:53.397 "w_mbytes_per_sec": 0 00:20:53.397 }, 00:20:53.397 "claimed": false, 00:20:53.397 "zoned": false, 00:20:53.397 "supported_io_types": { 00:20:53.397 "read": true, 00:20:53.397 "write": true, 00:20:53.397 "unmap": true, 00:20:53.397 "flush": true, 00:20:53.397 "reset": true, 00:20:53.397 "nvme_admin": false, 00:20:53.397 "nvme_io": false, 00:20:53.397 "nvme_io_md": false, 00:20:53.397 "write_zeroes": true, 00:20:53.397 "zcopy": false, 00:20:53.397 "get_zone_info": false, 00:20:53.397 "zone_management": false, 00:20:53.397 "zone_append": false, 00:20:53.397 "compare": false, 00:20:53.397 "compare_and_write": false, 00:20:53.397 "abort": false, 00:20:53.397 "seek_hole": false, 00:20:53.397 "seek_data": false, 00:20:53.397 "copy": false, 00:20:53.397 "nvme_iov_md": false 00:20:53.397 }, 00:20:53.397 "memory_domains": [ 00:20:53.397 { 00:20:53.397 "dma_device_id": "system", 00:20:53.397 "dma_device_type": 1 00:20:53.397 }, 00:20:53.397 { 00:20:53.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.397 "dma_device_type": 2 00:20:53.397 }, 00:20:53.397 { 00:20:53.397 "dma_device_id": "system", 00:20:53.397 "dma_device_type": 1 00:20:53.397 }, 00:20:53.397 { 00:20:53.397 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.397 "dma_device_type": 2 00:20:53.397 }, 00:20:53.397 { 00:20:53.397 "dma_device_id": "system", 00:20:53.397 "dma_device_type": 1 00:20:53.397 }, 00:20:53.397 { 00:20:53.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.397 "dma_device_type": 2 00:20:53.397 }, 00:20:53.397 { 00:20:53.397 "dma_device_id": "system", 00:20:53.397 "dma_device_type": 1 00:20:53.397 }, 00:20:53.397 { 00:20:53.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.397 "dma_device_type": 2 00:20:53.397 } 00:20:53.397 ], 00:20:53.397 "driver_specific": { 00:20:53.397 "raid": { 00:20:53.397 "uuid": "381d4c37-0188-4d75-a10a-d6fa52444426", 00:20:53.397 "strip_size_kb": 64, 00:20:53.397 "state": "online", 00:20:53.397 "raid_level": "concat", 00:20:53.397 "superblock": true, 00:20:53.397 "num_base_bdevs": 4, 00:20:53.397 "num_base_bdevs_discovered": 4, 00:20:53.397 "num_base_bdevs_operational": 4, 00:20:53.397 "base_bdevs_list": [ 00:20:53.397 { 00:20:53.397 "name": "pt1", 00:20:53.397 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:53.397 "is_configured": true, 00:20:53.397 "data_offset": 2048, 00:20:53.397 "data_size": 63488 00:20:53.397 }, 00:20:53.397 { 00:20:53.397 "name": "pt2", 00:20:53.397 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:53.397 "is_configured": true, 00:20:53.397 "data_offset": 2048, 00:20:53.397 "data_size": 63488 00:20:53.397 }, 00:20:53.397 { 00:20:53.397 "name": "pt3", 00:20:53.397 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:53.397 "is_configured": true, 00:20:53.397 "data_offset": 2048, 00:20:53.397 "data_size": 63488 00:20:53.397 }, 00:20:53.397 { 00:20:53.397 "name": "pt4", 00:20:53.397 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:53.397 "is_configured": true, 00:20:53.397 "data_offset": 2048, 00:20:53.397 "data_size": 63488 00:20:53.397 } 00:20:53.397 ] 00:20:53.397 } 00:20:53.397 } 00:20:53.397 }' 00:20:53.397 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- 
# jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:53.397 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:53.397 pt2 00:20:53.397 pt3 00:20:53.397 pt4' 00:20:53.397 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:53.397 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:53.397 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:53.656 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:53.656 "name": "pt1", 00:20:53.656 "aliases": [ 00:20:53.656 "00000000-0000-0000-0000-000000000001" 00:20:53.656 ], 00:20:53.656 "product_name": "passthru", 00:20:53.656 "block_size": 512, 00:20:53.656 "num_blocks": 65536, 00:20:53.656 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:53.656 "assigned_rate_limits": { 00:20:53.656 "rw_ios_per_sec": 0, 00:20:53.656 "rw_mbytes_per_sec": 0, 00:20:53.656 "r_mbytes_per_sec": 0, 00:20:53.656 "w_mbytes_per_sec": 0 00:20:53.656 }, 00:20:53.656 "claimed": true, 00:20:53.656 "claim_type": "exclusive_write", 00:20:53.656 "zoned": false, 00:20:53.656 "supported_io_types": { 00:20:53.656 "read": true, 00:20:53.656 "write": true, 00:20:53.656 "unmap": true, 00:20:53.656 "flush": true, 00:20:53.656 "reset": true, 00:20:53.656 "nvme_admin": false, 00:20:53.656 "nvme_io": false, 00:20:53.656 "nvme_io_md": false, 00:20:53.656 "write_zeroes": true, 00:20:53.656 "zcopy": true, 00:20:53.656 "get_zone_info": false, 00:20:53.656 "zone_management": false, 00:20:53.656 "zone_append": false, 00:20:53.656 "compare": false, 00:20:53.656 "compare_and_write": false, 00:20:53.656 "abort": true, 00:20:53.656 "seek_hole": false, 00:20:53.656 "seek_data": false, 00:20:53.656 "copy": true, 00:20:53.656 
"nvme_iov_md": false 00:20:53.656 }, 00:20:53.656 "memory_domains": [ 00:20:53.656 { 00:20:53.656 "dma_device_id": "system", 00:20:53.656 "dma_device_type": 1 00:20:53.656 }, 00:20:53.656 { 00:20:53.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.656 "dma_device_type": 2 00:20:53.656 } 00:20:53.656 ], 00:20:53.656 "driver_specific": { 00:20:53.656 "passthru": { 00:20:53.656 "name": "pt1", 00:20:53.656 "base_bdev_name": "malloc1" 00:20:53.656 } 00:20:53.656 } 00:20:53.656 }' 00:20:53.656 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:53.656 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:53.656 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:53.656 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:53.915 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:53.915 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:53.915 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:53.915 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:53.915 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:53.915 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:53.915 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:54.174 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:54.174 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:54.174 00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:54.174 
00:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:54.174 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:54.174 "name": "pt2", 00:20:54.174 "aliases": [ 00:20:54.174 "00000000-0000-0000-0000-000000000002" 00:20:54.174 ], 00:20:54.174 "product_name": "passthru", 00:20:54.174 "block_size": 512, 00:20:54.174 "num_blocks": 65536, 00:20:54.174 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:54.174 "assigned_rate_limits": { 00:20:54.174 "rw_ios_per_sec": 0, 00:20:54.174 "rw_mbytes_per_sec": 0, 00:20:54.174 "r_mbytes_per_sec": 0, 00:20:54.174 "w_mbytes_per_sec": 0 00:20:54.174 }, 00:20:54.174 "claimed": true, 00:20:54.174 "claim_type": "exclusive_write", 00:20:54.174 "zoned": false, 00:20:54.174 "supported_io_types": { 00:20:54.174 "read": true, 00:20:54.174 "write": true, 00:20:54.174 "unmap": true, 00:20:54.174 "flush": true, 00:20:54.174 "reset": true, 00:20:54.174 "nvme_admin": false, 00:20:54.174 "nvme_io": false, 00:20:54.174 "nvme_io_md": false, 00:20:54.174 "write_zeroes": true, 00:20:54.174 "zcopy": true, 00:20:54.174 "get_zone_info": false, 00:20:54.174 "zone_management": false, 00:20:54.174 "zone_append": false, 00:20:54.174 "compare": false, 00:20:54.174 "compare_and_write": false, 00:20:54.174 "abort": true, 00:20:54.174 "seek_hole": false, 00:20:54.174 "seek_data": false, 00:20:54.174 "copy": true, 00:20:54.174 "nvme_iov_md": false 00:20:54.174 }, 00:20:54.174 "memory_domains": [ 00:20:54.174 { 00:20:54.174 "dma_device_id": "system", 00:20:54.174 "dma_device_type": 1 00:20:54.174 }, 00:20:54.174 { 00:20:54.174 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.174 "dma_device_type": 2 00:20:54.174 } 00:20:54.174 ], 00:20:54.174 "driver_specific": { 00:20:54.174 "passthru": { 00:20:54.174 "name": "pt2", 00:20:54.174 "base_bdev_name": "malloc2" 00:20:54.174 } 00:20:54.174 } 00:20:54.174 }' 00:20:54.174 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:20:54.174 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.433 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:54.433 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:54.433 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:54.433 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:54.433 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:54.433 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:54.433 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:54.694 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:54.694 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:54.694 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:54.694 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:54.694 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:54.694 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:54.952 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:54.952 "name": "pt3", 00:20:54.953 "aliases": [ 00:20:54.953 "00000000-0000-0000-0000-000000000003" 00:20:54.953 ], 00:20:54.953 "product_name": "passthru", 00:20:54.953 "block_size": 512, 00:20:54.953 "num_blocks": 65536, 00:20:54.953 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:54.953 "assigned_rate_limits": { 00:20:54.953 "rw_ios_per_sec": 0, 00:20:54.953 "rw_mbytes_per_sec": 0, 
00:20:54.953 "r_mbytes_per_sec": 0, 00:20:54.953 "w_mbytes_per_sec": 0 00:20:54.953 }, 00:20:54.953 "claimed": true, 00:20:54.953 "claim_type": "exclusive_write", 00:20:54.953 "zoned": false, 00:20:54.953 "supported_io_types": { 00:20:54.953 "read": true, 00:20:54.953 "write": true, 00:20:54.953 "unmap": true, 00:20:54.953 "flush": true, 00:20:54.953 "reset": true, 00:20:54.953 "nvme_admin": false, 00:20:54.953 "nvme_io": false, 00:20:54.953 "nvme_io_md": false, 00:20:54.953 "write_zeroes": true, 00:20:54.953 "zcopy": true, 00:20:54.953 "get_zone_info": false, 00:20:54.953 "zone_management": false, 00:20:54.953 "zone_append": false, 00:20:54.953 "compare": false, 00:20:54.953 "compare_and_write": false, 00:20:54.953 "abort": true, 00:20:54.953 "seek_hole": false, 00:20:54.953 "seek_data": false, 00:20:54.953 "copy": true, 00:20:54.953 "nvme_iov_md": false 00:20:54.953 }, 00:20:54.953 "memory_domains": [ 00:20:54.953 { 00:20:54.953 "dma_device_id": "system", 00:20:54.953 "dma_device_type": 1 00:20:54.953 }, 00:20:54.953 { 00:20:54.953 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.953 "dma_device_type": 2 00:20:54.953 } 00:20:54.953 ], 00:20:54.953 "driver_specific": { 00:20:54.953 "passthru": { 00:20:54.953 "name": "pt3", 00:20:54.953 "base_bdev_name": "malloc3" 00:20:54.953 } 00:20:54.953 } 00:20:54.953 }' 00:20:54.953 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.953 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:55.211 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:55.211 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.211 00:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.211 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:55.211 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:20:55.211 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.211 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:55.211 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.211 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.471 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:55.471 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:55.471 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:55.471 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:55.729 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:55.729 "name": "pt4", 00:20:55.729 "aliases": [ 00:20:55.729 "00000000-0000-0000-0000-000000000004" 00:20:55.729 ], 00:20:55.729 "product_name": "passthru", 00:20:55.729 "block_size": 512, 00:20:55.729 "num_blocks": 65536, 00:20:55.729 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:55.729 "assigned_rate_limits": { 00:20:55.729 "rw_ios_per_sec": 0, 00:20:55.729 "rw_mbytes_per_sec": 0, 00:20:55.729 "r_mbytes_per_sec": 0, 00:20:55.729 "w_mbytes_per_sec": 0 00:20:55.729 }, 00:20:55.729 "claimed": true, 00:20:55.729 "claim_type": "exclusive_write", 00:20:55.729 "zoned": false, 00:20:55.729 "supported_io_types": { 00:20:55.729 "read": true, 00:20:55.729 "write": true, 00:20:55.729 "unmap": true, 00:20:55.729 "flush": true, 00:20:55.729 "reset": true, 00:20:55.729 "nvme_admin": false, 00:20:55.729 "nvme_io": false, 00:20:55.729 "nvme_io_md": false, 00:20:55.729 "write_zeroes": true, 00:20:55.729 "zcopy": true, 00:20:55.729 "get_zone_info": false, 00:20:55.729 
"zone_management": false, 00:20:55.729 "zone_append": false, 00:20:55.729 "compare": false, 00:20:55.729 "compare_and_write": false, 00:20:55.729 "abort": true, 00:20:55.729 "seek_hole": false, 00:20:55.729 "seek_data": false, 00:20:55.729 "copy": true, 00:20:55.729 "nvme_iov_md": false 00:20:55.729 }, 00:20:55.729 "memory_domains": [ 00:20:55.729 { 00:20:55.729 "dma_device_id": "system", 00:20:55.729 "dma_device_type": 1 00:20:55.729 }, 00:20:55.729 { 00:20:55.729 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:55.729 "dma_device_type": 2 00:20:55.729 } 00:20:55.729 ], 00:20:55.729 "driver_specific": { 00:20:55.729 "passthru": { 00:20:55.729 "name": "pt4", 00:20:55.729 "base_bdev_name": "malloc4" 00:20:55.729 } 00:20:55.729 } 00:20:55.729 }' 00:20:55.729 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:55.729 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:55.729 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:55.729 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.729 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.729 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:55.729 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.729 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.988 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:55.988 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.988 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.988 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:55.988 00:15:42 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:55.988 00:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:20:56.554 [2024-07-16 00:15:43.274938] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:56.554 00:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 381d4c37-0188-4d75-a10a-d6fa52444426 '!=' 381d4c37-0188-4d75-a10a-d6fa52444426 ']' 00:20:56.554 00:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:20:56.554 00:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:56.554 00:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:56.554 00:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 3576865 00:20:56.554 00:15:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 3576865 ']' 00:20:56.554 00:15:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 3576865 00:20:56.554 00:15:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:20:56.554 00:15:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:56.554 00:15:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3576865 00:20:56.554 00:15:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:56.554 00:15:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:56.554 00:15:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3576865' 00:20:56.554 killing process with pid 3576865 00:20:56.554 00:15:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 3576865 
00:20:56.554 [2024-07-16 00:15:43.357976] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:56.554 [2024-07-16 00:15:43.358039] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:56.554 [2024-07-16 00:15:43.358100] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:56.554 [2024-07-16 00:15:43.358112] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26028f0 name raid_bdev1, state offline 00:20:56.554 00:15:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 3576865 00:20:56.554 [2024-07-16 00:15:43.394637] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:56.839 00:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:20:56.839 00:20:56.839 real 0m17.595s 00:20:56.839 user 0m31.919s 00:20:56.839 sys 0m3.032s 00:20:56.839 00:15:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:56.839 00:15:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:56.839 ************************************ 00:20:56.839 END TEST raid_superblock_test 00:20:56.839 ************************************ 00:20:56.839 00:15:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:56.839 00:15:43 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:20:56.839 00:15:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:56.839 00:15:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:56.839 00:15:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:56.839 ************************************ 00:20:56.839 START TEST raid_read_error_test 00:20:56.839 ************************************ 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 read 00:20:56.839 
00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:56.839 00:15:43 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ht8Ow5eq7r 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3579463 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3579463 /var/tmp/spdk-raid.sock 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 3579463 ']' 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and 
listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:56.839 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:56.839 00:15:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:56.839 [2024-07-16 00:15:43.763733] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:20:56.839 [2024-07-16 00:15:43.763802] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3579463 ] 00:20:57.107 [2024-07-16 00:15:43.893819] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:57.107 [2024-07-16 00:15:44.000822] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:57.365 [2024-07-16 00:15:44.073117] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:57.366 [2024-07-16 00:15:44.073154] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:57.933 00:15:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:57.933 00:15:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:57.933 00:15:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:57.933 00:15:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:58.501 BaseBdev1_malloc 00:20:58.501 00:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 
00:20:58.501 true 00:20:58.501 00:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:59.068 [2024-07-16 00:15:45.929171] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:59.068 [2024-07-16 00:15:45.929220] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:59.068 [2024-07-16 00:15:45.929243] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb6d0d0 00:20:59.068 [2024-07-16 00:15:45.929256] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:59.068 [2024-07-16 00:15:45.931172] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:59.068 [2024-07-16 00:15:45.931201] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:59.068 BaseBdev1 00:20:59.068 00:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:59.068 00:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:59.326 BaseBdev2_malloc 00:20:59.326 00:15:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:59.893 true 00:20:59.893 00:15:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:00.152 [2024-07-16 00:15:46.953689] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:00.152 [2024-07-16 00:15:46.953735] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:00.152 [2024-07-16 00:15:46.953758] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb71910 00:21:00.152 [2024-07-16 00:15:46.953771] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:00.152 [2024-07-16 00:15:46.955414] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:00.152 [2024-07-16 00:15:46.955442] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:00.152 BaseBdev2 00:21:00.152 00:15:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:00.152 00:15:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:00.719 BaseBdev3_malloc 00:21:00.719 00:15:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:00.978 true 00:21:00.978 00:15:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:01.237 [2024-07-16 00:15:47.974185] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:01.237 [2024-07-16 00:15:47.974228] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:01.237 [2024-07-16 00:15:47.974249] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb73bd0 00:21:01.237 [2024-07-16 00:15:47.974261] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:01.237 [2024-07-16 00:15:47.975712] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:21:01.237 [2024-07-16 00:15:47.975739] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:01.237 BaseBdev3 00:21:01.237 00:15:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:01.237 00:15:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:01.496 BaseBdev4_malloc 00:21:01.496 00:15:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:01.755 true 00:21:01.755 00:15:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:02.014 [2024-07-16 00:15:48.744897] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:02.014 [2024-07-16 00:15:48.744948] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:02.014 [2024-07-16 00:15:48.744971] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb74aa0 00:21:02.014 [2024-07-16 00:15:48.744984] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:02.014 [2024-07-16 00:15:48.746576] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:02.014 [2024-07-16 00:15:48.746604] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:02.014 BaseBdev4 00:21:02.014 00:15:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 
BaseBdev4' -n raid_bdev1 -s 00:21:02.273 [2024-07-16 00:15:48.989583] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:02.273 [2024-07-16 00:15:48.990984] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:02.273 [2024-07-16 00:15:48.991054] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:02.273 [2024-07-16 00:15:48.991115] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:02.273 [2024-07-16 00:15:48.991352] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb6ec20 00:21:02.273 [2024-07-16 00:15:48.991363] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:02.273 [2024-07-16 00:15:48.991569] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9c3260 00:21:02.273 [2024-07-16 00:15:48.991723] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb6ec20 00:21:02.273 [2024-07-16 00:15:48.991732] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb6ec20 00:21:02.273 [2024-07-16 00:15:48.991842] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:02.273 00:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:02.273 00:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:02.273 00:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:02.273 00:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:02.273 00:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:02.273 00:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:02.273 00:15:49 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:02.273 00:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:02.273 00:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:02.273 00:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:02.273 00:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.273 00:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:02.532 00:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:02.532 "name": "raid_bdev1", 00:21:02.532 "uuid": "6adb781f-40ea-4e16-a553-41817939ad95", 00:21:02.532 "strip_size_kb": 64, 00:21:02.532 "state": "online", 00:21:02.532 "raid_level": "concat", 00:21:02.532 "superblock": true, 00:21:02.532 "num_base_bdevs": 4, 00:21:02.532 "num_base_bdevs_discovered": 4, 00:21:02.532 "num_base_bdevs_operational": 4, 00:21:02.532 "base_bdevs_list": [ 00:21:02.532 { 00:21:02.532 "name": "BaseBdev1", 00:21:02.532 "uuid": "78c54c2a-6527-54db-a093-c8b64f672d53", 00:21:02.532 "is_configured": true, 00:21:02.532 "data_offset": 2048, 00:21:02.532 "data_size": 63488 00:21:02.532 }, 00:21:02.532 { 00:21:02.532 "name": "BaseBdev2", 00:21:02.532 "uuid": "e3c51eef-7f84-5541-b828-10bea364caaf", 00:21:02.532 "is_configured": true, 00:21:02.532 "data_offset": 2048, 00:21:02.532 "data_size": 63488 00:21:02.532 }, 00:21:02.532 { 00:21:02.532 "name": "BaseBdev3", 00:21:02.532 "uuid": "963a85ab-87a8-5e9b-9a9f-cb3ecfabf0fc", 00:21:02.532 "is_configured": true, 00:21:02.532 "data_offset": 2048, 00:21:02.532 "data_size": 63488 00:21:02.532 }, 00:21:02.532 { 00:21:02.532 "name": "BaseBdev4", 00:21:02.532 "uuid": 
"96f8270d-7d5d-5bb9-95d4-b999386eaa6a", 00:21:02.532 "is_configured": true, 00:21:02.532 "data_offset": 2048, 00:21:02.532 "data_size": 63488 00:21:02.532 } 00:21:02.532 ] 00:21:02.532 }' 00:21:02.532 00:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:02.532 00:15:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:03.099 00:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:21:03.099 00:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:03.099 [2024-07-16 00:15:50.000536] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb60fc0 00:21:04.036 00:15:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:21:04.295 00:15:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:21:04.295 00:15:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:21:04.295 00:15:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:21:04.295 00:15:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:04.295 00:15:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:04.295 00:15:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:04.295 00:15:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:04.295 00:15:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:04.295 00:15:51 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:04.295 00:15:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:04.295 00:15:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:04.295 00:15:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:04.295 00:15:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:04.295 00:15:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.295 00:15:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:04.554 00:15:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:04.554 "name": "raid_bdev1", 00:21:04.554 "uuid": "6adb781f-40ea-4e16-a553-41817939ad95", 00:21:04.554 "strip_size_kb": 64, 00:21:04.554 "state": "online", 00:21:04.554 "raid_level": "concat", 00:21:04.554 "superblock": true, 00:21:04.554 "num_base_bdevs": 4, 00:21:04.554 "num_base_bdevs_discovered": 4, 00:21:04.554 "num_base_bdevs_operational": 4, 00:21:04.554 "base_bdevs_list": [ 00:21:04.554 { 00:21:04.554 "name": "BaseBdev1", 00:21:04.554 "uuid": "78c54c2a-6527-54db-a093-c8b64f672d53", 00:21:04.554 "is_configured": true, 00:21:04.554 "data_offset": 2048, 00:21:04.554 "data_size": 63488 00:21:04.554 }, 00:21:04.554 { 00:21:04.554 "name": "BaseBdev2", 00:21:04.554 "uuid": "e3c51eef-7f84-5541-b828-10bea364caaf", 00:21:04.554 "is_configured": true, 00:21:04.554 "data_offset": 2048, 00:21:04.554 "data_size": 63488 00:21:04.554 }, 00:21:04.554 { 00:21:04.554 "name": "BaseBdev3", 00:21:04.554 "uuid": "963a85ab-87a8-5e9b-9a9f-cb3ecfabf0fc", 00:21:04.554 "is_configured": true, 00:21:04.554 "data_offset": 2048, 00:21:04.554 "data_size": 63488 00:21:04.554 }, 00:21:04.554 { 
00:21:04.554 "name": "BaseBdev4", 00:21:04.554 "uuid": "96f8270d-7d5d-5bb9-95d4-b999386eaa6a", 00:21:04.554 "is_configured": true, 00:21:04.554 "data_offset": 2048, 00:21:04.554 "data_size": 63488 00:21:04.554 } 00:21:04.554 ] 00:21:04.554 }' 00:21:04.554 00:15:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:04.554 00:15:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:05.122 00:15:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:05.381 [2024-07-16 00:15:52.242370] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:05.381 [2024-07-16 00:15:52.242402] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:05.381 [2024-07-16 00:15:52.245564] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:05.381 [2024-07-16 00:15:52.245602] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:05.381 [2024-07-16 00:15:52.245642] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:05.381 [2024-07-16 00:15:52.245653] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb6ec20 name raid_bdev1, state offline 00:21:05.381 0 00:21:05.381 00:15:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3579463 00:21:05.381 00:15:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 3579463 ']' 00:21:05.382 00:15:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 3579463 00:21:05.382 00:15:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:21:05.382 00:15:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:05.382 00:15:52 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3579463 00:21:05.382 00:15:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:05.382 00:15:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:05.382 00:15:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3579463' 00:21:05.382 killing process with pid 3579463 00:21:05.382 00:15:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 3579463 00:21:05.382 [2024-07-16 00:15:52.331090] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:05.382 00:15:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 3579463 00:21:05.641 [2024-07-16 00:15:52.361864] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:05.641 00:15:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ht8Ow5eq7r 00:21:05.641 00:15:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:21:05.641 00:15:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:21:05.641 00:15:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:21:05.641 00:15:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:21:05.900 00:15:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:05.900 00:15:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:05.900 00:15:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:21:05.900 00:21:05.900 real 0m8.907s 00:21:05.900 user 0m14.541s 00:21:05.900 sys 0m1.549s 00:21:05.900 00:15:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:05.900 00:15:52 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:21:05.900 ************************************ 00:21:05.900 END TEST raid_read_error_test 00:21:05.900 ************************************ 00:21:05.900 00:15:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:05.900 00:15:52 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:21:05.900 00:15:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:05.900 00:15:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:05.900 00:15:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:05.900 ************************************ 00:21:05.900 START TEST raid_write_error_test 00:21:05.900 ************************************ 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 write 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.yOZE02zXq0 00:21:05.900 00:15:52 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3580780 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3580780 /var/tmp/spdk-raid.sock 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 3580780 ']' 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:05.900 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:05.900 00:15:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:05.900 [2024-07-16 00:15:52.764863] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:21:05.900 [2024-07-16 00:15:52.764940] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3580780 ] 00:21:06.159 [2024-07-16 00:15:52.895073] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:06.159 [2024-07-16 00:15:52.995653] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:06.159 [2024-07-16 00:15:53.062357] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:06.159 [2024-07-16 00:15:53.062398] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:07.092 00:15:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:07.092 00:15:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:21:07.092 00:15:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:07.092 00:15:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:07.092 BaseBdev1_malloc 00:21:07.092 00:15:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:07.350 true 00:21:07.350 00:15:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:07.641 [2024-07-16 00:15:54.427967] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:07.641 [2024-07-16 00:15:54.428015] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:21:07.641 [2024-07-16 00:15:54.428035] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22dd0d0 00:21:07.641 [2024-07-16 00:15:54.428047] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:07.641 [2024-07-16 00:15:54.429796] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:07.641 [2024-07-16 00:15:54.429825] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:07.641 BaseBdev1 00:21:07.641 00:15:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:07.641 00:15:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:07.898 BaseBdev2_malloc 00:21:07.898 00:15:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:08.156 true 00:21:08.156 00:15:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:08.415 [2024-07-16 00:15:55.170697] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:08.415 [2024-07-16 00:15:55.170746] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:08.415 [2024-07-16 00:15:55.170764] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22e1910 00:21:08.415 [2024-07-16 00:15:55.170776] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:08.415 [2024-07-16 00:15:55.172217] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:08.415 [2024-07-16 00:15:55.172245] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:08.415 BaseBdev2 00:21:08.415 00:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:08.415 00:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:08.673 BaseBdev3_malloc 00:21:08.673 00:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:08.930 true 00:21:08.931 00:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:09.189 [2024-07-16 00:15:55.913287] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:09.189 [2024-07-16 00:15:55.913330] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:09.189 [2024-07-16 00:15:55.913348] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22e3bd0 00:21:09.189 [2024-07-16 00:15:55.913361] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:09.189 [2024-07-16 00:15:55.914743] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:09.189 [2024-07-16 00:15:55.914769] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:09.189 BaseBdev3 00:21:09.189 00:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:09.189 00:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:09.449 BaseBdev4_malloc 00:21:09.449 00:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:09.708 true 00:21:09.708 00:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:09.708 [2024-07-16 00:15:56.655787] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:09.708 [2024-07-16 00:15:56.655829] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:09.708 [2024-07-16 00:15:56.655849] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22e4aa0 00:21:09.708 [2024-07-16 00:15:56.655862] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:09.708 [2024-07-16 00:15:56.657330] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:09.708 [2024-07-16 00:15:56.657357] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:09.967 BaseBdev4 00:21:09.967 00:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:21:09.967 [2024-07-16 00:15:56.900458] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:09.967 [2024-07-16 00:15:56.901618] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:09.967 [2024-07-16 00:15:56.901683] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:09.967 [2024-07-16 00:15:56.901743] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:09.967 [2024-07-16 00:15:56.901977] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22dec20 00:21:09.967 [2024-07-16 00:15:56.901988] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:09.967 [2024-07-16 00:15:56.902172] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2133260 00:21:09.967 [2024-07-16 00:15:56.902317] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22dec20 00:21:09.967 [2024-07-16 00:15:56.902327] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22dec20 00:21:09.967 [2024-07-16 00:15:56.902423] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:10.226 00:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:10.226 00:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:10.226 00:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:10.226 00:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:10.226 00:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:10.226 00:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:10.226 00:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:10.226 00:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:10.226 00:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:10.226 00:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:10.226 00:15:56 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.226 00:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:10.486 00:15:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:10.486 "name": "raid_bdev1", 00:21:10.486 "uuid": "1d6c7f8d-9f83-4d08-9b95-6ea3c5b85dd1", 00:21:10.486 "strip_size_kb": 64, 00:21:10.486 "state": "online", 00:21:10.486 "raid_level": "concat", 00:21:10.486 "superblock": true, 00:21:10.486 "num_base_bdevs": 4, 00:21:10.486 "num_base_bdevs_discovered": 4, 00:21:10.486 "num_base_bdevs_operational": 4, 00:21:10.486 "base_bdevs_list": [ 00:21:10.486 { 00:21:10.486 "name": "BaseBdev1", 00:21:10.486 "uuid": "a9c87c97-d574-51a4-a1fd-d951aac72c47", 00:21:10.486 "is_configured": true, 00:21:10.486 "data_offset": 2048, 00:21:10.486 "data_size": 63488 00:21:10.486 }, 00:21:10.486 { 00:21:10.486 "name": "BaseBdev2", 00:21:10.486 "uuid": "1f31ea39-5d89-511e-bfee-db956a37ea82", 00:21:10.486 "is_configured": true, 00:21:10.486 "data_offset": 2048, 00:21:10.486 "data_size": 63488 00:21:10.486 }, 00:21:10.486 { 00:21:10.486 "name": "BaseBdev3", 00:21:10.486 "uuid": "05f7a244-67b1-5638-92cf-9a595da93de2", 00:21:10.486 "is_configured": true, 00:21:10.486 "data_offset": 2048, 00:21:10.486 "data_size": 63488 00:21:10.486 }, 00:21:10.486 { 00:21:10.486 "name": "BaseBdev4", 00:21:10.486 "uuid": "5b1abbec-6361-5e11-be4c-d860e12fbd18", 00:21:10.486 "is_configured": true, 00:21:10.486 "data_offset": 2048, 00:21:10.486 "data_size": 63488 00:21:10.486 } 00:21:10.486 ] 00:21:10.486 }' 00:21:10.486 00:15:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:10.486 00:15:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:11.054 00:15:57 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:21:11.054 00:15:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:11.054 [2024-07-16 00:15:57.895386] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22d0fc0 00:21:12.043 00:15:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:21:12.302 00:15:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:21:12.302 00:15:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:21:12.302 00:15:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:21:12.302 00:15:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:12.302 00:15:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:12.302 00:15:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:12.302 00:15:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:12.302 00:15:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:12.302 00:15:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:12.302 00:15:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:12.302 00:15:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:12.302 00:15:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:12.302 00:15:59 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:12.302 00:15:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.302 00:15:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:12.561 00:15:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:12.561 "name": "raid_bdev1", 00:21:12.561 "uuid": "1d6c7f8d-9f83-4d08-9b95-6ea3c5b85dd1", 00:21:12.561 "strip_size_kb": 64, 00:21:12.561 "state": "online", 00:21:12.561 "raid_level": "concat", 00:21:12.561 "superblock": true, 00:21:12.561 "num_base_bdevs": 4, 00:21:12.561 "num_base_bdevs_discovered": 4, 00:21:12.561 "num_base_bdevs_operational": 4, 00:21:12.561 "base_bdevs_list": [ 00:21:12.561 { 00:21:12.561 "name": "BaseBdev1", 00:21:12.561 "uuid": "a9c87c97-d574-51a4-a1fd-d951aac72c47", 00:21:12.561 "is_configured": true, 00:21:12.561 "data_offset": 2048, 00:21:12.561 "data_size": 63488 00:21:12.561 }, 00:21:12.561 { 00:21:12.561 "name": "BaseBdev2", 00:21:12.561 "uuid": "1f31ea39-5d89-511e-bfee-db956a37ea82", 00:21:12.561 "is_configured": true, 00:21:12.561 "data_offset": 2048, 00:21:12.561 "data_size": 63488 00:21:12.561 }, 00:21:12.561 { 00:21:12.561 "name": "BaseBdev3", 00:21:12.561 "uuid": "05f7a244-67b1-5638-92cf-9a595da93de2", 00:21:12.562 "is_configured": true, 00:21:12.562 "data_offset": 2048, 00:21:12.562 "data_size": 63488 00:21:12.562 }, 00:21:12.562 { 00:21:12.562 "name": "BaseBdev4", 00:21:12.562 "uuid": "5b1abbec-6361-5e11-be4c-d860e12fbd18", 00:21:12.562 "is_configured": true, 00:21:12.562 "data_offset": 2048, 00:21:12.562 "data_size": 63488 00:21:12.562 } 00:21:12.562 ] 00:21:12.562 }' 00:21:12.562 00:15:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:12.562 00:15:59 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:21:13.130 00:15:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:13.130 [2024-07-16 00:16:00.060282] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:13.130 [2024-07-16 00:16:00.060323] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:13.130 [2024-07-16 00:16:00.063481] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:13.130 [2024-07-16 00:16:00.063521] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:13.130 [2024-07-16 00:16:00.063560] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:13.130 [2024-07-16 00:16:00.063571] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22dec20 name raid_bdev1, state offline 00:21:13.130 0 00:21:13.389 00:16:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3580780 00:21:13.389 00:16:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 3580780 ']' 00:21:13.389 00:16:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 3580780 00:21:13.389 00:16:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:21:13.389 00:16:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:13.389 00:16:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3580780 00:21:13.389 00:16:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:13.389 00:16:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:13.389 00:16:00 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 3580780' 00:21:13.389 killing process with pid 3580780 00:21:13.389 00:16:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 3580780 00:21:13.389 [2024-07-16 00:16:00.132590] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:13.389 00:16:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 3580780 00:21:13.389 [2024-07-16 00:16:00.163752] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:13.649 00:16:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.yOZE02zXq0 00:21:13.649 00:16:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:21:13.649 00:16:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:21:13.649 00:16:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:21:13.649 00:16:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:21:13.649 00:16:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:13.649 00:16:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:13.649 00:16:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:21:13.649 00:21:13.649 real 0m7.712s 00:21:13.649 user 0m12.391s 00:21:13.649 sys 0m1.335s 00:21:13.649 00:16:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:13.649 00:16:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:13.649 ************************************ 00:21:13.649 END TEST raid_write_error_test 00:21:13.649 ************************************ 00:21:13.649 00:16:00 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:13.649 00:16:00 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:21:13.649 
00:16:00 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:21:13.649 00:16:00 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:13.649 00:16:00 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:13.649 00:16:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:13.649 ************************************ 00:21:13.649 START TEST raid_state_function_test 00:21:13.649 ************************************ 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 false 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=3581860 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3581860' 00:21:13.649 Process raid pid: 3581860 00:21:13.649 00:16:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 3581860 /var/tmp/spdk-raid.sock 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 3581860 ']' 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:13.649 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:13.649 00:16:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:13.649 [2024-07-16 00:16:00.598121] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:21:13.649 [2024-07-16 00:16:00.598267] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:13.909 [2024-07-16 00:16:00.794599] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:14.168 [2024-07-16 00:16:00.898080] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:14.168 [2024-07-16 00:16:00.959706] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:14.168 [2024-07-16 00:16:00.959738] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:14.168 00:16:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:14.168 00:16:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:21:14.168 00:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:14.427 [2024-07-16 00:16:01.246839] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:14.427 [2024-07-16 00:16:01.246880] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:14.427 [2024-07-16 00:16:01.246891] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:14.427 [2024-07-16 00:16:01.246903] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:14.427 [2024-07-16 00:16:01.246912] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:14.427 [2024-07-16 00:16:01.246923] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:14.427 
[2024-07-16 00:16:01.246938] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:14.427 [2024-07-16 00:16:01.246949] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:14.427 00:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:14.427 00:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:14.427 00:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:14.427 00:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:14.427 00:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:14.427 00:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:14.427 00:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:14.427 00:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:14.427 00:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:14.427 00:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:14.427 00:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.427 00:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:14.686 00:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:14.686 "name": "Existed_Raid", 00:21:14.686 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:14.686 "strip_size_kb": 0, 00:21:14.686 "state": 
"configuring", 00:21:14.686 "raid_level": "raid1", 00:21:14.686 "superblock": false, 00:21:14.686 "num_base_bdevs": 4, 00:21:14.686 "num_base_bdevs_discovered": 0, 00:21:14.686 "num_base_bdevs_operational": 4, 00:21:14.686 "base_bdevs_list": [ 00:21:14.686 { 00:21:14.686 "name": "BaseBdev1", 00:21:14.686 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:14.686 "is_configured": false, 00:21:14.686 "data_offset": 0, 00:21:14.686 "data_size": 0 00:21:14.686 }, 00:21:14.686 { 00:21:14.686 "name": "BaseBdev2", 00:21:14.686 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:14.686 "is_configured": false, 00:21:14.686 "data_offset": 0, 00:21:14.687 "data_size": 0 00:21:14.687 }, 00:21:14.687 { 00:21:14.687 "name": "BaseBdev3", 00:21:14.687 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:14.687 "is_configured": false, 00:21:14.687 "data_offset": 0, 00:21:14.687 "data_size": 0 00:21:14.687 }, 00:21:14.687 { 00:21:14.687 "name": "BaseBdev4", 00:21:14.687 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:14.687 "is_configured": false, 00:21:14.687 "data_offset": 0, 00:21:14.687 "data_size": 0 00:21:14.687 } 00:21:14.687 ] 00:21:14.687 }' 00:21:14.687 00:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:14.687 00:16:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:15.253 00:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:15.511 [2024-07-16 00:16:02.345593] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:15.511 [2024-07-16 00:16:02.345621] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x250caa0 name Existed_Raid, state configuring 00:21:15.511 00:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:15.770 [2024-07-16 00:16:02.594404] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:15.770 [2024-07-16 00:16:02.594435] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:15.770 [2024-07-16 00:16:02.594445] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:15.770 [2024-07-16 00:16:02.594456] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:15.770 [2024-07-16 00:16:02.594465] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:15.770 [2024-07-16 00:16:02.594476] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:15.770 [2024-07-16 00:16:02.594485] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:15.770 [2024-07-16 00:16:02.594496] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:15.770 00:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:16.028 [2024-07-16 00:16:02.852984] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:16.028 BaseBdev1 00:21:16.028 00:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:16.028 00:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:16.028 00:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:16.028 00:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:16.028 00:16:02 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:16.028 00:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:16.028 00:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:16.286 00:16:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:16.544 [ 00:21:16.544 { 00:21:16.544 "name": "BaseBdev1", 00:21:16.544 "aliases": [ 00:21:16.544 "eadfae86-f3b6-4d4c-8584-2c421ded6fbc" 00:21:16.544 ], 00:21:16.544 "product_name": "Malloc disk", 00:21:16.544 "block_size": 512, 00:21:16.544 "num_blocks": 65536, 00:21:16.544 "uuid": "eadfae86-f3b6-4d4c-8584-2c421ded6fbc", 00:21:16.544 "assigned_rate_limits": { 00:21:16.544 "rw_ios_per_sec": 0, 00:21:16.544 "rw_mbytes_per_sec": 0, 00:21:16.544 "r_mbytes_per_sec": 0, 00:21:16.544 "w_mbytes_per_sec": 0 00:21:16.544 }, 00:21:16.544 "claimed": true, 00:21:16.544 "claim_type": "exclusive_write", 00:21:16.544 "zoned": false, 00:21:16.544 "supported_io_types": { 00:21:16.544 "read": true, 00:21:16.544 "write": true, 00:21:16.544 "unmap": true, 00:21:16.544 "flush": true, 00:21:16.544 "reset": true, 00:21:16.544 "nvme_admin": false, 00:21:16.544 "nvme_io": false, 00:21:16.544 "nvme_io_md": false, 00:21:16.544 "write_zeroes": true, 00:21:16.544 "zcopy": true, 00:21:16.544 "get_zone_info": false, 00:21:16.544 "zone_management": false, 00:21:16.544 "zone_append": false, 00:21:16.544 "compare": false, 00:21:16.544 "compare_and_write": false, 00:21:16.544 "abort": true, 00:21:16.544 "seek_hole": false, 00:21:16.544 "seek_data": false, 00:21:16.544 "copy": true, 00:21:16.544 "nvme_iov_md": false 00:21:16.544 }, 00:21:16.544 "memory_domains": [ 00:21:16.544 { 
00:21:16.544 "dma_device_id": "system", 00:21:16.544 "dma_device_type": 1 00:21:16.544 }, 00:21:16.544 { 00:21:16.544 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:16.544 "dma_device_type": 2 00:21:16.544 } 00:21:16.544 ], 00:21:16.544 "driver_specific": {} 00:21:16.544 } 00:21:16.544 ] 00:21:16.544 00:16:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:16.544 00:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:16.544 00:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:16.544 00:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:16.544 00:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:16.544 00:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:16.544 00:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:16.544 00:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:16.544 00:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:16.544 00:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:16.544 00:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:16.544 00:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.544 00:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:16.802 00:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:21:16.802 "name": "Existed_Raid", 00:21:16.802 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:16.802 "strip_size_kb": 0, 00:21:16.802 "state": "configuring", 00:21:16.802 "raid_level": "raid1", 00:21:16.802 "superblock": false, 00:21:16.802 "num_base_bdevs": 4, 00:21:16.802 "num_base_bdevs_discovered": 1, 00:21:16.802 "num_base_bdevs_operational": 4, 00:21:16.802 "base_bdevs_list": [ 00:21:16.802 { 00:21:16.802 "name": "BaseBdev1", 00:21:16.802 "uuid": "eadfae86-f3b6-4d4c-8584-2c421ded6fbc", 00:21:16.802 "is_configured": true, 00:21:16.802 "data_offset": 0, 00:21:16.802 "data_size": 65536 00:21:16.802 }, 00:21:16.802 { 00:21:16.802 "name": "BaseBdev2", 00:21:16.802 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:16.802 "is_configured": false, 00:21:16.802 "data_offset": 0, 00:21:16.802 "data_size": 0 00:21:16.802 }, 00:21:16.802 { 00:21:16.802 "name": "BaseBdev3", 00:21:16.802 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:16.802 "is_configured": false, 00:21:16.802 "data_offset": 0, 00:21:16.802 "data_size": 0 00:21:16.802 }, 00:21:16.802 { 00:21:16.802 "name": "BaseBdev4", 00:21:16.802 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:16.802 "is_configured": false, 00:21:16.802 "data_offset": 0, 00:21:16.802 "data_size": 0 00:21:16.802 } 00:21:16.802 ] 00:21:16.802 }' 00:21:16.802 00:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:16.802 00:16:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:17.368 00:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:17.626 [2024-07-16 00:16:04.429156] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:17.626 [2024-07-16 00:16:04.429193] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x250c310 name Existed_Raid, state configuring 
00:21:17.626 00:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:18.192 [2024-07-16 00:16:04.938522] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:18.192 [2024-07-16 00:16:04.940014] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:18.192 [2024-07-16 00:16:04.940046] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:18.192 [2024-07-16 00:16:04.940057] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:18.192 [2024-07-16 00:16:04.940069] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:18.192 [2024-07-16 00:16:04.940079] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:18.192 [2024-07-16 00:16:04.940091] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:18.192 00:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:18.192 00:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:18.192 00:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:18.192 00:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:18.192 00:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:18.192 00:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:18.192 00:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:21:18.192 00:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:18.192 00:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:18.193 00:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:18.193 00:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:18.193 00:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:18.193 00:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.193 00:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:18.451 00:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:18.451 "name": "Existed_Raid", 00:21:18.451 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:18.451 "strip_size_kb": 0, 00:21:18.451 "state": "configuring", 00:21:18.451 "raid_level": "raid1", 00:21:18.451 "superblock": false, 00:21:18.451 "num_base_bdevs": 4, 00:21:18.451 "num_base_bdevs_discovered": 1, 00:21:18.451 "num_base_bdevs_operational": 4, 00:21:18.451 "base_bdevs_list": [ 00:21:18.451 { 00:21:18.451 "name": "BaseBdev1", 00:21:18.451 "uuid": "eadfae86-f3b6-4d4c-8584-2c421ded6fbc", 00:21:18.451 "is_configured": true, 00:21:18.451 "data_offset": 0, 00:21:18.451 "data_size": 65536 00:21:18.451 }, 00:21:18.451 { 00:21:18.451 "name": "BaseBdev2", 00:21:18.451 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:18.451 "is_configured": false, 00:21:18.451 "data_offset": 0, 00:21:18.451 "data_size": 0 00:21:18.451 }, 00:21:18.451 { 00:21:18.451 "name": "BaseBdev3", 00:21:18.451 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:18.451 "is_configured": false, 00:21:18.451 
"data_offset": 0, 00:21:18.451 "data_size": 0 00:21:18.451 }, 00:21:18.451 { 00:21:18.451 "name": "BaseBdev4", 00:21:18.451 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:18.451 "is_configured": false, 00:21:18.451 "data_offset": 0, 00:21:18.451 "data_size": 0 00:21:18.451 } 00:21:18.451 ] 00:21:18.451 }' 00:21:18.451 00:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:18.451 00:16:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:19.017 00:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:19.276 [2024-07-16 00:16:06.049115] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:19.276 BaseBdev2 00:21:19.276 00:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:19.276 00:16:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:19.276 00:16:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:19.276 00:16:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:19.276 00:16:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:19.276 00:16:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:19.276 00:16:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:19.534 00:16:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:20.101 [ 
00:21:20.101 { 00:21:20.101 "name": "BaseBdev2", 00:21:20.101 "aliases": [ 00:21:20.101 "bfa0f8c2-b4a2-4043-a454-890b22c3ad0a" 00:21:20.101 ], 00:21:20.101 "product_name": "Malloc disk", 00:21:20.101 "block_size": 512, 00:21:20.101 "num_blocks": 65536, 00:21:20.101 "uuid": "bfa0f8c2-b4a2-4043-a454-890b22c3ad0a", 00:21:20.101 "assigned_rate_limits": { 00:21:20.101 "rw_ios_per_sec": 0, 00:21:20.101 "rw_mbytes_per_sec": 0, 00:21:20.101 "r_mbytes_per_sec": 0, 00:21:20.101 "w_mbytes_per_sec": 0 00:21:20.101 }, 00:21:20.101 "claimed": true, 00:21:20.101 "claim_type": "exclusive_write", 00:21:20.101 "zoned": false, 00:21:20.101 "supported_io_types": { 00:21:20.101 "read": true, 00:21:20.101 "write": true, 00:21:20.101 "unmap": true, 00:21:20.101 "flush": true, 00:21:20.101 "reset": true, 00:21:20.101 "nvme_admin": false, 00:21:20.101 "nvme_io": false, 00:21:20.101 "nvme_io_md": false, 00:21:20.101 "write_zeroes": true, 00:21:20.101 "zcopy": true, 00:21:20.101 "get_zone_info": false, 00:21:20.101 "zone_management": false, 00:21:20.101 "zone_append": false, 00:21:20.101 "compare": false, 00:21:20.101 "compare_and_write": false, 00:21:20.101 "abort": true, 00:21:20.101 "seek_hole": false, 00:21:20.101 "seek_data": false, 00:21:20.101 "copy": true, 00:21:20.101 "nvme_iov_md": false 00:21:20.101 }, 00:21:20.101 "memory_domains": [ 00:21:20.101 { 00:21:20.101 "dma_device_id": "system", 00:21:20.101 "dma_device_type": 1 00:21:20.101 }, 00:21:20.101 { 00:21:20.101 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:20.101 "dma_device_type": 2 00:21:20.101 } 00:21:20.101 ], 00:21:20.101 "driver_specific": {} 00:21:20.101 } 00:21:20.101 ] 00:21:20.101 00:16:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:20.101 00:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:20.101 00:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:20.101 00:16:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:20.101 00:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:20.101 00:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:20.101 00:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:20.101 00:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:20.101 00:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:20.101 00:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:20.101 00:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:20.101 00:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:20.101 00:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:20.101 00:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.101 00:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:20.359 00:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:20.359 "name": "Existed_Raid", 00:21:20.359 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:20.359 "strip_size_kb": 0, 00:21:20.359 "state": "configuring", 00:21:20.359 "raid_level": "raid1", 00:21:20.359 "superblock": false, 00:21:20.359 "num_base_bdevs": 4, 00:21:20.359 "num_base_bdevs_discovered": 2, 00:21:20.359 "num_base_bdevs_operational": 4, 00:21:20.359 "base_bdevs_list": [ 00:21:20.359 { 00:21:20.359 
"name": "BaseBdev1", 00:21:20.359 "uuid": "eadfae86-f3b6-4d4c-8584-2c421ded6fbc", 00:21:20.359 "is_configured": true, 00:21:20.359 "data_offset": 0, 00:21:20.359 "data_size": 65536 00:21:20.359 }, 00:21:20.359 { 00:21:20.359 "name": "BaseBdev2", 00:21:20.359 "uuid": "bfa0f8c2-b4a2-4043-a454-890b22c3ad0a", 00:21:20.359 "is_configured": true, 00:21:20.359 "data_offset": 0, 00:21:20.359 "data_size": 65536 00:21:20.359 }, 00:21:20.359 { 00:21:20.359 "name": "BaseBdev3", 00:21:20.359 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:20.359 "is_configured": false, 00:21:20.359 "data_offset": 0, 00:21:20.359 "data_size": 0 00:21:20.359 }, 00:21:20.359 { 00:21:20.359 "name": "BaseBdev4", 00:21:20.359 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:20.359 "is_configured": false, 00:21:20.359 "data_offset": 0, 00:21:20.359 "data_size": 0 00:21:20.359 } 00:21:20.359 ] 00:21:20.359 }' 00:21:20.359 00:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:20.359 00:16:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:20.925 00:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:21.183 [2024-07-16 00:16:07.913459] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:21.183 BaseBdev3 00:21:21.183 00:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:21:21.183 00:16:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:21.183 00:16:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:21.183 00:16:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:21.183 00:16:07 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:21.183 00:16:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:21.183 00:16:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:21.183 00:16:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:21.441 [ 00:21:21.441 { 00:21:21.441 "name": "BaseBdev3", 00:21:21.441 "aliases": [ 00:21:21.441 "c691951e-01b9-4b10-acc8-b339cfb0688a" 00:21:21.441 ], 00:21:21.441 "product_name": "Malloc disk", 00:21:21.441 "block_size": 512, 00:21:21.441 "num_blocks": 65536, 00:21:21.441 "uuid": "c691951e-01b9-4b10-acc8-b339cfb0688a", 00:21:21.441 "assigned_rate_limits": { 00:21:21.441 "rw_ios_per_sec": 0, 00:21:21.441 "rw_mbytes_per_sec": 0, 00:21:21.441 "r_mbytes_per_sec": 0, 00:21:21.441 "w_mbytes_per_sec": 0 00:21:21.441 }, 00:21:21.441 "claimed": true, 00:21:21.441 "claim_type": "exclusive_write", 00:21:21.441 "zoned": false, 00:21:21.441 "supported_io_types": { 00:21:21.441 "read": true, 00:21:21.441 "write": true, 00:21:21.441 "unmap": true, 00:21:21.441 "flush": true, 00:21:21.441 "reset": true, 00:21:21.441 "nvme_admin": false, 00:21:21.441 "nvme_io": false, 00:21:21.441 "nvme_io_md": false, 00:21:21.441 "write_zeroes": true, 00:21:21.441 "zcopy": true, 00:21:21.441 "get_zone_info": false, 00:21:21.441 "zone_management": false, 00:21:21.441 "zone_append": false, 00:21:21.441 "compare": false, 00:21:21.441 "compare_and_write": false, 00:21:21.441 "abort": true, 00:21:21.441 "seek_hole": false, 00:21:21.441 "seek_data": false, 00:21:21.441 "copy": true, 00:21:21.441 "nvme_iov_md": false 00:21:21.441 }, 00:21:21.441 "memory_domains": [ 00:21:21.441 { 00:21:21.441 "dma_device_id": "system", 
00:21:21.441 "dma_device_type": 1 00:21:21.441 }, 00:21:21.441 { 00:21:21.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.441 "dma_device_type": 2 00:21:21.441 } 00:21:21.441 ], 00:21:21.441 "driver_specific": {} 00:21:21.441 } 00:21:21.441 ] 00:21:21.441 00:16:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:21.441 00:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:21.441 00:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:21.441 00:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:21.441 00:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:21.441 00:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:21.441 00:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:21.441 00:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:21.441 00:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:21.441 00:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:21.441 00:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:21.441 00:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:21.441 00:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:21.441 00:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.441 00:16:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:21.700 00:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:21.700 "name": "Existed_Raid", 00:21:21.700 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:21.700 "strip_size_kb": 0, 00:21:21.700 "state": "configuring", 00:21:21.700 "raid_level": "raid1", 00:21:21.700 "superblock": false, 00:21:21.700 "num_base_bdevs": 4, 00:21:21.700 "num_base_bdevs_discovered": 3, 00:21:21.700 "num_base_bdevs_operational": 4, 00:21:21.700 "base_bdevs_list": [ 00:21:21.700 { 00:21:21.700 "name": "BaseBdev1", 00:21:21.700 "uuid": "eadfae86-f3b6-4d4c-8584-2c421ded6fbc", 00:21:21.700 "is_configured": true, 00:21:21.700 "data_offset": 0, 00:21:21.700 "data_size": 65536 00:21:21.700 }, 00:21:21.700 { 00:21:21.700 "name": "BaseBdev2", 00:21:21.700 "uuid": "bfa0f8c2-b4a2-4043-a454-890b22c3ad0a", 00:21:21.700 "is_configured": true, 00:21:21.700 "data_offset": 0, 00:21:21.700 "data_size": 65536 00:21:21.700 }, 00:21:21.700 { 00:21:21.700 "name": "BaseBdev3", 00:21:21.700 "uuid": "c691951e-01b9-4b10-acc8-b339cfb0688a", 00:21:21.700 "is_configured": true, 00:21:21.700 "data_offset": 0, 00:21:21.700 "data_size": 65536 00:21:21.700 }, 00:21:21.700 { 00:21:21.700 "name": "BaseBdev4", 00:21:21.700 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:21.700 "is_configured": false, 00:21:21.700 "data_offset": 0, 00:21:21.700 "data_size": 0 00:21:21.700 } 00:21:21.700 ] 00:21:21.700 }' 00:21:21.700 00:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:21.700 00:16:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:22.266 00:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:22.525 [2024-07-16 00:16:09.308571] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:22.525 [2024-07-16 00:16:09.308606] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x250d350 00:21:22.525 [2024-07-16 00:16:09.308614] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:22.525 [2024-07-16 00:16:09.308869] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x250d020 00:21:22.525 [2024-07-16 00:16:09.309014] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x250d350 00:21:22.525 [2024-07-16 00:16:09.309024] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x250d350 00:21:22.525 [2024-07-16 00:16:09.309183] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:22.525 BaseBdev4 00:21:22.525 00:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:21:22.525 00:16:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:22.525 00:16:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:22.525 00:16:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:22.525 00:16:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:22.525 00:16:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:22.525 00:16:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:22.783 00:16:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:23.042 [ 00:21:23.042 { 
00:21:23.042 "name": "BaseBdev4", 00:21:23.042 "aliases": [ 00:21:23.042 "6c8acf02-4a56-47d4-bfaf-65ee5ab8cdbe" 00:21:23.042 ], 00:21:23.042 "product_name": "Malloc disk", 00:21:23.042 "block_size": 512, 00:21:23.042 "num_blocks": 65536, 00:21:23.042 "uuid": "6c8acf02-4a56-47d4-bfaf-65ee5ab8cdbe", 00:21:23.042 "assigned_rate_limits": { 00:21:23.042 "rw_ios_per_sec": 0, 00:21:23.042 "rw_mbytes_per_sec": 0, 00:21:23.042 "r_mbytes_per_sec": 0, 00:21:23.042 "w_mbytes_per_sec": 0 00:21:23.042 }, 00:21:23.042 "claimed": true, 00:21:23.042 "claim_type": "exclusive_write", 00:21:23.042 "zoned": false, 00:21:23.042 "supported_io_types": { 00:21:23.042 "read": true, 00:21:23.042 "write": true, 00:21:23.042 "unmap": true, 00:21:23.042 "flush": true, 00:21:23.042 "reset": true, 00:21:23.042 "nvme_admin": false, 00:21:23.042 "nvme_io": false, 00:21:23.042 "nvme_io_md": false, 00:21:23.042 "write_zeroes": true, 00:21:23.042 "zcopy": true, 00:21:23.042 "get_zone_info": false, 00:21:23.042 "zone_management": false, 00:21:23.042 "zone_append": false, 00:21:23.042 "compare": false, 00:21:23.042 "compare_and_write": false, 00:21:23.042 "abort": true, 00:21:23.042 "seek_hole": false, 00:21:23.042 "seek_data": false, 00:21:23.042 "copy": true, 00:21:23.042 "nvme_iov_md": false 00:21:23.042 }, 00:21:23.042 "memory_domains": [ 00:21:23.042 { 00:21:23.042 "dma_device_id": "system", 00:21:23.042 "dma_device_type": 1 00:21:23.042 }, 00:21:23.042 { 00:21:23.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:23.042 "dma_device_type": 2 00:21:23.042 } 00:21:23.042 ], 00:21:23.042 "driver_specific": {} 00:21:23.042 } 00:21:23.042 ] 00:21:23.042 00:16:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:23.042 00:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:23.042 00:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:23.042 00:16:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:23.042 00:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:23.042 00:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:23.042 00:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:23.042 00:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:23.042 00:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:23.042 00:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:23.042 00:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:23.042 00:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:23.042 00:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:23.042 00:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:23.042 00:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:23.300 00:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:23.300 "name": "Existed_Raid", 00:21:23.300 "uuid": "e3a712b5-1dfb-4963-a920-0ab921a8557d", 00:21:23.300 "strip_size_kb": 0, 00:21:23.300 "state": "online", 00:21:23.301 "raid_level": "raid1", 00:21:23.301 "superblock": false, 00:21:23.301 "num_base_bdevs": 4, 00:21:23.301 "num_base_bdevs_discovered": 4, 00:21:23.301 "num_base_bdevs_operational": 4, 00:21:23.301 "base_bdevs_list": [ 00:21:23.301 { 00:21:23.301 "name": 
"BaseBdev1", 00:21:23.301 "uuid": "eadfae86-f3b6-4d4c-8584-2c421ded6fbc", 00:21:23.301 "is_configured": true, 00:21:23.301 "data_offset": 0, 00:21:23.301 "data_size": 65536 00:21:23.301 }, 00:21:23.301 { 00:21:23.301 "name": "BaseBdev2", 00:21:23.301 "uuid": "bfa0f8c2-b4a2-4043-a454-890b22c3ad0a", 00:21:23.301 "is_configured": true, 00:21:23.301 "data_offset": 0, 00:21:23.301 "data_size": 65536 00:21:23.301 }, 00:21:23.301 { 00:21:23.301 "name": "BaseBdev3", 00:21:23.301 "uuid": "c691951e-01b9-4b10-acc8-b339cfb0688a", 00:21:23.301 "is_configured": true, 00:21:23.301 "data_offset": 0, 00:21:23.301 "data_size": 65536 00:21:23.301 }, 00:21:23.301 { 00:21:23.301 "name": "BaseBdev4", 00:21:23.301 "uuid": "6c8acf02-4a56-47d4-bfaf-65ee5ab8cdbe", 00:21:23.301 "is_configured": true, 00:21:23.301 "data_offset": 0, 00:21:23.301 "data_size": 65536 00:21:23.301 } 00:21:23.301 ] 00:21:23.301 }' 00:21:23.301 00:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:23.301 00:16:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:23.869 00:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:23.869 00:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:23.869 00:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:23.869 00:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:23.869 00:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:23.869 00:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:23.869 00:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:23.869 00:16:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:23.869 [2024-07-16 00:16:10.776795] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:23.869 00:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:23.869 "name": "Existed_Raid", 00:21:23.869 "aliases": [ 00:21:23.869 "e3a712b5-1dfb-4963-a920-0ab921a8557d" 00:21:23.869 ], 00:21:23.869 "product_name": "Raid Volume", 00:21:23.869 "block_size": 512, 00:21:23.869 "num_blocks": 65536, 00:21:23.869 "uuid": "e3a712b5-1dfb-4963-a920-0ab921a8557d", 00:21:23.869 "assigned_rate_limits": { 00:21:23.869 "rw_ios_per_sec": 0, 00:21:23.869 "rw_mbytes_per_sec": 0, 00:21:23.869 "r_mbytes_per_sec": 0, 00:21:23.869 "w_mbytes_per_sec": 0 00:21:23.869 }, 00:21:23.869 "claimed": false, 00:21:23.869 "zoned": false, 00:21:23.869 "supported_io_types": { 00:21:23.869 "read": true, 00:21:23.869 "write": true, 00:21:23.869 "unmap": false, 00:21:23.869 "flush": false, 00:21:23.869 "reset": true, 00:21:23.869 "nvme_admin": false, 00:21:23.869 "nvme_io": false, 00:21:23.869 "nvme_io_md": false, 00:21:23.869 "write_zeroes": true, 00:21:23.869 "zcopy": false, 00:21:23.869 "get_zone_info": false, 00:21:23.869 "zone_management": false, 00:21:23.869 "zone_append": false, 00:21:23.869 "compare": false, 00:21:23.869 "compare_and_write": false, 00:21:23.869 "abort": false, 00:21:23.869 "seek_hole": false, 00:21:23.869 "seek_data": false, 00:21:23.869 "copy": false, 00:21:23.869 "nvme_iov_md": false 00:21:23.869 }, 00:21:23.869 "memory_domains": [ 00:21:23.869 { 00:21:23.869 "dma_device_id": "system", 00:21:23.869 "dma_device_type": 1 00:21:23.869 }, 00:21:23.869 { 00:21:23.869 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:23.869 "dma_device_type": 2 00:21:23.869 }, 00:21:23.869 { 00:21:23.869 "dma_device_id": "system", 00:21:23.869 "dma_device_type": 1 00:21:23.869 }, 00:21:23.869 { 00:21:23.869 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:21:23.869 "dma_device_type": 2 00:21:23.869 }, 00:21:23.869 { 00:21:23.869 "dma_device_id": "system", 00:21:23.869 "dma_device_type": 1 00:21:23.869 }, 00:21:23.869 { 00:21:23.869 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:23.869 "dma_device_type": 2 00:21:23.869 }, 00:21:23.869 { 00:21:23.869 "dma_device_id": "system", 00:21:23.869 "dma_device_type": 1 00:21:23.869 }, 00:21:23.869 { 00:21:23.869 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:23.869 "dma_device_type": 2 00:21:23.869 } 00:21:23.869 ], 00:21:23.869 "driver_specific": { 00:21:23.869 "raid": { 00:21:23.869 "uuid": "e3a712b5-1dfb-4963-a920-0ab921a8557d", 00:21:23.869 "strip_size_kb": 0, 00:21:23.869 "state": "online", 00:21:23.869 "raid_level": "raid1", 00:21:23.869 "superblock": false, 00:21:23.869 "num_base_bdevs": 4, 00:21:23.869 "num_base_bdevs_discovered": 4, 00:21:23.869 "num_base_bdevs_operational": 4, 00:21:23.869 "base_bdevs_list": [ 00:21:23.869 { 00:21:23.869 "name": "BaseBdev1", 00:21:23.869 "uuid": "eadfae86-f3b6-4d4c-8584-2c421ded6fbc", 00:21:23.869 "is_configured": true, 00:21:23.869 "data_offset": 0, 00:21:23.869 "data_size": 65536 00:21:23.869 }, 00:21:23.869 { 00:21:23.869 "name": "BaseBdev2", 00:21:23.869 "uuid": "bfa0f8c2-b4a2-4043-a454-890b22c3ad0a", 00:21:23.869 "is_configured": true, 00:21:23.869 "data_offset": 0, 00:21:23.869 "data_size": 65536 00:21:23.869 }, 00:21:23.869 { 00:21:23.869 "name": "BaseBdev3", 00:21:23.869 "uuid": "c691951e-01b9-4b10-acc8-b339cfb0688a", 00:21:23.869 "is_configured": true, 00:21:23.869 "data_offset": 0, 00:21:23.869 "data_size": 65536 00:21:23.869 }, 00:21:23.869 { 00:21:23.869 "name": "BaseBdev4", 00:21:23.869 "uuid": "6c8acf02-4a56-47d4-bfaf-65ee5ab8cdbe", 00:21:23.869 "is_configured": true, 00:21:23.869 "data_offset": 0, 00:21:23.869 "data_size": 65536 00:21:23.869 } 00:21:23.869 ] 00:21:23.869 } 00:21:23.869 } 00:21:23.869 }' 00:21:23.869 00:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:24.128 00:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:24.128 BaseBdev2 00:21:24.128 BaseBdev3 00:21:24.128 BaseBdev4' 00:21:24.128 00:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:24.128 00:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:24.128 00:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:24.128 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:24.128 "name": "BaseBdev1", 00:21:24.128 "aliases": [ 00:21:24.128 "eadfae86-f3b6-4d4c-8584-2c421ded6fbc" 00:21:24.128 ], 00:21:24.128 "product_name": "Malloc disk", 00:21:24.128 "block_size": 512, 00:21:24.128 "num_blocks": 65536, 00:21:24.128 "uuid": "eadfae86-f3b6-4d4c-8584-2c421ded6fbc", 00:21:24.128 "assigned_rate_limits": { 00:21:24.128 "rw_ios_per_sec": 0, 00:21:24.128 "rw_mbytes_per_sec": 0, 00:21:24.128 "r_mbytes_per_sec": 0, 00:21:24.128 "w_mbytes_per_sec": 0 00:21:24.128 }, 00:21:24.128 "claimed": true, 00:21:24.128 "claim_type": "exclusive_write", 00:21:24.128 "zoned": false, 00:21:24.128 "supported_io_types": { 00:21:24.128 "read": true, 00:21:24.128 "write": true, 00:21:24.128 "unmap": true, 00:21:24.128 "flush": true, 00:21:24.128 "reset": true, 00:21:24.128 "nvme_admin": false, 00:21:24.128 "nvme_io": false, 00:21:24.128 "nvme_io_md": false, 00:21:24.128 "write_zeroes": true, 00:21:24.128 "zcopy": true, 00:21:24.128 "get_zone_info": false, 00:21:24.128 "zone_management": false, 00:21:24.128 "zone_append": false, 00:21:24.128 "compare": false, 00:21:24.128 "compare_and_write": false, 00:21:24.128 "abort": true, 00:21:24.128 "seek_hole": false, 00:21:24.128 "seek_data": 
false, 00:21:24.128 "copy": true, 00:21:24.128 "nvme_iov_md": false 00:21:24.128 }, 00:21:24.128 "memory_domains": [ 00:21:24.128 { 00:21:24.128 "dma_device_id": "system", 00:21:24.128 "dma_device_type": 1 00:21:24.128 }, 00:21:24.128 { 00:21:24.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:24.128 "dma_device_type": 2 00:21:24.128 } 00:21:24.128 ], 00:21:24.128 "driver_specific": {} 00:21:24.128 }' 00:21:24.128 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:24.128 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:24.388 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:24.388 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:24.388 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:24.388 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:24.388 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:24.388 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:24.388 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:24.388 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:24.388 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:24.388 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:24.388 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:24.388 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:24.388 00:16:11 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:24.648 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:24.648 "name": "BaseBdev2", 00:21:24.648 "aliases": [ 00:21:24.648 "bfa0f8c2-b4a2-4043-a454-890b22c3ad0a" 00:21:24.648 ], 00:21:24.648 "product_name": "Malloc disk", 00:21:24.648 "block_size": 512, 00:21:24.648 "num_blocks": 65536, 00:21:24.648 "uuid": "bfa0f8c2-b4a2-4043-a454-890b22c3ad0a", 00:21:24.648 "assigned_rate_limits": { 00:21:24.648 "rw_ios_per_sec": 0, 00:21:24.648 "rw_mbytes_per_sec": 0, 00:21:24.648 "r_mbytes_per_sec": 0, 00:21:24.648 "w_mbytes_per_sec": 0 00:21:24.648 }, 00:21:24.648 "claimed": true, 00:21:24.648 "claim_type": "exclusive_write", 00:21:24.648 "zoned": false, 00:21:24.648 "supported_io_types": { 00:21:24.648 "read": true, 00:21:24.648 "write": true, 00:21:24.648 "unmap": true, 00:21:24.648 "flush": true, 00:21:24.648 "reset": true, 00:21:24.648 "nvme_admin": false, 00:21:24.648 "nvme_io": false, 00:21:24.648 "nvme_io_md": false, 00:21:24.648 "write_zeroes": true, 00:21:24.648 "zcopy": true, 00:21:24.648 "get_zone_info": false, 00:21:24.648 "zone_management": false, 00:21:24.648 "zone_append": false, 00:21:24.648 "compare": false, 00:21:24.648 "compare_and_write": false, 00:21:24.648 "abort": true, 00:21:24.648 "seek_hole": false, 00:21:24.648 "seek_data": false, 00:21:24.648 "copy": true, 00:21:24.648 "nvme_iov_md": false 00:21:24.648 }, 00:21:24.648 "memory_domains": [ 00:21:24.648 { 00:21:24.648 "dma_device_id": "system", 00:21:24.648 "dma_device_type": 1 00:21:24.648 }, 00:21:24.648 { 00:21:24.648 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:24.648 "dma_device_type": 2 00:21:24.648 } 00:21:24.648 ], 00:21:24.648 "driver_specific": {} 00:21:24.648 }' 00:21:24.648 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:24.648 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:21:24.648 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:24.648 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:24.906 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:24.906 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:24.906 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:24.906 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:24.906 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:24.906 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:24.906 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:25.165 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:25.165 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:25.165 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:25.165 00:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:25.732 00:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:25.732 "name": "BaseBdev3", 00:21:25.732 "aliases": [ 00:21:25.732 "c691951e-01b9-4b10-acc8-b339cfb0688a" 00:21:25.732 ], 00:21:25.732 "product_name": "Malloc disk", 00:21:25.732 "block_size": 512, 00:21:25.732 "num_blocks": 65536, 00:21:25.732 "uuid": "c691951e-01b9-4b10-acc8-b339cfb0688a", 00:21:25.732 "assigned_rate_limits": { 00:21:25.732 "rw_ios_per_sec": 0, 00:21:25.732 "rw_mbytes_per_sec": 0, 00:21:25.732 "r_mbytes_per_sec": 0, 
00:21:25.732 "w_mbytes_per_sec": 0 00:21:25.732 }, 00:21:25.732 "claimed": true, 00:21:25.732 "claim_type": "exclusive_write", 00:21:25.732 "zoned": false, 00:21:25.732 "supported_io_types": { 00:21:25.732 "read": true, 00:21:25.732 "write": true, 00:21:25.732 "unmap": true, 00:21:25.732 "flush": true, 00:21:25.732 "reset": true, 00:21:25.732 "nvme_admin": false, 00:21:25.732 "nvme_io": false, 00:21:25.732 "nvme_io_md": false, 00:21:25.732 "write_zeroes": true, 00:21:25.732 "zcopy": true, 00:21:25.732 "get_zone_info": false, 00:21:25.732 "zone_management": false, 00:21:25.732 "zone_append": false, 00:21:25.732 "compare": false, 00:21:25.732 "compare_and_write": false, 00:21:25.732 "abort": true, 00:21:25.732 "seek_hole": false, 00:21:25.732 "seek_data": false, 00:21:25.732 "copy": true, 00:21:25.732 "nvme_iov_md": false 00:21:25.732 }, 00:21:25.732 "memory_domains": [ 00:21:25.732 { 00:21:25.732 "dma_device_id": "system", 00:21:25.732 "dma_device_type": 1 00:21:25.732 }, 00:21:25.732 { 00:21:25.732 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:25.732 "dma_device_type": 2 00:21:25.732 } 00:21:25.732 ], 00:21:25.732 "driver_specific": {} 00:21:25.732 }' 00:21:25.732 00:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:25.732 00:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:25.732 00:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:25.732 00:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:25.732 00:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:25.732 00:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:25.732 00:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:25.732 00:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:21:26.027 00:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:26.027 00:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:26.027 00:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:26.027 00:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:26.027 00:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:26.027 00:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:26.027 00:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:26.594 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:26.595 "name": "BaseBdev4", 00:21:26.595 "aliases": [ 00:21:26.595 "6c8acf02-4a56-47d4-bfaf-65ee5ab8cdbe" 00:21:26.595 ], 00:21:26.595 "product_name": "Malloc disk", 00:21:26.595 "block_size": 512, 00:21:26.595 "num_blocks": 65536, 00:21:26.595 "uuid": "6c8acf02-4a56-47d4-bfaf-65ee5ab8cdbe", 00:21:26.595 "assigned_rate_limits": { 00:21:26.595 "rw_ios_per_sec": 0, 00:21:26.595 "rw_mbytes_per_sec": 0, 00:21:26.595 "r_mbytes_per_sec": 0, 00:21:26.595 "w_mbytes_per_sec": 0 00:21:26.595 }, 00:21:26.595 "claimed": true, 00:21:26.595 "claim_type": "exclusive_write", 00:21:26.595 "zoned": false, 00:21:26.595 "supported_io_types": { 00:21:26.595 "read": true, 00:21:26.595 "write": true, 00:21:26.595 "unmap": true, 00:21:26.595 "flush": true, 00:21:26.595 "reset": true, 00:21:26.595 "nvme_admin": false, 00:21:26.595 "nvme_io": false, 00:21:26.595 "nvme_io_md": false, 00:21:26.595 "write_zeroes": true, 00:21:26.595 "zcopy": true, 00:21:26.595 "get_zone_info": false, 00:21:26.595 "zone_management": false, 00:21:26.595 "zone_append": false, 00:21:26.595 
"compare": false, 00:21:26.595 "compare_and_write": false, 00:21:26.595 "abort": true, 00:21:26.595 "seek_hole": false, 00:21:26.595 "seek_data": false, 00:21:26.595 "copy": true, 00:21:26.595 "nvme_iov_md": false 00:21:26.595 }, 00:21:26.595 "memory_domains": [ 00:21:26.595 { 00:21:26.595 "dma_device_id": "system", 00:21:26.595 "dma_device_type": 1 00:21:26.595 }, 00:21:26.595 { 00:21:26.595 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.595 "dma_device_type": 2 00:21:26.595 } 00:21:26.595 ], 00:21:26.595 "driver_specific": {} 00:21:26.595 }' 00:21:26.595 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:26.595 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:26.595 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:26.595 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:26.595 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:26.595 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:26.595 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:26.853 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:26.853 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:26.853 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:26.853 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:26.853 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:26.853 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 
00:21:27.131 [2024-07-16 00:16:13.932873] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:27.131 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:27.131 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:27.131 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:27.131 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:21:27.131 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:27.131 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:21:27.131 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:27.131 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:27.131 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:27.131 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:27.131 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:27.131 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:27.131 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:27.131 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:27.131 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:27.131 00:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:27.131 00:16:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:27.389 00:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:27.389 "name": "Existed_Raid", 00:21:27.389 "uuid": "e3a712b5-1dfb-4963-a920-0ab921a8557d", 00:21:27.389 "strip_size_kb": 0, 00:21:27.389 "state": "online", 00:21:27.389 "raid_level": "raid1", 00:21:27.389 "superblock": false, 00:21:27.389 "num_base_bdevs": 4, 00:21:27.389 "num_base_bdevs_discovered": 3, 00:21:27.389 "num_base_bdevs_operational": 3, 00:21:27.389 "base_bdevs_list": [ 00:21:27.389 { 00:21:27.389 "name": null, 00:21:27.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.389 "is_configured": false, 00:21:27.389 "data_offset": 0, 00:21:27.389 "data_size": 65536 00:21:27.389 }, 00:21:27.389 { 00:21:27.389 "name": "BaseBdev2", 00:21:27.389 "uuid": "bfa0f8c2-b4a2-4043-a454-890b22c3ad0a", 00:21:27.389 "is_configured": true, 00:21:27.389 "data_offset": 0, 00:21:27.389 "data_size": 65536 00:21:27.389 }, 00:21:27.389 { 00:21:27.389 "name": "BaseBdev3", 00:21:27.389 "uuid": "c691951e-01b9-4b10-acc8-b339cfb0688a", 00:21:27.389 "is_configured": true, 00:21:27.389 "data_offset": 0, 00:21:27.389 "data_size": 65536 00:21:27.389 }, 00:21:27.389 { 00:21:27.389 "name": "BaseBdev4", 00:21:27.389 "uuid": "6c8acf02-4a56-47d4-bfaf-65ee5ab8cdbe", 00:21:27.389 "is_configured": true, 00:21:27.389 "data_offset": 0, 00:21:27.389 "data_size": 65536 00:21:27.389 } 00:21:27.389 ] 00:21:27.389 }' 00:21:27.389 00:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:27.389 00:16:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:27.955 00:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:27.955 00:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:27.955 00:16:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:27.955 00:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:28.212 00:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:28.212 00:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:28.212 00:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:28.468 [2024-07-16 00:16:15.285438] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:28.468 00:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:28.468 00:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:28.468 00:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:28.468 00:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.726 00:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:28.726 00:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:28.726 00:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:28.984 [2024-07-16 00:16:15.805414] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:28.984 00:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
00:21:28.984 00:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:28.984 00:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.984 00:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:29.242 00:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:29.242 00:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:29.242 00:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:29.500 [2024-07-16 00:16:16.310757] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:29.500 [2024-07-16 00:16:16.310829] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:29.500 [2024-07-16 00:16:16.323500] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:29.500 [2024-07-16 00:16:16.323536] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:29.500 [2024-07-16 00:16:16.323546] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x250d350 name Existed_Raid, state offline 00:21:29.500 00:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:29.500 00:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:29.500 00:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.500 00:16:16 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:29.757 00:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:29.757 00:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:29.757 00:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:29.757 00:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:29.757 00:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:29.757 00:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:30.015 BaseBdev2 00:21:30.015 00:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:30.015 00:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:30.015 00:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:30.015 00:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:30.015 00:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:30.015 00:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:30.015 00:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:30.272 00:16:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:30.529 [ 00:21:30.529 { 00:21:30.529 "name": "BaseBdev2", 00:21:30.529 "aliases": [ 
00:21:30.529 "f635ea19-768b-45cf-a9af-92076240f348" 00:21:30.529 ], 00:21:30.529 "product_name": "Malloc disk", 00:21:30.529 "block_size": 512, 00:21:30.529 "num_blocks": 65536, 00:21:30.529 "uuid": "f635ea19-768b-45cf-a9af-92076240f348", 00:21:30.529 "assigned_rate_limits": { 00:21:30.529 "rw_ios_per_sec": 0, 00:21:30.529 "rw_mbytes_per_sec": 0, 00:21:30.530 "r_mbytes_per_sec": 0, 00:21:30.530 "w_mbytes_per_sec": 0 00:21:30.530 }, 00:21:30.530 "claimed": false, 00:21:30.530 "zoned": false, 00:21:30.530 "supported_io_types": { 00:21:30.530 "read": true, 00:21:30.530 "write": true, 00:21:30.530 "unmap": true, 00:21:30.530 "flush": true, 00:21:30.530 "reset": true, 00:21:30.530 "nvme_admin": false, 00:21:30.530 "nvme_io": false, 00:21:30.530 "nvme_io_md": false, 00:21:30.530 "write_zeroes": true, 00:21:30.530 "zcopy": true, 00:21:30.530 "get_zone_info": false, 00:21:30.530 "zone_management": false, 00:21:30.530 "zone_append": false, 00:21:30.530 "compare": false, 00:21:30.530 "compare_and_write": false, 00:21:30.530 "abort": true, 00:21:30.530 "seek_hole": false, 00:21:30.530 "seek_data": false, 00:21:30.530 "copy": true, 00:21:30.530 "nvme_iov_md": false 00:21:30.530 }, 00:21:30.530 "memory_domains": [ 00:21:30.530 { 00:21:30.530 "dma_device_id": "system", 00:21:30.530 "dma_device_type": 1 00:21:30.530 }, 00:21:30.530 { 00:21:30.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:30.530 "dma_device_type": 2 00:21:30.530 } 00:21:30.530 ], 00:21:30.530 "driver_specific": {} 00:21:30.530 } 00:21:30.530 ] 00:21:30.530 00:16:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:30.530 00:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:30.530 00:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:30.530 00:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:30.789 BaseBdev3 00:21:30.789 00:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:30.789 00:16:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:30.789 00:16:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:30.789 00:16:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:30.789 00:16:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:30.789 00:16:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:30.789 00:16:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:31.047 00:16:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:31.307 [ 00:21:31.307 { 00:21:31.307 "name": "BaseBdev3", 00:21:31.307 "aliases": [ 00:21:31.307 "6b0f7747-9934-4d50-b380-f6cc950a2995" 00:21:31.307 ], 00:21:31.307 "product_name": "Malloc disk", 00:21:31.307 "block_size": 512, 00:21:31.307 "num_blocks": 65536, 00:21:31.307 "uuid": "6b0f7747-9934-4d50-b380-f6cc950a2995", 00:21:31.307 "assigned_rate_limits": { 00:21:31.307 "rw_ios_per_sec": 0, 00:21:31.307 "rw_mbytes_per_sec": 0, 00:21:31.307 "r_mbytes_per_sec": 0, 00:21:31.307 "w_mbytes_per_sec": 0 00:21:31.307 }, 00:21:31.307 "claimed": false, 00:21:31.307 "zoned": false, 00:21:31.307 "supported_io_types": { 00:21:31.307 "read": true, 00:21:31.307 "write": true, 00:21:31.307 "unmap": true, 00:21:31.307 "flush": true, 00:21:31.307 "reset": true, 00:21:31.307 "nvme_admin": false, 00:21:31.307 
"nvme_io": false, 00:21:31.307 "nvme_io_md": false, 00:21:31.307 "write_zeroes": true, 00:21:31.307 "zcopy": true, 00:21:31.307 "get_zone_info": false, 00:21:31.307 "zone_management": false, 00:21:31.307 "zone_append": false, 00:21:31.307 "compare": false, 00:21:31.307 "compare_and_write": false, 00:21:31.307 "abort": true, 00:21:31.307 "seek_hole": false, 00:21:31.307 "seek_data": false, 00:21:31.307 "copy": true, 00:21:31.307 "nvme_iov_md": false 00:21:31.307 }, 00:21:31.307 "memory_domains": [ 00:21:31.307 { 00:21:31.307 "dma_device_id": "system", 00:21:31.307 "dma_device_type": 1 00:21:31.307 }, 00:21:31.307 { 00:21:31.307 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.307 "dma_device_type": 2 00:21:31.307 } 00:21:31.307 ], 00:21:31.307 "driver_specific": {} 00:21:31.307 } 00:21:31.307 ] 00:21:31.307 00:16:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:31.307 00:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:31.307 00:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:31.307 00:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:31.567 BaseBdev4 00:21:31.567 00:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:31.567 00:16:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:31.567 00:16:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:31.567 00:16:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:31.567 00:16:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:31.567 00:16:18 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:31.567 00:16:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:31.567 00:16:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:31.826 [ 00:21:31.826 { 00:21:31.826 "name": "BaseBdev4", 00:21:31.826 "aliases": [ 00:21:31.826 "60762769-15b3-4c21-85f3-b5f9f1ab945d" 00:21:31.826 ], 00:21:31.826 "product_name": "Malloc disk", 00:21:31.826 "block_size": 512, 00:21:31.826 "num_blocks": 65536, 00:21:31.826 "uuid": "60762769-15b3-4c21-85f3-b5f9f1ab945d", 00:21:31.826 "assigned_rate_limits": { 00:21:31.826 "rw_ios_per_sec": 0, 00:21:31.826 "rw_mbytes_per_sec": 0, 00:21:31.826 "r_mbytes_per_sec": 0, 00:21:31.826 "w_mbytes_per_sec": 0 00:21:31.826 }, 00:21:31.826 "claimed": false, 00:21:31.826 "zoned": false, 00:21:31.826 "supported_io_types": { 00:21:31.826 "read": true, 00:21:31.826 "write": true, 00:21:31.826 "unmap": true, 00:21:31.826 "flush": true, 00:21:31.826 "reset": true, 00:21:31.826 "nvme_admin": false, 00:21:31.826 "nvme_io": false, 00:21:31.826 "nvme_io_md": false, 00:21:31.826 "write_zeroes": true, 00:21:31.826 "zcopy": true, 00:21:31.826 "get_zone_info": false, 00:21:31.826 "zone_management": false, 00:21:31.826 "zone_append": false, 00:21:31.826 "compare": false, 00:21:31.826 "compare_and_write": false, 00:21:31.826 "abort": true, 00:21:31.826 "seek_hole": false, 00:21:31.826 "seek_data": false, 00:21:31.826 "copy": true, 00:21:31.826 "nvme_iov_md": false 00:21:31.826 }, 00:21:31.826 "memory_domains": [ 00:21:31.826 { 00:21:31.826 "dma_device_id": "system", 00:21:31.826 "dma_device_type": 1 00:21:31.826 }, 00:21:31.826 { 00:21:31.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.826 "dma_device_type": 
2 00:21:31.826 } 00:21:31.826 ], 00:21:31.826 "driver_specific": {} 00:21:31.826 } 00:21:31.826 ] 00:21:32.085 00:16:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:32.085 00:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:32.085 00:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:32.085 00:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:32.085 [2024-07-16 00:16:19.008844] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:32.085 [2024-07-16 00:16:19.008886] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:32.085 [2024-07-16 00:16:19.008904] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:32.085 [2024-07-16 00:16:19.010243] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:32.085 [2024-07-16 00:16:19.010284] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:32.085 00:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:32.085 00:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:32.085 00:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:32.085 00:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:32.085 00:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:32.085 00:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=4 00:21:32.085 00:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:32.085 00:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:32.085 00:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:32.085 00:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:32.085 00:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.085 00:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:32.344 00:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:32.344 "name": "Existed_Raid", 00:21:32.344 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:32.344 "strip_size_kb": 0, 00:21:32.344 "state": "configuring", 00:21:32.344 "raid_level": "raid1", 00:21:32.344 "superblock": false, 00:21:32.344 "num_base_bdevs": 4, 00:21:32.344 "num_base_bdevs_discovered": 3, 00:21:32.344 "num_base_bdevs_operational": 4, 00:21:32.344 "base_bdevs_list": [ 00:21:32.344 { 00:21:32.344 "name": "BaseBdev1", 00:21:32.344 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:32.344 "is_configured": false, 00:21:32.344 "data_offset": 0, 00:21:32.344 "data_size": 0 00:21:32.344 }, 00:21:32.344 { 00:21:32.344 "name": "BaseBdev2", 00:21:32.344 "uuid": "f635ea19-768b-45cf-a9af-92076240f348", 00:21:32.344 "is_configured": true, 00:21:32.344 "data_offset": 0, 00:21:32.344 "data_size": 65536 00:21:32.344 }, 00:21:32.344 { 00:21:32.344 "name": "BaseBdev3", 00:21:32.344 "uuid": "6b0f7747-9934-4d50-b380-f6cc950a2995", 00:21:32.344 "is_configured": true, 00:21:32.344 "data_offset": 0, 00:21:32.344 "data_size": 65536 00:21:32.344 }, 00:21:32.344 { 
00:21:32.344 "name": "BaseBdev4", 00:21:32.344 "uuid": "60762769-15b3-4c21-85f3-b5f9f1ab945d", 00:21:32.344 "is_configured": true, 00:21:32.344 "data_offset": 0, 00:21:32.344 "data_size": 65536 00:21:32.344 } 00:21:32.344 ] 00:21:32.344 }' 00:21:32.344 00:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:32.344 00:16:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:33.280 00:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:33.280 [2024-07-16 00:16:20.107745] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:33.280 00:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:33.280 00:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:33.280 00:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:33.280 00:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:33.280 00:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:33.280 00:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:33.280 00:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:33.280 00:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:33.280 00:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:33.280 00:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:33.280 00:16:20 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:33.280 00:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:33.540 00:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:33.540 "name": "Existed_Raid", 00:21:33.540 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:33.540 "strip_size_kb": 0, 00:21:33.540 "state": "configuring", 00:21:33.540 "raid_level": "raid1", 00:21:33.540 "superblock": false, 00:21:33.540 "num_base_bdevs": 4, 00:21:33.540 "num_base_bdevs_discovered": 2, 00:21:33.540 "num_base_bdevs_operational": 4, 00:21:33.540 "base_bdevs_list": [ 00:21:33.540 { 00:21:33.540 "name": "BaseBdev1", 00:21:33.540 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:33.540 "is_configured": false, 00:21:33.540 "data_offset": 0, 00:21:33.540 "data_size": 0 00:21:33.540 }, 00:21:33.540 { 00:21:33.540 "name": null, 00:21:33.540 "uuid": "f635ea19-768b-45cf-a9af-92076240f348", 00:21:33.540 "is_configured": false, 00:21:33.540 "data_offset": 0, 00:21:33.540 "data_size": 65536 00:21:33.540 }, 00:21:33.540 { 00:21:33.540 "name": "BaseBdev3", 00:21:33.540 "uuid": "6b0f7747-9934-4d50-b380-f6cc950a2995", 00:21:33.540 "is_configured": true, 00:21:33.540 "data_offset": 0, 00:21:33.540 "data_size": 65536 00:21:33.540 }, 00:21:33.540 { 00:21:33.540 "name": "BaseBdev4", 00:21:33.540 "uuid": "60762769-15b3-4c21-85f3-b5f9f1ab945d", 00:21:33.540 "is_configured": true, 00:21:33.540 "data_offset": 0, 00:21:33.540 "data_size": 65536 00:21:33.540 } 00:21:33.540 ] 00:21:33.540 }' 00:21:33.540 00:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:33.540 00:16:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:34.107 00:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:21:34.107 00:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:34.366 00:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:34.367 00:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:34.626 [2024-07-16 00:16:21.470736] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:34.626 BaseBdev1 00:21:34.626 00:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:34.626 00:16:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:34.626 00:16:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:34.626 00:16:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:34.626 00:16:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:34.626 00:16:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:34.626 00:16:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:34.885 00:16:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:35.143 [ 00:21:35.143 { 00:21:35.143 "name": "BaseBdev1", 00:21:35.143 "aliases": [ 00:21:35.143 "bc4f7507-3426-4b07-8a5d-5dbe8ec62c3a" 00:21:35.143 ], 00:21:35.143 
"product_name": "Malloc disk", 00:21:35.143 "block_size": 512, 00:21:35.143 "num_blocks": 65536, 00:21:35.143 "uuid": "bc4f7507-3426-4b07-8a5d-5dbe8ec62c3a", 00:21:35.143 "assigned_rate_limits": { 00:21:35.143 "rw_ios_per_sec": 0, 00:21:35.143 "rw_mbytes_per_sec": 0, 00:21:35.143 "r_mbytes_per_sec": 0, 00:21:35.143 "w_mbytes_per_sec": 0 00:21:35.143 }, 00:21:35.143 "claimed": true, 00:21:35.143 "claim_type": "exclusive_write", 00:21:35.143 "zoned": false, 00:21:35.143 "supported_io_types": { 00:21:35.143 "read": true, 00:21:35.143 "write": true, 00:21:35.143 "unmap": true, 00:21:35.143 "flush": true, 00:21:35.143 "reset": true, 00:21:35.143 "nvme_admin": false, 00:21:35.143 "nvme_io": false, 00:21:35.143 "nvme_io_md": false, 00:21:35.143 "write_zeroes": true, 00:21:35.143 "zcopy": true, 00:21:35.143 "get_zone_info": false, 00:21:35.143 "zone_management": false, 00:21:35.143 "zone_append": false, 00:21:35.143 "compare": false, 00:21:35.143 "compare_and_write": false, 00:21:35.143 "abort": true, 00:21:35.143 "seek_hole": false, 00:21:35.143 "seek_data": false, 00:21:35.143 "copy": true, 00:21:35.143 "nvme_iov_md": false 00:21:35.143 }, 00:21:35.143 "memory_domains": [ 00:21:35.143 { 00:21:35.143 "dma_device_id": "system", 00:21:35.143 "dma_device_type": 1 00:21:35.143 }, 00:21:35.143 { 00:21:35.143 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:35.143 "dma_device_type": 2 00:21:35.143 } 00:21:35.143 ], 00:21:35.143 "driver_specific": {} 00:21:35.143 } 00:21:35.143 ] 00:21:35.143 00:16:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:35.143 00:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:35.143 00:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:35.143 00:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:35.143 
00:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:35.143 00:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:35.143 00:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:35.143 00:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:35.143 00:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:35.143 00:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:35.143 00:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:35.143 00:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.143 00:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:35.401 00:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:35.401 "name": "Existed_Raid", 00:21:35.401 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:35.401 "strip_size_kb": 0, 00:21:35.401 "state": "configuring", 00:21:35.401 "raid_level": "raid1", 00:21:35.401 "superblock": false, 00:21:35.401 "num_base_bdevs": 4, 00:21:35.401 "num_base_bdevs_discovered": 3, 00:21:35.401 "num_base_bdevs_operational": 4, 00:21:35.401 "base_bdevs_list": [ 00:21:35.401 { 00:21:35.401 "name": "BaseBdev1", 00:21:35.401 "uuid": "bc4f7507-3426-4b07-8a5d-5dbe8ec62c3a", 00:21:35.401 "is_configured": true, 00:21:35.401 "data_offset": 0, 00:21:35.401 "data_size": 65536 00:21:35.401 }, 00:21:35.401 { 00:21:35.401 "name": null, 00:21:35.401 "uuid": "f635ea19-768b-45cf-a9af-92076240f348", 00:21:35.401 "is_configured": false, 00:21:35.401 "data_offset": 0, 
00:21:35.401 "data_size": 65536 00:21:35.401 }, 00:21:35.401 { 00:21:35.401 "name": "BaseBdev3", 00:21:35.401 "uuid": "6b0f7747-9934-4d50-b380-f6cc950a2995", 00:21:35.401 "is_configured": true, 00:21:35.401 "data_offset": 0, 00:21:35.401 "data_size": 65536 00:21:35.401 }, 00:21:35.401 { 00:21:35.401 "name": "BaseBdev4", 00:21:35.401 "uuid": "60762769-15b3-4c21-85f3-b5f9f1ab945d", 00:21:35.401 "is_configured": true, 00:21:35.401 "data_offset": 0, 00:21:35.401 "data_size": 65536 00:21:35.401 } 00:21:35.401 ] 00:21:35.401 }' 00:21:35.401 00:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:35.401 00:16:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:35.968 00:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:35.968 00:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:36.228 00:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:36.228 00:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:36.491 [2024-07-16 00:16:23.287587] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:36.491 00:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:36.491 00:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:36.491 00:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:36.491 00:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:21:36.491 00:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:36.491 00:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:36.491 00:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:36.491 00:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:36.491 00:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:36.491 00:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:36.491 00:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:36.491 00:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:36.751 00:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:36.751 "name": "Existed_Raid", 00:21:36.751 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:36.751 "strip_size_kb": 0, 00:21:36.751 "state": "configuring", 00:21:36.751 "raid_level": "raid1", 00:21:36.751 "superblock": false, 00:21:36.751 "num_base_bdevs": 4, 00:21:36.751 "num_base_bdevs_discovered": 2, 00:21:36.751 "num_base_bdevs_operational": 4, 00:21:36.751 "base_bdevs_list": [ 00:21:36.751 { 00:21:36.751 "name": "BaseBdev1", 00:21:36.751 "uuid": "bc4f7507-3426-4b07-8a5d-5dbe8ec62c3a", 00:21:36.751 "is_configured": true, 00:21:36.751 "data_offset": 0, 00:21:36.751 "data_size": 65536 00:21:36.751 }, 00:21:36.751 { 00:21:36.751 "name": null, 00:21:36.751 "uuid": "f635ea19-768b-45cf-a9af-92076240f348", 00:21:36.751 "is_configured": false, 00:21:36.751 "data_offset": 0, 00:21:36.751 "data_size": 65536 00:21:36.751 }, 00:21:36.751 { 00:21:36.751 "name": null, 00:21:36.751 
"uuid": "6b0f7747-9934-4d50-b380-f6cc950a2995", 00:21:36.751 "is_configured": false, 00:21:36.751 "data_offset": 0, 00:21:36.751 "data_size": 65536 00:21:36.751 }, 00:21:36.751 { 00:21:36.751 "name": "BaseBdev4", 00:21:36.751 "uuid": "60762769-15b3-4c21-85f3-b5f9f1ab945d", 00:21:36.751 "is_configured": true, 00:21:36.751 "data_offset": 0, 00:21:36.751 "data_size": 65536 00:21:36.751 } 00:21:36.751 ] 00:21:36.751 }' 00:21:36.751 00:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:36.751 00:16:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:37.318 00:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.318 00:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:37.576 00:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:37.576 00:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:37.835 [2024-07-16 00:16:24.611190] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:37.835 00:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:37.835 00:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:37.835 00:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:37.835 00:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:37.835 00:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:21:37.835 00:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:37.835 00:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:37.835 00:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:37.835 00:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:37.835 00:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:37.835 00:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.835 00:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:38.094 00:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:38.094 "name": "Existed_Raid", 00:21:38.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:38.094 "strip_size_kb": 0, 00:21:38.094 "state": "configuring", 00:21:38.094 "raid_level": "raid1", 00:21:38.094 "superblock": false, 00:21:38.094 "num_base_bdevs": 4, 00:21:38.094 "num_base_bdevs_discovered": 3, 00:21:38.094 "num_base_bdevs_operational": 4, 00:21:38.094 "base_bdevs_list": [ 00:21:38.094 { 00:21:38.094 "name": "BaseBdev1", 00:21:38.094 "uuid": "bc4f7507-3426-4b07-8a5d-5dbe8ec62c3a", 00:21:38.094 "is_configured": true, 00:21:38.094 "data_offset": 0, 00:21:38.094 "data_size": 65536 00:21:38.094 }, 00:21:38.094 { 00:21:38.094 "name": null, 00:21:38.094 "uuid": "f635ea19-768b-45cf-a9af-92076240f348", 00:21:38.094 "is_configured": false, 00:21:38.094 "data_offset": 0, 00:21:38.094 "data_size": 65536 00:21:38.094 }, 00:21:38.094 { 00:21:38.094 "name": "BaseBdev3", 00:21:38.094 "uuid": "6b0f7747-9934-4d50-b380-f6cc950a2995", 00:21:38.094 "is_configured": true, 
00:21:38.094 "data_offset": 0, 00:21:38.094 "data_size": 65536 00:21:38.094 }, 00:21:38.094 { 00:21:38.094 "name": "BaseBdev4", 00:21:38.094 "uuid": "60762769-15b3-4c21-85f3-b5f9f1ab945d", 00:21:38.094 "is_configured": true, 00:21:38.094 "data_offset": 0, 00:21:38.094 "data_size": 65536 00:21:38.094 } 00:21:38.094 ] 00:21:38.094 }' 00:21:38.094 00:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:38.094 00:16:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:38.660 00:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.660 00:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:38.919 00:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:38.919 00:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:39.179 [2024-07-16 00:16:25.946760] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:39.179 00:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:39.179 00:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:39.179 00:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:39.179 00:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:39.179 00:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:39.179 00:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:21:39.179 00:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:39.179 00:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:39.179 00:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:39.179 00:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:39.179 00:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.179 00:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:39.438 00:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:39.438 "name": "Existed_Raid", 00:21:39.438 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:39.438 "strip_size_kb": 0, 00:21:39.438 "state": "configuring", 00:21:39.438 "raid_level": "raid1", 00:21:39.438 "superblock": false, 00:21:39.438 "num_base_bdevs": 4, 00:21:39.438 "num_base_bdevs_discovered": 2, 00:21:39.438 "num_base_bdevs_operational": 4, 00:21:39.438 "base_bdevs_list": [ 00:21:39.438 { 00:21:39.438 "name": null, 00:21:39.438 "uuid": "bc4f7507-3426-4b07-8a5d-5dbe8ec62c3a", 00:21:39.438 "is_configured": false, 00:21:39.438 "data_offset": 0, 00:21:39.438 "data_size": 65536 00:21:39.438 }, 00:21:39.438 { 00:21:39.438 "name": null, 00:21:39.438 "uuid": "f635ea19-768b-45cf-a9af-92076240f348", 00:21:39.438 "is_configured": false, 00:21:39.438 "data_offset": 0, 00:21:39.438 "data_size": 65536 00:21:39.438 }, 00:21:39.438 { 00:21:39.438 "name": "BaseBdev3", 00:21:39.438 "uuid": "6b0f7747-9934-4d50-b380-f6cc950a2995", 00:21:39.438 "is_configured": true, 00:21:39.438 "data_offset": 0, 00:21:39.438 "data_size": 65536 00:21:39.438 }, 00:21:39.438 { 00:21:39.438 "name": 
"BaseBdev4", 00:21:39.438 "uuid": "60762769-15b3-4c21-85f3-b5f9f1ab945d", 00:21:39.438 "is_configured": true, 00:21:39.438 "data_offset": 0, 00:21:39.438 "data_size": 65536 00:21:39.438 } 00:21:39.438 ] 00:21:39.438 }' 00:21:39.438 00:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:39.438 00:16:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:40.007 00:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.007 00:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:40.304 00:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:40.304 00:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:40.563 [2024-07-16 00:16:27.302799] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:40.563 00:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:40.563 00:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:40.563 00:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:40.563 00:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:40.563 00:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:40.563 00:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:40.563 00:16:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:40.563 00:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:40.563 00:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:40.563 00:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:40.563 00:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.563 00:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:40.823 00:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:40.823 "name": "Existed_Raid", 00:21:40.823 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:40.823 "strip_size_kb": 0, 00:21:40.823 "state": "configuring", 00:21:40.823 "raid_level": "raid1", 00:21:40.823 "superblock": false, 00:21:40.823 "num_base_bdevs": 4, 00:21:40.823 "num_base_bdevs_discovered": 3, 00:21:40.823 "num_base_bdevs_operational": 4, 00:21:40.823 "base_bdevs_list": [ 00:21:40.823 { 00:21:40.823 "name": null, 00:21:40.823 "uuid": "bc4f7507-3426-4b07-8a5d-5dbe8ec62c3a", 00:21:40.823 "is_configured": false, 00:21:40.823 "data_offset": 0, 00:21:40.823 "data_size": 65536 00:21:40.823 }, 00:21:40.823 { 00:21:40.823 "name": "BaseBdev2", 00:21:40.823 "uuid": "f635ea19-768b-45cf-a9af-92076240f348", 00:21:40.823 "is_configured": true, 00:21:40.823 "data_offset": 0, 00:21:40.823 "data_size": 65536 00:21:40.823 }, 00:21:40.823 { 00:21:40.823 "name": "BaseBdev3", 00:21:40.823 "uuid": "6b0f7747-9934-4d50-b380-f6cc950a2995", 00:21:40.823 "is_configured": true, 00:21:40.823 "data_offset": 0, 00:21:40.823 "data_size": 65536 00:21:40.823 }, 00:21:40.823 { 00:21:40.823 "name": "BaseBdev4", 00:21:40.823 "uuid": "60762769-15b3-4c21-85f3-b5f9f1ab945d", 00:21:40.823 
"is_configured": true, 00:21:40.823 "data_offset": 0, 00:21:40.823 "data_size": 65536 00:21:40.823 } 00:21:40.823 ] 00:21:40.823 }' 00:21:40.823 00:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:40.823 00:16:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:41.391 00:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.391 00:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:41.651 00:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:41.651 00:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.651 00:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:41.910 00:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u bc4f7507-3426-4b07-8a5d-5dbe8ec62c3a 00:21:42.168 [2024-07-16 00:16:28.890324] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:42.168 [2024-07-16 00:16:28.890363] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x250b610 00:21:42.168 [2024-07-16 00:16:28.890372] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:42.168 [2024-07-16 00:16:28.890564] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x250ca70 00:21:42.168 [2024-07-16 00:16:28.890687] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x250b610 00:21:42.168 [2024-07-16 
00:16:28.890697] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x250b610 00:21:42.168 [2024-07-16 00:16:28.890858] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:42.168 NewBaseBdev 00:21:42.168 00:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:42.168 00:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:21:42.168 00:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:42.168 00:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:42.168 00:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:42.168 00:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:42.168 00:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:42.427 00:16:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:42.686 [ 00:21:42.686 { 00:21:42.686 "name": "NewBaseBdev", 00:21:42.686 "aliases": [ 00:21:42.686 "bc4f7507-3426-4b07-8a5d-5dbe8ec62c3a" 00:21:42.686 ], 00:21:42.686 "product_name": "Malloc disk", 00:21:42.686 "block_size": 512, 00:21:42.686 "num_blocks": 65536, 00:21:42.686 "uuid": "bc4f7507-3426-4b07-8a5d-5dbe8ec62c3a", 00:21:42.686 "assigned_rate_limits": { 00:21:42.686 "rw_ios_per_sec": 0, 00:21:42.686 "rw_mbytes_per_sec": 0, 00:21:42.686 "r_mbytes_per_sec": 0, 00:21:42.686 "w_mbytes_per_sec": 0 00:21:42.686 }, 00:21:42.686 "claimed": true, 00:21:42.686 "claim_type": "exclusive_write", 00:21:42.686 "zoned": 
false, 00:21:42.686 "supported_io_types": { 00:21:42.686 "read": true, 00:21:42.686 "write": true, 00:21:42.686 "unmap": true, 00:21:42.686 "flush": true, 00:21:42.686 "reset": true, 00:21:42.686 "nvme_admin": false, 00:21:42.686 "nvme_io": false, 00:21:42.686 "nvme_io_md": false, 00:21:42.686 "write_zeroes": true, 00:21:42.686 "zcopy": true, 00:21:42.686 "get_zone_info": false, 00:21:42.686 "zone_management": false, 00:21:42.686 "zone_append": false, 00:21:42.686 "compare": false, 00:21:42.686 "compare_and_write": false, 00:21:42.686 "abort": true, 00:21:42.686 "seek_hole": false, 00:21:42.686 "seek_data": false, 00:21:42.686 "copy": true, 00:21:42.686 "nvme_iov_md": false 00:21:42.686 }, 00:21:42.686 "memory_domains": [ 00:21:42.686 { 00:21:42.686 "dma_device_id": "system", 00:21:42.686 "dma_device_type": 1 00:21:42.686 }, 00:21:42.686 { 00:21:42.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:42.686 "dma_device_type": 2 00:21:42.686 } 00:21:42.686 ], 00:21:42.686 "driver_specific": {} 00:21:42.686 } 00:21:42.686 ] 00:21:42.686 00:16:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:42.686 00:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:42.686 00:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:42.686 00:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:42.686 00:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:42.686 00:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:42.686 00:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:42.686 00:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:42.686 00:16:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:42.686 00:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:42.686 00:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:42.686 00:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.686 00:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:42.945 00:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:42.945 "name": "Existed_Raid", 00:21:42.945 "uuid": "99e8ecdf-130a-46b0-9891-4a4eccb99bcf", 00:21:42.945 "strip_size_kb": 0, 00:21:42.945 "state": "online", 00:21:42.946 "raid_level": "raid1", 00:21:42.946 "superblock": false, 00:21:42.946 "num_base_bdevs": 4, 00:21:42.946 "num_base_bdevs_discovered": 4, 00:21:42.946 "num_base_bdevs_operational": 4, 00:21:42.946 "base_bdevs_list": [ 00:21:42.946 { 00:21:42.946 "name": "NewBaseBdev", 00:21:42.946 "uuid": "bc4f7507-3426-4b07-8a5d-5dbe8ec62c3a", 00:21:42.946 "is_configured": true, 00:21:42.946 "data_offset": 0, 00:21:42.946 "data_size": 65536 00:21:42.946 }, 00:21:42.946 { 00:21:42.946 "name": "BaseBdev2", 00:21:42.946 "uuid": "f635ea19-768b-45cf-a9af-92076240f348", 00:21:42.946 "is_configured": true, 00:21:42.946 "data_offset": 0, 00:21:42.946 "data_size": 65536 00:21:42.946 }, 00:21:42.946 { 00:21:42.946 "name": "BaseBdev3", 00:21:42.946 "uuid": "6b0f7747-9934-4d50-b380-f6cc950a2995", 00:21:42.946 "is_configured": true, 00:21:42.946 "data_offset": 0, 00:21:42.946 "data_size": 65536 00:21:42.946 }, 00:21:42.946 { 00:21:42.946 "name": "BaseBdev4", 00:21:42.946 "uuid": "60762769-15b3-4c21-85f3-b5f9f1ab945d", 00:21:42.946 "is_configured": true, 00:21:42.946 "data_offset": 0, 00:21:42.946 
"data_size": 65536 00:21:42.946 } 00:21:42.946 ] 00:21:42.946 }' 00:21:42.946 00:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:42.946 00:16:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:43.514 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:43.514 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:43.514 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:43.514 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:43.514 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:43.514 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:43.514 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:43.514 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:43.773 [2024-07-16 00:16:30.486879] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:43.773 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:43.773 "name": "Existed_Raid", 00:21:43.773 "aliases": [ 00:21:43.773 "99e8ecdf-130a-46b0-9891-4a4eccb99bcf" 00:21:43.773 ], 00:21:43.773 "product_name": "Raid Volume", 00:21:43.773 "block_size": 512, 00:21:43.773 "num_blocks": 65536, 00:21:43.773 "uuid": "99e8ecdf-130a-46b0-9891-4a4eccb99bcf", 00:21:43.773 "assigned_rate_limits": { 00:21:43.773 "rw_ios_per_sec": 0, 00:21:43.773 "rw_mbytes_per_sec": 0, 00:21:43.773 "r_mbytes_per_sec": 0, 00:21:43.773 "w_mbytes_per_sec": 0 00:21:43.773 }, 00:21:43.773 "claimed": false, 
00:21:43.773 "zoned": false, 00:21:43.773 "supported_io_types": { 00:21:43.773 "read": true, 00:21:43.773 "write": true, 00:21:43.773 "unmap": false, 00:21:43.773 "flush": false, 00:21:43.773 "reset": true, 00:21:43.773 "nvme_admin": false, 00:21:43.773 "nvme_io": false, 00:21:43.773 "nvme_io_md": false, 00:21:43.773 "write_zeroes": true, 00:21:43.773 "zcopy": false, 00:21:43.773 "get_zone_info": false, 00:21:43.773 "zone_management": false, 00:21:43.773 "zone_append": false, 00:21:43.773 "compare": false, 00:21:43.773 "compare_and_write": false, 00:21:43.773 "abort": false, 00:21:43.773 "seek_hole": false, 00:21:43.773 "seek_data": false, 00:21:43.773 "copy": false, 00:21:43.773 "nvme_iov_md": false 00:21:43.773 }, 00:21:43.773 "memory_domains": [ 00:21:43.773 { 00:21:43.773 "dma_device_id": "system", 00:21:43.773 "dma_device_type": 1 00:21:43.773 }, 00:21:43.773 { 00:21:43.773 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:43.773 "dma_device_type": 2 00:21:43.773 }, 00:21:43.773 { 00:21:43.773 "dma_device_id": "system", 00:21:43.773 "dma_device_type": 1 00:21:43.773 }, 00:21:43.773 { 00:21:43.773 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:43.773 "dma_device_type": 2 00:21:43.773 }, 00:21:43.773 { 00:21:43.773 "dma_device_id": "system", 00:21:43.773 "dma_device_type": 1 00:21:43.773 }, 00:21:43.773 { 00:21:43.773 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:43.773 "dma_device_type": 2 00:21:43.773 }, 00:21:43.773 { 00:21:43.773 "dma_device_id": "system", 00:21:43.773 "dma_device_type": 1 00:21:43.773 }, 00:21:43.773 { 00:21:43.773 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:43.773 "dma_device_type": 2 00:21:43.773 } 00:21:43.773 ], 00:21:43.773 "driver_specific": { 00:21:43.773 "raid": { 00:21:43.773 "uuid": "99e8ecdf-130a-46b0-9891-4a4eccb99bcf", 00:21:43.773 "strip_size_kb": 0, 00:21:43.773 "state": "online", 00:21:43.773 "raid_level": "raid1", 00:21:43.773 "superblock": false, 00:21:43.773 "num_base_bdevs": 4, 00:21:43.773 
"num_base_bdevs_discovered": 4, 00:21:43.773 "num_base_bdevs_operational": 4, 00:21:43.773 "base_bdevs_list": [ 00:21:43.773 { 00:21:43.774 "name": "NewBaseBdev", 00:21:43.774 "uuid": "bc4f7507-3426-4b07-8a5d-5dbe8ec62c3a", 00:21:43.774 "is_configured": true, 00:21:43.774 "data_offset": 0, 00:21:43.774 "data_size": 65536 00:21:43.774 }, 00:21:43.774 { 00:21:43.774 "name": "BaseBdev2", 00:21:43.774 "uuid": "f635ea19-768b-45cf-a9af-92076240f348", 00:21:43.774 "is_configured": true, 00:21:43.774 "data_offset": 0, 00:21:43.774 "data_size": 65536 00:21:43.774 }, 00:21:43.774 { 00:21:43.774 "name": "BaseBdev3", 00:21:43.774 "uuid": "6b0f7747-9934-4d50-b380-f6cc950a2995", 00:21:43.774 "is_configured": true, 00:21:43.774 "data_offset": 0, 00:21:43.774 "data_size": 65536 00:21:43.774 }, 00:21:43.774 { 00:21:43.774 "name": "BaseBdev4", 00:21:43.774 "uuid": "60762769-15b3-4c21-85f3-b5f9f1ab945d", 00:21:43.774 "is_configured": true, 00:21:43.774 "data_offset": 0, 00:21:43.774 "data_size": 65536 00:21:43.774 } 00:21:43.774 ] 00:21:43.774 } 00:21:43.774 } 00:21:43.774 }' 00:21:43.774 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:43.774 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:43.774 BaseBdev2 00:21:43.774 BaseBdev3 00:21:43.774 BaseBdev4' 00:21:43.774 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:43.774 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:43.774 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:44.033 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:44.033 "name": "NewBaseBdev", 
00:21:44.033 "aliases": [ 00:21:44.033 "bc4f7507-3426-4b07-8a5d-5dbe8ec62c3a" 00:21:44.033 ], 00:21:44.033 "product_name": "Malloc disk", 00:21:44.033 "block_size": 512, 00:21:44.033 "num_blocks": 65536, 00:21:44.033 "uuid": "bc4f7507-3426-4b07-8a5d-5dbe8ec62c3a", 00:21:44.033 "assigned_rate_limits": { 00:21:44.033 "rw_ios_per_sec": 0, 00:21:44.033 "rw_mbytes_per_sec": 0, 00:21:44.033 "r_mbytes_per_sec": 0, 00:21:44.033 "w_mbytes_per_sec": 0 00:21:44.033 }, 00:21:44.033 "claimed": true, 00:21:44.033 "claim_type": "exclusive_write", 00:21:44.033 "zoned": false, 00:21:44.033 "supported_io_types": { 00:21:44.033 "read": true, 00:21:44.033 "write": true, 00:21:44.033 "unmap": true, 00:21:44.033 "flush": true, 00:21:44.033 "reset": true, 00:21:44.033 "nvme_admin": false, 00:21:44.033 "nvme_io": false, 00:21:44.033 "nvme_io_md": false, 00:21:44.033 "write_zeroes": true, 00:21:44.033 "zcopy": true, 00:21:44.033 "get_zone_info": false, 00:21:44.033 "zone_management": false, 00:21:44.033 "zone_append": false, 00:21:44.033 "compare": false, 00:21:44.033 "compare_and_write": false, 00:21:44.033 "abort": true, 00:21:44.033 "seek_hole": false, 00:21:44.033 "seek_data": false, 00:21:44.033 "copy": true, 00:21:44.033 "nvme_iov_md": false 00:21:44.033 }, 00:21:44.033 "memory_domains": [ 00:21:44.033 { 00:21:44.033 "dma_device_id": "system", 00:21:44.033 "dma_device_type": 1 00:21:44.033 }, 00:21:44.033 { 00:21:44.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.033 "dma_device_type": 2 00:21:44.033 } 00:21:44.033 ], 00:21:44.033 "driver_specific": {} 00:21:44.033 }' 00:21:44.033 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:44.033 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:44.033 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:44.033 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:21:44.033 00:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:44.292 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:44.292 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:44.292 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:44.292 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:44.292 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:44.292 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:44.292 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:44.292 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:44.292 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:44.292 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:44.551 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:44.551 "name": "BaseBdev2", 00:21:44.551 "aliases": [ 00:21:44.551 "f635ea19-768b-45cf-a9af-92076240f348" 00:21:44.551 ], 00:21:44.551 "product_name": "Malloc disk", 00:21:44.551 "block_size": 512, 00:21:44.551 "num_blocks": 65536, 00:21:44.551 "uuid": "f635ea19-768b-45cf-a9af-92076240f348", 00:21:44.551 "assigned_rate_limits": { 00:21:44.551 "rw_ios_per_sec": 0, 00:21:44.551 "rw_mbytes_per_sec": 0, 00:21:44.551 "r_mbytes_per_sec": 0, 00:21:44.551 "w_mbytes_per_sec": 0 00:21:44.552 }, 00:21:44.552 "claimed": true, 00:21:44.552 "claim_type": "exclusive_write", 00:21:44.552 "zoned": false, 00:21:44.552 "supported_io_types": { 00:21:44.552 
"read": true, 00:21:44.552 "write": true, 00:21:44.552 "unmap": true, 00:21:44.552 "flush": true, 00:21:44.552 "reset": true, 00:21:44.552 "nvme_admin": false, 00:21:44.552 "nvme_io": false, 00:21:44.552 "nvme_io_md": false, 00:21:44.552 "write_zeroes": true, 00:21:44.552 "zcopy": true, 00:21:44.552 "get_zone_info": false, 00:21:44.552 "zone_management": false, 00:21:44.552 "zone_append": false, 00:21:44.552 "compare": false, 00:21:44.552 "compare_and_write": false, 00:21:44.552 "abort": true, 00:21:44.552 "seek_hole": false, 00:21:44.552 "seek_data": false, 00:21:44.552 "copy": true, 00:21:44.552 "nvme_iov_md": false 00:21:44.552 }, 00:21:44.552 "memory_domains": [ 00:21:44.552 { 00:21:44.552 "dma_device_id": "system", 00:21:44.552 "dma_device_type": 1 00:21:44.552 }, 00:21:44.552 { 00:21:44.552 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.552 "dma_device_type": 2 00:21:44.552 } 00:21:44.552 ], 00:21:44.552 "driver_specific": {} 00:21:44.552 }' 00:21:44.552 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:44.552 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:44.810 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:44.810 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:44.810 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:44.810 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:44.810 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:44.810 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:44.810 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:44.810 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:21:45.068 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:45.068 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:45.069 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:45.069 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:45.069 00:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:45.328 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:45.328 "name": "BaseBdev3", 00:21:45.328 "aliases": [ 00:21:45.328 "6b0f7747-9934-4d50-b380-f6cc950a2995" 00:21:45.328 ], 00:21:45.328 "product_name": "Malloc disk", 00:21:45.328 "block_size": 512, 00:21:45.328 "num_blocks": 65536, 00:21:45.328 "uuid": "6b0f7747-9934-4d50-b380-f6cc950a2995", 00:21:45.328 "assigned_rate_limits": { 00:21:45.328 "rw_ios_per_sec": 0, 00:21:45.328 "rw_mbytes_per_sec": 0, 00:21:45.328 "r_mbytes_per_sec": 0, 00:21:45.328 "w_mbytes_per_sec": 0 00:21:45.328 }, 00:21:45.328 "claimed": true, 00:21:45.328 "claim_type": "exclusive_write", 00:21:45.328 "zoned": false, 00:21:45.328 "supported_io_types": { 00:21:45.328 "read": true, 00:21:45.328 "write": true, 00:21:45.328 "unmap": true, 00:21:45.328 "flush": true, 00:21:45.328 "reset": true, 00:21:45.328 "nvme_admin": false, 00:21:45.328 "nvme_io": false, 00:21:45.328 "nvme_io_md": false, 00:21:45.328 "write_zeroes": true, 00:21:45.328 "zcopy": true, 00:21:45.328 "get_zone_info": false, 00:21:45.328 "zone_management": false, 00:21:45.328 "zone_append": false, 00:21:45.328 "compare": false, 00:21:45.328 "compare_and_write": false, 00:21:45.328 "abort": true, 00:21:45.328 "seek_hole": false, 00:21:45.328 "seek_data": false, 00:21:45.328 "copy": true, 00:21:45.328 "nvme_iov_md": 
false 00:21:45.328 }, 00:21:45.328 "memory_domains": [ 00:21:45.328 { 00:21:45.328 "dma_device_id": "system", 00:21:45.328 "dma_device_type": 1 00:21:45.328 }, 00:21:45.328 { 00:21:45.328 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:45.328 "dma_device_type": 2 00:21:45.328 } 00:21:45.328 ], 00:21:45.328 "driver_specific": {} 00:21:45.328 }' 00:21:45.328 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:45.328 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:45.328 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:45.328 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:45.328 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:45.328 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:45.328 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:45.587 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:45.587 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:45.587 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:45.587 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:45.587 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:45.587 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:45.587 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:45.587 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:21:45.845 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:45.845 "name": "BaseBdev4", 00:21:45.845 "aliases": [ 00:21:45.845 "60762769-15b3-4c21-85f3-b5f9f1ab945d" 00:21:45.845 ], 00:21:45.845 "product_name": "Malloc disk", 00:21:45.845 "block_size": 512, 00:21:45.845 "num_blocks": 65536, 00:21:45.845 "uuid": "60762769-15b3-4c21-85f3-b5f9f1ab945d", 00:21:45.845 "assigned_rate_limits": { 00:21:45.845 "rw_ios_per_sec": 0, 00:21:45.845 "rw_mbytes_per_sec": 0, 00:21:45.845 "r_mbytes_per_sec": 0, 00:21:45.845 "w_mbytes_per_sec": 0 00:21:45.845 }, 00:21:45.845 "claimed": true, 00:21:45.845 "claim_type": "exclusive_write", 00:21:45.845 "zoned": false, 00:21:45.845 "supported_io_types": { 00:21:45.845 "read": true, 00:21:45.845 "write": true, 00:21:45.845 "unmap": true, 00:21:45.845 "flush": true, 00:21:45.845 "reset": true, 00:21:45.845 "nvme_admin": false, 00:21:45.845 "nvme_io": false, 00:21:45.845 "nvme_io_md": false, 00:21:45.845 "write_zeroes": true, 00:21:45.845 "zcopy": true, 00:21:45.845 "get_zone_info": false, 00:21:45.845 "zone_management": false, 00:21:45.845 "zone_append": false, 00:21:45.845 "compare": false, 00:21:45.845 "compare_and_write": false, 00:21:45.845 "abort": true, 00:21:45.845 "seek_hole": false, 00:21:45.845 "seek_data": false, 00:21:45.845 "copy": true, 00:21:45.845 "nvme_iov_md": false 00:21:45.845 }, 00:21:45.845 "memory_domains": [ 00:21:45.845 { 00:21:45.845 "dma_device_id": "system", 00:21:45.845 "dma_device_type": 1 00:21:45.845 }, 00:21:45.845 { 00:21:45.845 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:45.845 "dma_device_type": 2 00:21:45.845 } 00:21:45.845 ], 00:21:45.845 "driver_specific": {} 00:21:45.845 }' 00:21:45.845 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:45.845 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:45.845 00:16:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:45.845 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:45.846 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:46.105 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:46.105 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:46.105 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:46.105 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:46.105 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:46.105 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:46.105 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:46.105 00:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:46.364 [2024-07-16 00:16:33.201786] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:46.364 [2024-07-16 00:16:33.201812] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:46.364 [2024-07-16 00:16:33.201861] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:46.364 [2024-07-16 00:16:33.202162] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:46.364 [2024-07-16 00:16:33.202174] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x250b610 name Existed_Raid, state offline 00:21:46.364 00:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 3581860 00:21:46.364 00:16:33 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 3581860 ']' 00:21:46.364 00:16:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 3581860 00:21:46.364 00:16:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:21:46.364 00:16:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:46.364 00:16:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3581860 00:21:46.364 00:16:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:46.364 00:16:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:46.364 00:16:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3581860' 00:21:46.364 killing process with pid 3581860 00:21:46.364 00:16:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 3581860 00:21:46.364 [2024-07-16 00:16:33.268307] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:46.364 00:16:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 3581860 00:21:46.364 [2024-07-16 00:16:33.304491] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:46.623 00:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:21:46.623 00:21:46.623 real 0m33.028s 00:21:46.623 user 1m0.957s 00:21:46.623 sys 0m6.080s 00:21:46.623 00:16:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:46.623 00:16:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:46.623 ************************************ 00:21:46.623 END TEST raid_state_function_test 00:21:46.623 ************************************ 00:21:46.623 00:16:33 bdev_raid -- 
common/autotest_common.sh@1142 -- # return 0 00:21:46.623 00:16:33 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:21:46.623 00:16:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:46.623 00:16:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:46.623 00:16:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:46.882 ************************************ 00:21:46.882 START TEST raid_state_function_test_sb 00:21:46.882 ************************************ 00:21:46.882 00:16:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:21:46.882 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:46.882 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:21:46.882 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:21:46.882 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:46.882 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:46.882 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:46.882 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:46.882 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:46.882 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:46.882 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:46.882 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:46.882 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
(( i <= num_base_bdevs )) 00:21:46.882 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:21:46.882 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:46.882 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=3586811 00:21:46.883 00:16:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3586811' 00:21:46.883 Process raid pid: 3586811 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 3586811 /var/tmp/spdk-raid.sock 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3586811 ']' 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:46.883 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:46.883 00:16:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:46.883 [2024-07-16 00:16:33.662063] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:21:46.883 [2024-07-16 00:16:33.662131] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:46.883 [2024-07-16 00:16:33.793305] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:47.141 [2024-07-16 00:16:33.897907] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:47.141 [2024-07-16 00:16:33.961446] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:47.141 [2024-07-16 00:16:33.961475] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:47.707 00:16:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:47.707 00:16:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:21:47.707 00:16:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:47.966 [2024-07-16 00:16:34.832043] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:47.966 [2024-07-16 00:16:34.832086] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:47.966 [2024-07-16 00:16:34.832097] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:47.966 [2024-07-16 00:16:34.832109] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:47.966 [2024-07-16 00:16:34.832118] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:47.966 [2024-07-16 00:16:34.832129] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:21:47.966 [2024-07-16 00:16:34.832137] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:47.966 [2024-07-16 00:16:34.832148] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:47.966 00:16:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:47.966 00:16:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:47.966 00:16:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:47.966 00:16:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:47.966 00:16:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:47.966 00:16:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:47.966 00:16:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:47.966 00:16:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:47.966 00:16:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:47.966 00:16:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:47.966 00:16:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.966 00:16:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:48.225 00:16:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:48.225 "name": "Existed_Raid", 00:21:48.225 "uuid": "57feeb2f-4f73-4189-80bb-c7369e1bea04", 
00:21:48.225 "strip_size_kb": 0, 00:21:48.225 "state": "configuring", 00:21:48.225 "raid_level": "raid1", 00:21:48.225 "superblock": true, 00:21:48.225 "num_base_bdevs": 4, 00:21:48.225 "num_base_bdevs_discovered": 0, 00:21:48.225 "num_base_bdevs_operational": 4, 00:21:48.225 "base_bdevs_list": [ 00:21:48.225 { 00:21:48.225 "name": "BaseBdev1", 00:21:48.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:48.225 "is_configured": false, 00:21:48.225 "data_offset": 0, 00:21:48.225 "data_size": 0 00:21:48.225 }, 00:21:48.225 { 00:21:48.225 "name": "BaseBdev2", 00:21:48.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:48.225 "is_configured": false, 00:21:48.225 "data_offset": 0, 00:21:48.225 "data_size": 0 00:21:48.225 }, 00:21:48.225 { 00:21:48.225 "name": "BaseBdev3", 00:21:48.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:48.225 "is_configured": false, 00:21:48.225 "data_offset": 0, 00:21:48.225 "data_size": 0 00:21:48.225 }, 00:21:48.225 { 00:21:48.225 "name": "BaseBdev4", 00:21:48.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:48.225 "is_configured": false, 00:21:48.225 "data_offset": 0, 00:21:48.225 "data_size": 0 00:21:48.225 } 00:21:48.225 ] 00:21:48.225 }' 00:21:48.225 00:16:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:48.225 00:16:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:48.793 00:16:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:49.052 [2024-07-16 00:16:35.914754] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:49.052 [2024-07-16 00:16:35.914786] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23b3aa0 name Existed_Raid, state configuring 00:21:49.052 00:16:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:49.311 [2024-07-16 00:16:36.159428] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:49.311 [2024-07-16 00:16:36.159461] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:49.311 [2024-07-16 00:16:36.159471] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:49.311 [2024-07-16 00:16:36.159482] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:49.311 [2024-07-16 00:16:36.159490] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:49.311 [2024-07-16 00:16:36.159501] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:49.311 [2024-07-16 00:16:36.159510] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:49.311 [2024-07-16 00:16:36.159521] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:49.311 00:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:49.569 [2024-07-16 00:16:36.414877] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:49.569 BaseBdev1 00:21:49.570 00:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:49.570 00:16:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:49.570 00:16:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:49.570 00:16:36 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:49.570 00:16:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:49.570 00:16:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:49.570 00:16:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:49.828 00:16:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:50.087 [ 00:21:50.087 { 00:21:50.087 "name": "BaseBdev1", 00:21:50.087 "aliases": [ 00:21:50.087 "28ee48a1-c951-42c6-bdcc-3147001e1106" 00:21:50.087 ], 00:21:50.087 "product_name": "Malloc disk", 00:21:50.087 "block_size": 512, 00:21:50.087 "num_blocks": 65536, 00:21:50.087 "uuid": "28ee48a1-c951-42c6-bdcc-3147001e1106", 00:21:50.087 "assigned_rate_limits": { 00:21:50.087 "rw_ios_per_sec": 0, 00:21:50.087 "rw_mbytes_per_sec": 0, 00:21:50.087 "r_mbytes_per_sec": 0, 00:21:50.087 "w_mbytes_per_sec": 0 00:21:50.087 }, 00:21:50.087 "claimed": true, 00:21:50.087 "claim_type": "exclusive_write", 00:21:50.087 "zoned": false, 00:21:50.087 "supported_io_types": { 00:21:50.087 "read": true, 00:21:50.087 "write": true, 00:21:50.087 "unmap": true, 00:21:50.087 "flush": true, 00:21:50.087 "reset": true, 00:21:50.087 "nvme_admin": false, 00:21:50.087 "nvme_io": false, 00:21:50.087 "nvme_io_md": false, 00:21:50.087 "write_zeroes": true, 00:21:50.087 "zcopy": true, 00:21:50.087 "get_zone_info": false, 00:21:50.087 "zone_management": false, 00:21:50.087 "zone_append": false, 00:21:50.087 "compare": false, 00:21:50.087 "compare_and_write": false, 00:21:50.087 "abort": true, 00:21:50.087 "seek_hole": false, 00:21:50.087 "seek_data": false, 
00:21:50.087 "copy": true, 00:21:50.087 "nvme_iov_md": false 00:21:50.087 }, 00:21:50.087 "memory_domains": [ 00:21:50.087 { 00:21:50.087 "dma_device_id": "system", 00:21:50.087 "dma_device_type": 1 00:21:50.087 }, 00:21:50.087 { 00:21:50.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:50.087 "dma_device_type": 2 00:21:50.087 } 00:21:50.087 ], 00:21:50.087 "driver_specific": {} 00:21:50.087 } 00:21:50.087 ] 00:21:50.087 00:16:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:50.087 00:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:50.087 00:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:50.087 00:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:50.087 00:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:50.087 00:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:50.087 00:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:50.087 00:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:50.087 00:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:50.087 00:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:50.087 00:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:50.087 00:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.087 00:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- 
# jq -r '.[] | select(.name == "Existed_Raid")' 00:21:50.370 00:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:50.370 "name": "Existed_Raid", 00:21:50.370 "uuid": "bdfec0e6-e249-4b67-a956-30db9d72db52", 00:21:50.370 "strip_size_kb": 0, 00:21:50.370 "state": "configuring", 00:21:50.370 "raid_level": "raid1", 00:21:50.370 "superblock": true, 00:21:50.370 "num_base_bdevs": 4, 00:21:50.370 "num_base_bdevs_discovered": 1, 00:21:50.370 "num_base_bdevs_operational": 4, 00:21:50.370 "base_bdevs_list": [ 00:21:50.370 { 00:21:50.370 "name": "BaseBdev1", 00:21:50.370 "uuid": "28ee48a1-c951-42c6-bdcc-3147001e1106", 00:21:50.370 "is_configured": true, 00:21:50.370 "data_offset": 2048, 00:21:50.370 "data_size": 63488 00:21:50.370 }, 00:21:50.370 { 00:21:50.370 "name": "BaseBdev2", 00:21:50.370 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:50.370 "is_configured": false, 00:21:50.370 "data_offset": 0, 00:21:50.370 "data_size": 0 00:21:50.370 }, 00:21:50.370 { 00:21:50.370 "name": "BaseBdev3", 00:21:50.370 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:50.370 "is_configured": false, 00:21:50.370 "data_offset": 0, 00:21:50.370 "data_size": 0 00:21:50.370 }, 00:21:50.370 { 00:21:50.370 "name": "BaseBdev4", 00:21:50.370 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:50.370 "is_configured": false, 00:21:50.370 "data_offset": 0, 00:21:50.370 "data_size": 0 00:21:50.370 } 00:21:50.370 ] 00:21:50.370 }' 00:21:50.370 00:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:50.370 00:16:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:50.935 00:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:51.194 [2024-07-16 00:16:37.987040] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: 
Existed_Raid 00:21:51.194 [2024-07-16 00:16:37.987077] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23b3310 name Existed_Raid, state configuring 00:21:51.194 00:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:51.451 [2024-07-16 00:16:38.231728] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:51.451 [2024-07-16 00:16:38.233212] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:51.451 [2024-07-16 00:16:38.233256] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:51.451 [2024-07-16 00:16:38.233267] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:51.451 [2024-07-16 00:16:38.233279] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:51.451 [2024-07-16 00:16:38.233288] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:51.451 [2024-07-16 00:16:38.233299] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:51.451 00:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:51.451 00:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:51.451 00:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:51.451 00:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:51.451 00:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:51.451 00:16:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:51.452 00:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:51.452 00:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:51.452 00:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:51.452 00:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:51.452 00:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:51.452 00:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:51.452 00:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.452 00:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:51.708 00:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:51.708 "name": "Existed_Raid", 00:21:51.708 "uuid": "48b380ab-3c47-4763-a13c-189eee656851", 00:21:51.708 "strip_size_kb": 0, 00:21:51.708 "state": "configuring", 00:21:51.708 "raid_level": "raid1", 00:21:51.708 "superblock": true, 00:21:51.708 "num_base_bdevs": 4, 00:21:51.708 "num_base_bdevs_discovered": 1, 00:21:51.708 "num_base_bdevs_operational": 4, 00:21:51.708 "base_bdevs_list": [ 00:21:51.708 { 00:21:51.708 "name": "BaseBdev1", 00:21:51.708 "uuid": "28ee48a1-c951-42c6-bdcc-3147001e1106", 00:21:51.708 "is_configured": true, 00:21:51.708 "data_offset": 2048, 00:21:51.708 "data_size": 63488 00:21:51.708 }, 00:21:51.708 { 00:21:51.708 "name": "BaseBdev2", 00:21:51.708 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:51.708 "is_configured": false, 
00:21:51.708 "data_offset": 0, 00:21:51.708 "data_size": 0 00:21:51.708 }, 00:21:51.708 { 00:21:51.708 "name": "BaseBdev3", 00:21:51.708 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:51.708 "is_configured": false, 00:21:51.708 "data_offset": 0, 00:21:51.708 "data_size": 0 00:21:51.708 }, 00:21:51.708 { 00:21:51.708 "name": "BaseBdev4", 00:21:51.708 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:51.708 "is_configured": false, 00:21:51.708 "data_offset": 0, 00:21:51.708 "data_size": 0 00:21:51.708 } 00:21:51.708 ] 00:21:51.708 }' 00:21:51.708 00:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:51.708 00:16:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:52.285 00:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:52.542 [2024-07-16 00:16:39.342035] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:52.542 BaseBdev2 00:21:52.542 00:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:52.542 00:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:52.542 00:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:52.542 00:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:52.542 00:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:52.542 00:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:52.542 00:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:21:52.800 00:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:53.059 [ 00:21:53.059 { 00:21:53.059 "name": "BaseBdev2", 00:21:53.059 "aliases": [ 00:21:53.059 "059223b1-5cef-408a-bd0d-df00ae2a0192" 00:21:53.059 ], 00:21:53.059 "product_name": "Malloc disk", 00:21:53.059 "block_size": 512, 00:21:53.059 "num_blocks": 65536, 00:21:53.059 "uuid": "059223b1-5cef-408a-bd0d-df00ae2a0192", 00:21:53.059 "assigned_rate_limits": { 00:21:53.059 "rw_ios_per_sec": 0, 00:21:53.059 "rw_mbytes_per_sec": 0, 00:21:53.059 "r_mbytes_per_sec": 0, 00:21:53.059 "w_mbytes_per_sec": 0 00:21:53.059 }, 00:21:53.059 "claimed": true, 00:21:53.059 "claim_type": "exclusive_write", 00:21:53.059 "zoned": false, 00:21:53.059 "supported_io_types": { 00:21:53.059 "read": true, 00:21:53.059 "write": true, 00:21:53.059 "unmap": true, 00:21:53.059 "flush": true, 00:21:53.059 "reset": true, 00:21:53.059 "nvme_admin": false, 00:21:53.059 "nvme_io": false, 00:21:53.059 "nvme_io_md": false, 00:21:53.059 "write_zeroes": true, 00:21:53.059 "zcopy": true, 00:21:53.059 "get_zone_info": false, 00:21:53.059 "zone_management": false, 00:21:53.059 "zone_append": false, 00:21:53.059 "compare": false, 00:21:53.059 "compare_and_write": false, 00:21:53.059 "abort": true, 00:21:53.059 "seek_hole": false, 00:21:53.059 "seek_data": false, 00:21:53.059 "copy": true, 00:21:53.059 "nvme_iov_md": false 00:21:53.059 }, 00:21:53.059 "memory_domains": [ 00:21:53.059 { 00:21:53.059 "dma_device_id": "system", 00:21:53.059 "dma_device_type": 1 00:21:53.059 }, 00:21:53.059 { 00:21:53.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.059 "dma_device_type": 2 00:21:53.059 } 00:21:53.059 ], 00:21:53.059 "driver_specific": {} 00:21:53.059 } 00:21:53.059 ] 00:21:53.059 00:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 
-- # return 0 00:21:53.059 00:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:53.059 00:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:53.059 00:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:53.059 00:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:53.059 00:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:53.059 00:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:53.059 00:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:53.059 00:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:53.059 00:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:53.059 00:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:53.059 00:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:53.059 00:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:53.059 00:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:53.059 00:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:53.317 00:16:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:53.317 "name": "Existed_Raid", 00:21:53.317 "uuid": "48b380ab-3c47-4763-a13c-189eee656851", 00:21:53.317 "strip_size_kb": 0, 
00:21:53.317 "state": "configuring", 00:21:53.317 "raid_level": "raid1", 00:21:53.317 "superblock": true, 00:21:53.317 "num_base_bdevs": 4, 00:21:53.317 "num_base_bdevs_discovered": 2, 00:21:53.317 "num_base_bdevs_operational": 4, 00:21:53.317 "base_bdevs_list": [ 00:21:53.317 { 00:21:53.317 "name": "BaseBdev1", 00:21:53.317 "uuid": "28ee48a1-c951-42c6-bdcc-3147001e1106", 00:21:53.317 "is_configured": true, 00:21:53.317 "data_offset": 2048, 00:21:53.317 "data_size": 63488 00:21:53.317 }, 00:21:53.317 { 00:21:53.317 "name": "BaseBdev2", 00:21:53.317 "uuid": "059223b1-5cef-408a-bd0d-df00ae2a0192", 00:21:53.317 "is_configured": true, 00:21:53.317 "data_offset": 2048, 00:21:53.317 "data_size": 63488 00:21:53.317 }, 00:21:53.317 { 00:21:53.317 "name": "BaseBdev3", 00:21:53.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:53.317 "is_configured": false, 00:21:53.317 "data_offset": 0, 00:21:53.317 "data_size": 0 00:21:53.317 }, 00:21:53.317 { 00:21:53.317 "name": "BaseBdev4", 00:21:53.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:53.317 "is_configured": false, 00:21:53.317 "data_offset": 0, 00:21:53.317 "data_size": 0 00:21:53.317 } 00:21:53.317 ] 00:21:53.317 }' 00:21:53.317 00:16:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:53.317 00:16:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:53.881 00:16:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:54.140 [2024-07-16 00:16:41.009867] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:54.140 BaseBdev3 00:21:54.140 00:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:21:54.140 00:16:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local 
bdev_name=BaseBdev3 00:21:54.140 00:16:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:54.140 00:16:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:54.140 00:16:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:54.140 00:16:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:54.140 00:16:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:54.415 00:16:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:54.687 [ 00:21:54.687 { 00:21:54.687 "name": "BaseBdev3", 00:21:54.687 "aliases": [ 00:21:54.687 "d8aac119-6317-4103-b9ac-3b91801fff6c" 00:21:54.687 ], 00:21:54.687 "product_name": "Malloc disk", 00:21:54.687 "block_size": 512, 00:21:54.687 "num_blocks": 65536, 00:21:54.687 "uuid": "d8aac119-6317-4103-b9ac-3b91801fff6c", 00:21:54.687 "assigned_rate_limits": { 00:21:54.687 "rw_ios_per_sec": 0, 00:21:54.687 "rw_mbytes_per_sec": 0, 00:21:54.687 "r_mbytes_per_sec": 0, 00:21:54.687 "w_mbytes_per_sec": 0 00:21:54.687 }, 00:21:54.687 "claimed": true, 00:21:54.687 "claim_type": "exclusive_write", 00:21:54.687 "zoned": false, 00:21:54.687 "supported_io_types": { 00:21:54.687 "read": true, 00:21:54.687 "write": true, 00:21:54.687 "unmap": true, 00:21:54.687 "flush": true, 00:21:54.687 "reset": true, 00:21:54.687 "nvme_admin": false, 00:21:54.687 "nvme_io": false, 00:21:54.687 "nvme_io_md": false, 00:21:54.687 "write_zeroes": true, 00:21:54.687 "zcopy": true, 00:21:54.687 "get_zone_info": false, 00:21:54.687 "zone_management": false, 00:21:54.687 "zone_append": false, 00:21:54.687 
"compare": false, 00:21:54.687 "compare_and_write": false, 00:21:54.687 "abort": true, 00:21:54.687 "seek_hole": false, 00:21:54.687 "seek_data": false, 00:21:54.687 "copy": true, 00:21:54.687 "nvme_iov_md": false 00:21:54.687 }, 00:21:54.687 "memory_domains": [ 00:21:54.687 { 00:21:54.687 "dma_device_id": "system", 00:21:54.687 "dma_device_type": 1 00:21:54.687 }, 00:21:54.687 { 00:21:54.687 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:54.687 "dma_device_type": 2 00:21:54.687 } 00:21:54.687 ], 00:21:54.687 "driver_specific": {} 00:21:54.687 } 00:21:54.687 ] 00:21:54.688 00:16:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:54.688 00:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:54.688 00:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:54.688 00:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:54.688 00:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:54.688 00:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:54.688 00:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:54.688 00:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:54.688 00:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:54.688 00:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:54.688 00:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:54.688 00:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:54.688 00:16:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:54.688 00:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:54.688 00:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:54.947 00:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:54.947 "name": "Existed_Raid", 00:21:54.947 "uuid": "48b380ab-3c47-4763-a13c-189eee656851", 00:21:54.947 "strip_size_kb": 0, 00:21:54.947 "state": "configuring", 00:21:54.947 "raid_level": "raid1", 00:21:54.947 "superblock": true, 00:21:54.947 "num_base_bdevs": 4, 00:21:54.947 "num_base_bdevs_discovered": 3, 00:21:54.947 "num_base_bdevs_operational": 4, 00:21:54.947 "base_bdevs_list": [ 00:21:54.947 { 00:21:54.947 "name": "BaseBdev1", 00:21:54.947 "uuid": "28ee48a1-c951-42c6-bdcc-3147001e1106", 00:21:54.947 "is_configured": true, 00:21:54.947 "data_offset": 2048, 00:21:54.947 "data_size": 63488 00:21:54.947 }, 00:21:54.947 { 00:21:54.947 "name": "BaseBdev2", 00:21:54.947 "uuid": "059223b1-5cef-408a-bd0d-df00ae2a0192", 00:21:54.947 "is_configured": true, 00:21:54.947 "data_offset": 2048, 00:21:54.947 "data_size": 63488 00:21:54.947 }, 00:21:54.947 { 00:21:54.947 "name": "BaseBdev3", 00:21:54.947 "uuid": "d8aac119-6317-4103-b9ac-3b91801fff6c", 00:21:54.947 "is_configured": true, 00:21:54.947 "data_offset": 2048, 00:21:54.947 "data_size": 63488 00:21:54.947 }, 00:21:54.947 { 00:21:54.947 "name": "BaseBdev4", 00:21:54.947 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.947 "is_configured": false, 00:21:54.947 "data_offset": 0, 00:21:54.947 "data_size": 0 00:21:54.947 } 00:21:54.947 ] 00:21:54.947 }' 00:21:54.947 00:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:54.947 00:16:41 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:55.515 00:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:55.774 [2024-07-16 00:16:42.609481] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:55.774 [2024-07-16 00:16:42.609652] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23b4350 00:21:55.774 [2024-07-16 00:16:42.609666] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:55.774 [2024-07-16 00:16:42.609840] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23b4020 00:21:55.774 [2024-07-16 00:16:42.609983] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23b4350 00:21:55.774 [2024-07-16 00:16:42.609994] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x23b4350 00:21:55.774 [2024-07-16 00:16:42.610090] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:55.774 BaseBdev4 00:21:55.774 00:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:21:55.774 00:16:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:55.774 00:16:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:55.774 00:16:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:55.774 00:16:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:55.774 00:16:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:55.774 00:16:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:56.033 00:16:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:56.292 [ 00:21:56.292 { 00:21:56.292 "name": "BaseBdev4", 00:21:56.292 "aliases": [ 00:21:56.292 "189054ad-4ef2-46b5-a882-f07703b3a9de" 00:21:56.292 ], 00:21:56.292 "product_name": "Malloc disk", 00:21:56.292 "block_size": 512, 00:21:56.292 "num_blocks": 65536, 00:21:56.292 "uuid": "189054ad-4ef2-46b5-a882-f07703b3a9de", 00:21:56.292 "assigned_rate_limits": { 00:21:56.292 "rw_ios_per_sec": 0, 00:21:56.292 "rw_mbytes_per_sec": 0, 00:21:56.292 "r_mbytes_per_sec": 0, 00:21:56.292 "w_mbytes_per_sec": 0 00:21:56.292 }, 00:21:56.292 "claimed": true, 00:21:56.292 "claim_type": "exclusive_write", 00:21:56.292 "zoned": false, 00:21:56.292 "supported_io_types": { 00:21:56.292 "read": true, 00:21:56.292 "write": true, 00:21:56.292 "unmap": true, 00:21:56.292 "flush": true, 00:21:56.292 "reset": true, 00:21:56.292 "nvme_admin": false, 00:21:56.292 "nvme_io": false, 00:21:56.292 "nvme_io_md": false, 00:21:56.292 "write_zeroes": true, 00:21:56.292 "zcopy": true, 00:21:56.292 "get_zone_info": false, 00:21:56.292 "zone_management": false, 00:21:56.292 "zone_append": false, 00:21:56.292 "compare": false, 00:21:56.292 "compare_and_write": false, 00:21:56.292 "abort": true, 00:21:56.292 "seek_hole": false, 00:21:56.292 "seek_data": false, 00:21:56.292 "copy": true, 00:21:56.292 "nvme_iov_md": false 00:21:56.292 }, 00:21:56.292 "memory_domains": [ 00:21:56.292 { 00:21:56.292 "dma_device_id": "system", 00:21:56.292 "dma_device_type": 1 00:21:56.293 }, 00:21:56.293 { 00:21:56.293 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:56.293 "dma_device_type": 2 00:21:56.293 } 00:21:56.293 ], 00:21:56.293 "driver_specific": {} 00:21:56.293 } 00:21:56.293 ] 
00:21:56.293 00:16:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:56.293 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:56.293 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:56.293 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:56.293 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:56.293 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:56.293 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:56.293 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:56.293 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:56.293 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:56.293 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:56.293 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:56.293 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:56.293 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.293 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:56.551 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:56.551 "name": "Existed_Raid", 00:21:56.551 
"uuid": "48b380ab-3c47-4763-a13c-189eee656851", 00:21:56.551 "strip_size_kb": 0, 00:21:56.551 "state": "online", 00:21:56.551 "raid_level": "raid1", 00:21:56.551 "superblock": true, 00:21:56.551 "num_base_bdevs": 4, 00:21:56.551 "num_base_bdevs_discovered": 4, 00:21:56.551 "num_base_bdevs_operational": 4, 00:21:56.551 "base_bdevs_list": [ 00:21:56.551 { 00:21:56.551 "name": "BaseBdev1", 00:21:56.551 "uuid": "28ee48a1-c951-42c6-bdcc-3147001e1106", 00:21:56.551 "is_configured": true, 00:21:56.551 "data_offset": 2048, 00:21:56.551 "data_size": 63488 00:21:56.551 }, 00:21:56.551 { 00:21:56.551 "name": "BaseBdev2", 00:21:56.551 "uuid": "059223b1-5cef-408a-bd0d-df00ae2a0192", 00:21:56.551 "is_configured": true, 00:21:56.551 "data_offset": 2048, 00:21:56.551 "data_size": 63488 00:21:56.551 }, 00:21:56.551 { 00:21:56.551 "name": "BaseBdev3", 00:21:56.551 "uuid": "d8aac119-6317-4103-b9ac-3b91801fff6c", 00:21:56.551 "is_configured": true, 00:21:56.551 "data_offset": 2048, 00:21:56.551 "data_size": 63488 00:21:56.551 }, 00:21:56.551 { 00:21:56.551 "name": "BaseBdev4", 00:21:56.551 "uuid": "189054ad-4ef2-46b5-a882-f07703b3a9de", 00:21:56.551 "is_configured": true, 00:21:56.551 "data_offset": 2048, 00:21:56.551 "data_size": 63488 00:21:56.551 } 00:21:56.551 ] 00:21:56.551 }' 00:21:56.551 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:56.551 00:16:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:57.131 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:57.131 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:57.131 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:57.131 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:57.131 00:16:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:57.131 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:57.131 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:57.131 00:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:57.391 [2024-07-16 00:16:44.173962] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:57.391 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:57.391 "name": "Existed_Raid", 00:21:57.391 "aliases": [ 00:21:57.391 "48b380ab-3c47-4763-a13c-189eee656851" 00:21:57.391 ], 00:21:57.391 "product_name": "Raid Volume", 00:21:57.391 "block_size": 512, 00:21:57.391 "num_blocks": 63488, 00:21:57.391 "uuid": "48b380ab-3c47-4763-a13c-189eee656851", 00:21:57.391 "assigned_rate_limits": { 00:21:57.391 "rw_ios_per_sec": 0, 00:21:57.391 "rw_mbytes_per_sec": 0, 00:21:57.391 "r_mbytes_per_sec": 0, 00:21:57.391 "w_mbytes_per_sec": 0 00:21:57.391 }, 00:21:57.391 "claimed": false, 00:21:57.391 "zoned": false, 00:21:57.391 "supported_io_types": { 00:21:57.391 "read": true, 00:21:57.391 "write": true, 00:21:57.391 "unmap": false, 00:21:57.391 "flush": false, 00:21:57.391 "reset": true, 00:21:57.391 "nvme_admin": false, 00:21:57.391 "nvme_io": false, 00:21:57.391 "nvme_io_md": false, 00:21:57.391 "write_zeroes": true, 00:21:57.391 "zcopy": false, 00:21:57.391 "get_zone_info": false, 00:21:57.391 "zone_management": false, 00:21:57.391 "zone_append": false, 00:21:57.391 "compare": false, 00:21:57.391 "compare_and_write": false, 00:21:57.391 "abort": false, 00:21:57.391 "seek_hole": false, 00:21:57.391 "seek_data": false, 00:21:57.391 "copy": false, 00:21:57.391 "nvme_iov_md": false 00:21:57.391 }, 00:21:57.391 
"memory_domains": [ 00:21:57.391 { 00:21:57.391 "dma_device_id": "system", 00:21:57.391 "dma_device_type": 1 00:21:57.391 }, 00:21:57.391 { 00:21:57.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:57.391 "dma_device_type": 2 00:21:57.391 }, 00:21:57.391 { 00:21:57.391 "dma_device_id": "system", 00:21:57.391 "dma_device_type": 1 00:21:57.391 }, 00:21:57.391 { 00:21:57.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:57.391 "dma_device_type": 2 00:21:57.391 }, 00:21:57.391 { 00:21:57.391 "dma_device_id": "system", 00:21:57.391 "dma_device_type": 1 00:21:57.391 }, 00:21:57.391 { 00:21:57.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:57.391 "dma_device_type": 2 00:21:57.391 }, 00:21:57.391 { 00:21:57.391 "dma_device_id": "system", 00:21:57.391 "dma_device_type": 1 00:21:57.391 }, 00:21:57.391 { 00:21:57.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:57.391 "dma_device_type": 2 00:21:57.391 } 00:21:57.391 ], 00:21:57.391 "driver_specific": { 00:21:57.391 "raid": { 00:21:57.391 "uuid": "48b380ab-3c47-4763-a13c-189eee656851", 00:21:57.391 "strip_size_kb": 0, 00:21:57.391 "state": "online", 00:21:57.391 "raid_level": "raid1", 00:21:57.391 "superblock": true, 00:21:57.391 "num_base_bdevs": 4, 00:21:57.391 "num_base_bdevs_discovered": 4, 00:21:57.391 "num_base_bdevs_operational": 4, 00:21:57.391 "base_bdevs_list": [ 00:21:57.391 { 00:21:57.391 "name": "BaseBdev1", 00:21:57.391 "uuid": "28ee48a1-c951-42c6-bdcc-3147001e1106", 00:21:57.391 "is_configured": true, 00:21:57.391 "data_offset": 2048, 00:21:57.391 "data_size": 63488 00:21:57.391 }, 00:21:57.391 { 00:21:57.391 "name": "BaseBdev2", 00:21:57.391 "uuid": "059223b1-5cef-408a-bd0d-df00ae2a0192", 00:21:57.391 "is_configured": true, 00:21:57.391 "data_offset": 2048, 00:21:57.391 "data_size": 63488 00:21:57.391 }, 00:21:57.391 { 00:21:57.391 "name": "BaseBdev3", 00:21:57.391 "uuid": "d8aac119-6317-4103-b9ac-3b91801fff6c", 00:21:57.391 "is_configured": true, 00:21:57.391 "data_offset": 2048, 00:21:57.391 
"data_size": 63488 00:21:57.391 }, 00:21:57.391 { 00:21:57.391 "name": "BaseBdev4", 00:21:57.391 "uuid": "189054ad-4ef2-46b5-a882-f07703b3a9de", 00:21:57.391 "is_configured": true, 00:21:57.391 "data_offset": 2048, 00:21:57.391 "data_size": 63488 00:21:57.391 } 00:21:57.391 ] 00:21:57.391 } 00:21:57.391 } 00:21:57.391 }' 00:21:57.391 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:57.391 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:57.391 BaseBdev2 00:21:57.391 BaseBdev3 00:21:57.391 BaseBdev4' 00:21:57.391 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:57.391 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:57.391 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:57.650 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:57.650 "name": "BaseBdev1", 00:21:57.650 "aliases": [ 00:21:57.650 "28ee48a1-c951-42c6-bdcc-3147001e1106" 00:21:57.650 ], 00:21:57.650 "product_name": "Malloc disk", 00:21:57.650 "block_size": 512, 00:21:57.650 "num_blocks": 65536, 00:21:57.650 "uuid": "28ee48a1-c951-42c6-bdcc-3147001e1106", 00:21:57.650 "assigned_rate_limits": { 00:21:57.650 "rw_ios_per_sec": 0, 00:21:57.650 "rw_mbytes_per_sec": 0, 00:21:57.650 "r_mbytes_per_sec": 0, 00:21:57.650 "w_mbytes_per_sec": 0 00:21:57.650 }, 00:21:57.650 "claimed": true, 00:21:57.650 "claim_type": "exclusive_write", 00:21:57.650 "zoned": false, 00:21:57.650 "supported_io_types": { 00:21:57.650 "read": true, 00:21:57.650 "write": true, 00:21:57.650 "unmap": true, 00:21:57.650 "flush": true, 00:21:57.650 "reset": true, 
00:21:57.650 "nvme_admin": false, 00:21:57.650 "nvme_io": false, 00:21:57.650 "nvme_io_md": false, 00:21:57.650 "write_zeroes": true, 00:21:57.650 "zcopy": true, 00:21:57.650 "get_zone_info": false, 00:21:57.650 "zone_management": false, 00:21:57.650 "zone_append": false, 00:21:57.650 "compare": false, 00:21:57.650 "compare_and_write": false, 00:21:57.650 "abort": true, 00:21:57.650 "seek_hole": false, 00:21:57.650 "seek_data": false, 00:21:57.650 "copy": true, 00:21:57.650 "nvme_iov_md": false 00:21:57.650 }, 00:21:57.650 "memory_domains": [ 00:21:57.650 { 00:21:57.650 "dma_device_id": "system", 00:21:57.650 "dma_device_type": 1 00:21:57.650 }, 00:21:57.650 { 00:21:57.650 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:57.650 "dma_device_type": 2 00:21:57.650 } 00:21:57.650 ], 00:21:57.650 "driver_specific": {} 00:21:57.650 }' 00:21:57.650 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:57.650 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:57.650 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:57.650 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:57.909 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:57.909 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:57.909 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:57.909 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:57.909 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:57.909 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:57.909 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:21:57.909 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:57.909 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:57.909 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:57.909 00:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:58.167 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:58.167 "name": "BaseBdev2", 00:21:58.167 "aliases": [ 00:21:58.167 "059223b1-5cef-408a-bd0d-df00ae2a0192" 00:21:58.167 ], 00:21:58.167 "product_name": "Malloc disk", 00:21:58.167 "block_size": 512, 00:21:58.167 "num_blocks": 65536, 00:21:58.167 "uuid": "059223b1-5cef-408a-bd0d-df00ae2a0192", 00:21:58.167 "assigned_rate_limits": { 00:21:58.167 "rw_ios_per_sec": 0, 00:21:58.167 "rw_mbytes_per_sec": 0, 00:21:58.167 "r_mbytes_per_sec": 0, 00:21:58.167 "w_mbytes_per_sec": 0 00:21:58.167 }, 00:21:58.167 "claimed": true, 00:21:58.167 "claim_type": "exclusive_write", 00:21:58.167 "zoned": false, 00:21:58.167 "supported_io_types": { 00:21:58.167 "read": true, 00:21:58.167 "write": true, 00:21:58.167 "unmap": true, 00:21:58.167 "flush": true, 00:21:58.167 "reset": true, 00:21:58.167 "nvme_admin": false, 00:21:58.167 "nvme_io": false, 00:21:58.167 "nvme_io_md": false, 00:21:58.167 "write_zeroes": true, 00:21:58.167 "zcopy": true, 00:21:58.167 "get_zone_info": false, 00:21:58.167 "zone_management": false, 00:21:58.167 "zone_append": false, 00:21:58.167 "compare": false, 00:21:58.167 "compare_and_write": false, 00:21:58.167 "abort": true, 00:21:58.167 "seek_hole": false, 00:21:58.167 "seek_data": false, 00:21:58.167 "copy": true, 00:21:58.167 "nvme_iov_md": false 00:21:58.167 }, 00:21:58.167 "memory_domains": [ 00:21:58.167 { 
00:21:58.167 "dma_device_id": "system", 00:21:58.167 "dma_device_type": 1 00:21:58.167 }, 00:21:58.167 { 00:21:58.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:58.167 "dma_device_type": 2 00:21:58.167 } 00:21:58.167 ], 00:21:58.167 "driver_specific": {} 00:21:58.167 }' 00:21:58.167 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:58.425 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:58.425 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:58.425 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:58.425 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:58.425 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:58.425 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:58.425 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:58.425 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:58.425 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:58.683 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:58.683 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:58.683 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:58.683 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:58.683 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:58.943 00:16:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:58.943 "name": "BaseBdev3", 00:21:58.943 "aliases": [ 00:21:58.943 "d8aac119-6317-4103-b9ac-3b91801fff6c" 00:21:58.943 ], 00:21:58.943 "product_name": "Malloc disk", 00:21:58.943 "block_size": 512, 00:21:58.943 "num_blocks": 65536, 00:21:58.943 "uuid": "d8aac119-6317-4103-b9ac-3b91801fff6c", 00:21:58.943 "assigned_rate_limits": { 00:21:58.943 "rw_ios_per_sec": 0, 00:21:58.943 "rw_mbytes_per_sec": 0, 00:21:58.943 "r_mbytes_per_sec": 0, 00:21:58.943 "w_mbytes_per_sec": 0 00:21:58.943 }, 00:21:58.943 "claimed": true, 00:21:58.943 "claim_type": "exclusive_write", 00:21:58.943 "zoned": false, 00:21:58.943 "supported_io_types": { 00:21:58.943 "read": true, 00:21:58.943 "write": true, 00:21:58.943 "unmap": true, 00:21:58.943 "flush": true, 00:21:58.943 "reset": true, 00:21:58.943 "nvme_admin": false, 00:21:58.943 "nvme_io": false, 00:21:58.943 "nvme_io_md": false, 00:21:58.943 "write_zeroes": true, 00:21:58.943 "zcopy": true, 00:21:58.943 "get_zone_info": false, 00:21:58.943 "zone_management": false, 00:21:58.943 "zone_append": false, 00:21:58.943 "compare": false, 00:21:58.943 "compare_and_write": false, 00:21:58.943 "abort": true, 00:21:58.943 "seek_hole": false, 00:21:58.943 "seek_data": false, 00:21:58.943 "copy": true, 00:21:58.943 "nvme_iov_md": false 00:21:58.943 }, 00:21:58.943 "memory_domains": [ 00:21:58.943 { 00:21:58.943 "dma_device_id": "system", 00:21:58.943 "dma_device_type": 1 00:21:58.943 }, 00:21:58.943 { 00:21:58.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:58.943 "dma_device_type": 2 00:21:58.943 } 00:21:58.943 ], 00:21:58.943 "driver_specific": {} 00:21:58.943 }' 00:21:58.943 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:58.943 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:58.943 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:21:58.943 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:58.943 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:58.943 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:58.943 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:59.202 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:59.202 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:59.202 00:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:59.202 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:59.202 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:59.202 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:59.202 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:59.202 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:59.461 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:59.461 "name": "BaseBdev4", 00:21:59.461 "aliases": [ 00:21:59.461 "189054ad-4ef2-46b5-a882-f07703b3a9de" 00:21:59.461 ], 00:21:59.461 "product_name": "Malloc disk", 00:21:59.461 "block_size": 512, 00:21:59.461 "num_blocks": 65536, 00:21:59.461 "uuid": "189054ad-4ef2-46b5-a882-f07703b3a9de", 00:21:59.461 "assigned_rate_limits": { 00:21:59.461 "rw_ios_per_sec": 0, 00:21:59.461 "rw_mbytes_per_sec": 0, 00:21:59.461 "r_mbytes_per_sec": 0, 00:21:59.461 "w_mbytes_per_sec": 0 
00:21:59.461 }, 00:21:59.461 "claimed": true, 00:21:59.461 "claim_type": "exclusive_write", 00:21:59.461 "zoned": false, 00:21:59.461 "supported_io_types": { 00:21:59.461 "read": true, 00:21:59.461 "write": true, 00:21:59.461 "unmap": true, 00:21:59.461 "flush": true, 00:21:59.461 "reset": true, 00:21:59.461 "nvme_admin": false, 00:21:59.461 "nvme_io": false, 00:21:59.461 "nvme_io_md": false, 00:21:59.461 "write_zeroes": true, 00:21:59.461 "zcopy": true, 00:21:59.461 "get_zone_info": false, 00:21:59.461 "zone_management": false, 00:21:59.461 "zone_append": false, 00:21:59.461 "compare": false, 00:21:59.461 "compare_and_write": false, 00:21:59.461 "abort": true, 00:21:59.461 "seek_hole": false, 00:21:59.461 "seek_data": false, 00:21:59.461 "copy": true, 00:21:59.461 "nvme_iov_md": false 00:21:59.461 }, 00:21:59.461 "memory_domains": [ 00:21:59.461 { 00:21:59.461 "dma_device_id": "system", 00:21:59.461 "dma_device_type": 1 00:21:59.461 }, 00:21:59.461 { 00:21:59.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:59.461 "dma_device_type": 2 00:21:59.461 } 00:21:59.461 ], 00:21:59.461 "driver_specific": {} 00:21:59.461 }' 00:21:59.461 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:59.461 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:59.461 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:59.461 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:59.720 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:59.720 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:59.720 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:59.720 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:59.720 
00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:59.720 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:59.720 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:59.720 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:59.720 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:59.978 [2024-07-16 00:16:46.884891] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:59.978 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:59.978 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:59.978 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:59.978 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:21:59.978 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:59.978 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:21:59.978 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:59.978 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:59.978 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:59.978 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:59.978 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:21:59.978 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:59.978 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:59.978 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:59.978 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:59.978 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:59.978 00:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:00.237 00:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:00.237 "name": "Existed_Raid", 00:22:00.237 "uuid": "48b380ab-3c47-4763-a13c-189eee656851", 00:22:00.237 "strip_size_kb": 0, 00:22:00.237 "state": "online", 00:22:00.237 "raid_level": "raid1", 00:22:00.237 "superblock": true, 00:22:00.237 "num_base_bdevs": 4, 00:22:00.237 "num_base_bdevs_discovered": 3, 00:22:00.237 "num_base_bdevs_operational": 3, 00:22:00.237 "base_bdevs_list": [ 00:22:00.237 { 00:22:00.237 "name": null, 00:22:00.237 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:00.238 "is_configured": false, 00:22:00.238 "data_offset": 2048, 00:22:00.238 "data_size": 63488 00:22:00.238 }, 00:22:00.238 { 00:22:00.238 "name": "BaseBdev2", 00:22:00.238 "uuid": "059223b1-5cef-408a-bd0d-df00ae2a0192", 00:22:00.238 "is_configured": true, 00:22:00.238 "data_offset": 2048, 00:22:00.238 "data_size": 63488 00:22:00.238 }, 00:22:00.238 { 00:22:00.238 "name": "BaseBdev3", 00:22:00.238 "uuid": "d8aac119-6317-4103-b9ac-3b91801fff6c", 00:22:00.238 "is_configured": true, 00:22:00.238 "data_offset": 2048, 00:22:00.238 "data_size": 63488 00:22:00.238 }, 00:22:00.238 { 00:22:00.238 "name": 
"BaseBdev4", 00:22:00.238 "uuid": "189054ad-4ef2-46b5-a882-f07703b3a9de", 00:22:00.238 "is_configured": true, 00:22:00.238 "data_offset": 2048, 00:22:00.238 "data_size": 63488 00:22:00.238 } 00:22:00.238 ] 00:22:00.238 }' 00:22:00.238 00:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:00.238 00:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:01.173 00:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:01.173 00:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:01.173 00:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.173 00:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:01.173 00:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:01.173 00:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:01.173 00:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:01.431 [2024-07-16 00:16:48.237604] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:01.431 00:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:01.431 00:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:01.431 00:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.431 00:16:48 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:01.787 00:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:01.787 00:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:01.787 00:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:22:02.046 [2024-07-16 00:16:48.747526] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:02.046 00:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:02.046 00:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:02.046 00:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:02.046 00:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:02.306 00:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:02.306 00:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:02.306 00:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:22:02.306 [2024-07-16 00:16:49.248730] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:22:02.306 [2024-07-16 00:16:49.248809] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:02.565 [2024-07-16 00:16:49.259596] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:02.565 [2024-07-16 00:16:49.259627] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:02.565 [2024-07-16 00:16:49.259639] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23b4350 name Existed_Raid, state offline 00:22:02.565 00:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:02.565 00:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:02.565 00:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:02.565 00:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:02.824 00:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:02.824 00:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:02.824 00:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:22:02.824 00:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:22:02.824 00:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:02.824 00:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:02.824 BaseBdev2 00:22:02.824 00:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:22:02.824 00:16:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:02.824 00:16:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:02.824 00:16:49 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:22:02.824 00:16:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:02.824 00:16:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:02.824 00:16:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:03.082 00:16:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:03.341 [ 00:22:03.341 { 00:22:03.341 "name": "BaseBdev2", 00:22:03.341 "aliases": [ 00:22:03.341 "895e848c-fce0-4331-a9ab-108f58d83952" 00:22:03.341 ], 00:22:03.341 "product_name": "Malloc disk", 00:22:03.341 "block_size": 512, 00:22:03.341 "num_blocks": 65536, 00:22:03.341 "uuid": "895e848c-fce0-4331-a9ab-108f58d83952", 00:22:03.341 "assigned_rate_limits": { 00:22:03.341 "rw_ios_per_sec": 0, 00:22:03.341 "rw_mbytes_per_sec": 0, 00:22:03.341 "r_mbytes_per_sec": 0, 00:22:03.341 "w_mbytes_per_sec": 0 00:22:03.341 }, 00:22:03.341 "claimed": false, 00:22:03.341 "zoned": false, 00:22:03.341 "supported_io_types": { 00:22:03.341 "read": true, 00:22:03.341 "write": true, 00:22:03.341 "unmap": true, 00:22:03.341 "flush": true, 00:22:03.341 "reset": true, 00:22:03.341 "nvme_admin": false, 00:22:03.341 "nvme_io": false, 00:22:03.341 "nvme_io_md": false, 00:22:03.341 "write_zeroes": true, 00:22:03.341 "zcopy": true, 00:22:03.341 "get_zone_info": false, 00:22:03.341 "zone_management": false, 00:22:03.341 "zone_append": false, 00:22:03.341 "compare": false, 00:22:03.341 "compare_and_write": false, 00:22:03.341 "abort": true, 00:22:03.341 "seek_hole": false, 00:22:03.341 "seek_data": false, 00:22:03.341 "copy": true, 00:22:03.341 "nvme_iov_md": false 00:22:03.341 }, 00:22:03.341 
"memory_domains": [ 00:22:03.341 { 00:22:03.341 "dma_device_id": "system", 00:22:03.341 "dma_device_type": 1 00:22:03.341 }, 00:22:03.341 { 00:22:03.341 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:03.341 "dma_device_type": 2 00:22:03.341 } 00:22:03.341 ], 00:22:03.341 "driver_specific": {} 00:22:03.341 } 00:22:03.341 ] 00:22:03.341 00:16:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:03.341 00:16:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:03.341 00:16:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:03.341 00:16:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:03.600 BaseBdev3 00:22:03.600 00:16:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:22:03.600 00:16:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:22:03.600 00:16:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:03.600 00:16:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:03.600 00:16:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:03.600 00:16:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:03.600 00:16:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:03.860 00:16:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 -t 2000 00:22:04.120 [ 00:22:04.120 { 00:22:04.120 "name": "BaseBdev3", 00:22:04.120 "aliases": [ 00:22:04.120 "d5b7adb9-9c17-4950-99bf-f96b3ca5e868" 00:22:04.120 ], 00:22:04.120 "product_name": "Malloc disk", 00:22:04.120 "block_size": 512, 00:22:04.120 "num_blocks": 65536, 00:22:04.120 "uuid": "d5b7adb9-9c17-4950-99bf-f96b3ca5e868", 00:22:04.120 "assigned_rate_limits": { 00:22:04.120 "rw_ios_per_sec": 0, 00:22:04.120 "rw_mbytes_per_sec": 0, 00:22:04.120 "r_mbytes_per_sec": 0, 00:22:04.120 "w_mbytes_per_sec": 0 00:22:04.120 }, 00:22:04.120 "claimed": false, 00:22:04.120 "zoned": false, 00:22:04.120 "supported_io_types": { 00:22:04.120 "read": true, 00:22:04.120 "write": true, 00:22:04.120 "unmap": true, 00:22:04.120 "flush": true, 00:22:04.120 "reset": true, 00:22:04.120 "nvme_admin": false, 00:22:04.120 "nvme_io": false, 00:22:04.120 "nvme_io_md": false, 00:22:04.120 "write_zeroes": true, 00:22:04.120 "zcopy": true, 00:22:04.120 "get_zone_info": false, 00:22:04.120 "zone_management": false, 00:22:04.120 "zone_append": false, 00:22:04.120 "compare": false, 00:22:04.120 "compare_and_write": false, 00:22:04.120 "abort": true, 00:22:04.120 "seek_hole": false, 00:22:04.120 "seek_data": false, 00:22:04.120 "copy": true, 00:22:04.120 "nvme_iov_md": false 00:22:04.120 }, 00:22:04.120 "memory_domains": [ 00:22:04.120 { 00:22:04.120 "dma_device_id": "system", 00:22:04.120 "dma_device_type": 1 00:22:04.120 }, 00:22:04.120 { 00:22:04.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:04.120 "dma_device_type": 2 00:22:04.120 } 00:22:04.120 ], 00:22:04.120 "driver_specific": {} 00:22:04.120 } 00:22:04.120 ] 00:22:04.120 00:16:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:04.120 00:16:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:04.120 00:16:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:04.120 00:16:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:04.379 BaseBdev4 00:22:04.379 00:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:22:04.379 00:16:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:22:04.379 00:16:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:04.379 00:16:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:04.379 00:16:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:04.379 00:16:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:04.379 00:16:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:04.639 00:16:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:04.898 [ 00:22:04.898 { 00:22:04.898 "name": "BaseBdev4", 00:22:04.898 "aliases": [ 00:22:04.898 "d8f4fa61-3116-4faa-b678-74b9b767729b" 00:22:04.898 ], 00:22:04.898 "product_name": "Malloc disk", 00:22:04.898 "block_size": 512, 00:22:04.898 "num_blocks": 65536, 00:22:04.898 "uuid": "d8f4fa61-3116-4faa-b678-74b9b767729b", 00:22:04.898 "assigned_rate_limits": { 00:22:04.898 "rw_ios_per_sec": 0, 00:22:04.898 "rw_mbytes_per_sec": 0, 00:22:04.898 "r_mbytes_per_sec": 0, 00:22:04.898 "w_mbytes_per_sec": 0 00:22:04.898 }, 00:22:04.898 "claimed": false, 00:22:04.898 "zoned": false, 00:22:04.898 "supported_io_types": { 00:22:04.898 "read": true, 
00:22:04.898 "write": true, 00:22:04.898 "unmap": true, 00:22:04.898 "flush": true, 00:22:04.898 "reset": true, 00:22:04.898 "nvme_admin": false, 00:22:04.898 "nvme_io": false, 00:22:04.898 "nvme_io_md": false, 00:22:04.898 "write_zeroes": true, 00:22:04.898 "zcopy": true, 00:22:04.898 "get_zone_info": false, 00:22:04.898 "zone_management": false, 00:22:04.898 "zone_append": false, 00:22:04.898 "compare": false, 00:22:04.898 "compare_and_write": false, 00:22:04.898 "abort": true, 00:22:04.898 "seek_hole": false, 00:22:04.898 "seek_data": false, 00:22:04.898 "copy": true, 00:22:04.898 "nvme_iov_md": false 00:22:04.898 }, 00:22:04.898 "memory_domains": [ 00:22:04.898 { 00:22:04.898 "dma_device_id": "system", 00:22:04.898 "dma_device_type": 1 00:22:04.898 }, 00:22:04.898 { 00:22:04.898 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:04.898 "dma_device_type": 2 00:22:04.898 } 00:22:04.898 ], 00:22:04.898 "driver_specific": {} 00:22:04.898 } 00:22:04.898 ] 00:22:04.898 00:16:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:04.898 00:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:04.898 00:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:04.898 00:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:05.157 [2024-07-16 00:16:51.913034] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:05.157 [2024-07-16 00:16:51.913073] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:05.157 [2024-07-16 00:16:51.913092] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:05.157 [2024-07-16 00:16:51.914452] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:05.157 [2024-07-16 00:16:51.914499] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:05.157 00:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:05.157 00:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:05.157 00:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:05.157 00:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:05.157 00:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:05.157 00:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:05.157 00:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:05.157 00:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:05.157 00:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:05.157 00:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:05.157 00:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:05.157 00:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:05.416 00:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:05.416 "name": "Existed_Raid", 00:22:05.416 "uuid": "7185f416-6801-4f7f-a42b-88cc37770f1f", 00:22:05.416 "strip_size_kb": 0, 00:22:05.416 "state": 
"configuring", 00:22:05.416 "raid_level": "raid1", 00:22:05.416 "superblock": true, 00:22:05.416 "num_base_bdevs": 4, 00:22:05.416 "num_base_bdevs_discovered": 3, 00:22:05.416 "num_base_bdevs_operational": 4, 00:22:05.416 "base_bdevs_list": [ 00:22:05.416 { 00:22:05.416 "name": "BaseBdev1", 00:22:05.416 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:05.416 "is_configured": false, 00:22:05.416 "data_offset": 0, 00:22:05.416 "data_size": 0 00:22:05.416 }, 00:22:05.416 { 00:22:05.416 "name": "BaseBdev2", 00:22:05.416 "uuid": "895e848c-fce0-4331-a9ab-108f58d83952", 00:22:05.416 "is_configured": true, 00:22:05.416 "data_offset": 2048, 00:22:05.416 "data_size": 63488 00:22:05.416 }, 00:22:05.416 { 00:22:05.416 "name": "BaseBdev3", 00:22:05.416 "uuid": "d5b7adb9-9c17-4950-99bf-f96b3ca5e868", 00:22:05.416 "is_configured": true, 00:22:05.416 "data_offset": 2048, 00:22:05.416 "data_size": 63488 00:22:05.416 }, 00:22:05.416 { 00:22:05.416 "name": "BaseBdev4", 00:22:05.416 "uuid": "d8f4fa61-3116-4faa-b678-74b9b767729b", 00:22:05.416 "is_configured": true, 00:22:05.416 "data_offset": 2048, 00:22:05.416 "data_size": 63488 00:22:05.416 } 00:22:05.416 ] 00:22:05.416 }' 00:22:05.417 00:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:05.417 00:16:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:05.985 00:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:06.244 [2024-07-16 00:16:53.023942] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:06.244 00:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:06.244 00:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:06.244 
00:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:06.244 00:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:06.244 00:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:06.244 00:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:06.244 00:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:06.244 00:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:06.244 00:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:06.244 00:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:06.244 00:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.244 00:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:06.502 00:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:06.502 "name": "Existed_Raid", 00:22:06.502 "uuid": "7185f416-6801-4f7f-a42b-88cc37770f1f", 00:22:06.502 "strip_size_kb": 0, 00:22:06.502 "state": "configuring", 00:22:06.502 "raid_level": "raid1", 00:22:06.502 "superblock": true, 00:22:06.502 "num_base_bdevs": 4, 00:22:06.502 "num_base_bdevs_discovered": 2, 00:22:06.502 "num_base_bdevs_operational": 4, 00:22:06.502 "base_bdevs_list": [ 00:22:06.502 { 00:22:06.502 "name": "BaseBdev1", 00:22:06.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:06.502 "is_configured": false, 00:22:06.502 "data_offset": 0, 00:22:06.502 "data_size": 0 00:22:06.502 }, 00:22:06.502 { 00:22:06.502 
"name": null, 00:22:06.502 "uuid": "895e848c-fce0-4331-a9ab-108f58d83952", 00:22:06.502 "is_configured": false, 00:22:06.502 "data_offset": 2048, 00:22:06.502 "data_size": 63488 00:22:06.502 }, 00:22:06.502 { 00:22:06.502 "name": "BaseBdev3", 00:22:06.502 "uuid": "d5b7adb9-9c17-4950-99bf-f96b3ca5e868", 00:22:06.502 "is_configured": true, 00:22:06.502 "data_offset": 2048, 00:22:06.502 "data_size": 63488 00:22:06.502 }, 00:22:06.502 { 00:22:06.502 "name": "BaseBdev4", 00:22:06.502 "uuid": "d8f4fa61-3116-4faa-b678-74b9b767729b", 00:22:06.502 "is_configured": true, 00:22:06.502 "data_offset": 2048, 00:22:06.502 "data_size": 63488 00:22:06.502 } 00:22:06.502 ] 00:22:06.502 }' 00:22:06.502 00:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:06.502 00:16:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:07.068 00:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:07.068 00:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.327 00:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:22:07.327 00:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:07.585 [2024-07-16 00:16:54.376101] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:07.585 BaseBdev1 00:22:07.585 00:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:22:07.585 00:16:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:07.585 00:16:54 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:07.585 00:16:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:07.586 00:16:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:07.586 00:16:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:07.586 00:16:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:07.844 00:16:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:08.102 [ 00:22:08.102 { 00:22:08.102 "name": "BaseBdev1", 00:22:08.102 "aliases": [ 00:22:08.102 "56fa3d6c-9c91-41f6-80ed-5f2535c48b35" 00:22:08.102 ], 00:22:08.102 "product_name": "Malloc disk", 00:22:08.102 "block_size": 512, 00:22:08.102 "num_blocks": 65536, 00:22:08.102 "uuid": "56fa3d6c-9c91-41f6-80ed-5f2535c48b35", 00:22:08.102 "assigned_rate_limits": { 00:22:08.102 "rw_ios_per_sec": 0, 00:22:08.102 "rw_mbytes_per_sec": 0, 00:22:08.102 "r_mbytes_per_sec": 0, 00:22:08.102 "w_mbytes_per_sec": 0 00:22:08.102 }, 00:22:08.102 "claimed": true, 00:22:08.102 "claim_type": "exclusive_write", 00:22:08.102 "zoned": false, 00:22:08.102 "supported_io_types": { 00:22:08.102 "read": true, 00:22:08.102 "write": true, 00:22:08.102 "unmap": true, 00:22:08.102 "flush": true, 00:22:08.102 "reset": true, 00:22:08.102 "nvme_admin": false, 00:22:08.102 "nvme_io": false, 00:22:08.102 "nvme_io_md": false, 00:22:08.102 "write_zeroes": true, 00:22:08.102 "zcopy": true, 00:22:08.102 "get_zone_info": false, 00:22:08.102 "zone_management": false, 00:22:08.102 "zone_append": false, 00:22:08.102 "compare": false, 00:22:08.102 
"compare_and_write": false, 00:22:08.102 "abort": true, 00:22:08.102 "seek_hole": false, 00:22:08.102 "seek_data": false, 00:22:08.102 "copy": true, 00:22:08.102 "nvme_iov_md": false 00:22:08.102 }, 00:22:08.102 "memory_domains": [ 00:22:08.102 { 00:22:08.102 "dma_device_id": "system", 00:22:08.102 "dma_device_type": 1 00:22:08.102 }, 00:22:08.102 { 00:22:08.102 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:08.102 "dma_device_type": 2 00:22:08.102 } 00:22:08.102 ], 00:22:08.102 "driver_specific": {} 00:22:08.102 } 00:22:08.102 ] 00:22:08.102 00:16:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:08.102 00:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:08.102 00:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:08.102 00:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:08.102 00:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:08.102 00:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:08.102 00:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:08.102 00:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:08.102 00:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:08.102 00:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:08.102 00:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:08.102 00:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.102 00:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:08.361 00:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:08.361 "name": "Existed_Raid", 00:22:08.361 "uuid": "7185f416-6801-4f7f-a42b-88cc37770f1f", 00:22:08.361 "strip_size_kb": 0, 00:22:08.361 "state": "configuring", 00:22:08.361 "raid_level": "raid1", 00:22:08.361 "superblock": true, 00:22:08.361 "num_base_bdevs": 4, 00:22:08.361 "num_base_bdevs_discovered": 3, 00:22:08.361 "num_base_bdevs_operational": 4, 00:22:08.361 "base_bdevs_list": [ 00:22:08.361 { 00:22:08.361 "name": "BaseBdev1", 00:22:08.361 "uuid": "56fa3d6c-9c91-41f6-80ed-5f2535c48b35", 00:22:08.361 "is_configured": true, 00:22:08.361 "data_offset": 2048, 00:22:08.361 "data_size": 63488 00:22:08.361 }, 00:22:08.361 { 00:22:08.361 "name": null, 00:22:08.361 "uuid": "895e848c-fce0-4331-a9ab-108f58d83952", 00:22:08.361 "is_configured": false, 00:22:08.361 "data_offset": 2048, 00:22:08.361 "data_size": 63488 00:22:08.361 }, 00:22:08.361 { 00:22:08.361 "name": "BaseBdev3", 00:22:08.361 "uuid": "d5b7adb9-9c17-4950-99bf-f96b3ca5e868", 00:22:08.361 "is_configured": true, 00:22:08.361 "data_offset": 2048, 00:22:08.361 "data_size": 63488 00:22:08.361 }, 00:22:08.361 { 00:22:08.361 "name": "BaseBdev4", 00:22:08.361 "uuid": "d8f4fa61-3116-4faa-b678-74b9b767729b", 00:22:08.361 "is_configured": true, 00:22:08.361 "data_offset": 2048, 00:22:08.361 "data_size": 63488 00:22:08.361 } 00:22:08.361 ] 00:22:08.361 }' 00:22:08.361 00:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:08.361 00:16:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:08.964 00:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.964 00:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:09.222 00:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:22:09.222 00:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:22:09.480 [2024-07-16 00:16:56.253182] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:09.480 00:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:09.480 00:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:09.480 00:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:09.480 00:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:09.480 00:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:09.480 00:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:09.480 00:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:09.480 00:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:09.480 00:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:09.480 00:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:09.480 00:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:09.480 00:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:09.738 00:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:09.738 "name": "Existed_Raid", 00:22:09.738 "uuid": "7185f416-6801-4f7f-a42b-88cc37770f1f", 00:22:09.738 "strip_size_kb": 0, 00:22:09.738 "state": "configuring", 00:22:09.738 "raid_level": "raid1", 00:22:09.738 "superblock": true, 00:22:09.738 "num_base_bdevs": 4, 00:22:09.738 "num_base_bdevs_discovered": 2, 00:22:09.738 "num_base_bdevs_operational": 4, 00:22:09.738 "base_bdevs_list": [ 00:22:09.738 { 00:22:09.738 "name": "BaseBdev1", 00:22:09.738 "uuid": "56fa3d6c-9c91-41f6-80ed-5f2535c48b35", 00:22:09.738 "is_configured": true, 00:22:09.738 "data_offset": 2048, 00:22:09.738 "data_size": 63488 00:22:09.738 }, 00:22:09.738 { 00:22:09.738 "name": null, 00:22:09.738 "uuid": "895e848c-fce0-4331-a9ab-108f58d83952", 00:22:09.738 "is_configured": false, 00:22:09.738 "data_offset": 2048, 00:22:09.738 "data_size": 63488 00:22:09.738 }, 00:22:09.738 { 00:22:09.738 "name": null, 00:22:09.738 "uuid": "d5b7adb9-9c17-4950-99bf-f96b3ca5e868", 00:22:09.738 "is_configured": false, 00:22:09.738 "data_offset": 2048, 00:22:09.738 "data_size": 63488 00:22:09.738 }, 00:22:09.738 { 00:22:09.738 "name": "BaseBdev4", 00:22:09.738 "uuid": "d8f4fa61-3116-4faa-b678-74b9b767729b", 00:22:09.738 "is_configured": true, 00:22:09.738 "data_offset": 2048, 00:22:09.738 "data_size": 63488 00:22:09.738 } 00:22:09.738 ] 00:22:09.738 }' 00:22:09.738 00:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:09.738 00:16:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:10.673 00:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:10.673 00:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:10.930 00:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:22:10.930 00:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:22:11.190 [2024-07-16 00:16:57.977794] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:11.190 00:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:11.190 00:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:11.190 00:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:11.190 00:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:11.190 00:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:11.190 00:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:11.190 00:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:11.190 00:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:11.190 00:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:11.190 00:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:11.190 00:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:11.190 00:16:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:11.449 00:16:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:11.449 "name": "Existed_Raid", 00:22:11.449 "uuid": "7185f416-6801-4f7f-a42b-88cc37770f1f", 00:22:11.449 "strip_size_kb": 0, 00:22:11.449 "state": "configuring", 00:22:11.449 "raid_level": "raid1", 00:22:11.449 "superblock": true, 00:22:11.449 "num_base_bdevs": 4, 00:22:11.449 "num_base_bdevs_discovered": 3, 00:22:11.449 "num_base_bdevs_operational": 4, 00:22:11.449 "base_bdevs_list": [ 00:22:11.449 { 00:22:11.449 "name": "BaseBdev1", 00:22:11.449 "uuid": "56fa3d6c-9c91-41f6-80ed-5f2535c48b35", 00:22:11.449 "is_configured": true, 00:22:11.449 "data_offset": 2048, 00:22:11.449 "data_size": 63488 00:22:11.449 }, 00:22:11.449 { 00:22:11.449 "name": null, 00:22:11.449 "uuid": "895e848c-fce0-4331-a9ab-108f58d83952", 00:22:11.449 "is_configured": false, 00:22:11.449 "data_offset": 2048, 00:22:11.449 "data_size": 63488 00:22:11.449 }, 00:22:11.449 { 00:22:11.449 "name": "BaseBdev3", 00:22:11.449 "uuid": "d5b7adb9-9c17-4950-99bf-f96b3ca5e868", 00:22:11.449 "is_configured": true, 00:22:11.449 "data_offset": 2048, 00:22:11.449 "data_size": 63488 00:22:11.449 }, 00:22:11.449 { 00:22:11.449 "name": "BaseBdev4", 00:22:11.449 "uuid": "d8f4fa61-3116-4faa-b678-74b9b767729b", 00:22:11.449 "is_configured": true, 00:22:11.449 "data_offset": 2048, 00:22:11.449 "data_size": 63488 00:22:11.449 } 00:22:11.449 ] 00:22:11.449 }' 00:22:11.449 00:16:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:11.449 00:16:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:12.016 00:16:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:12.016 00:16:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:12.275 00:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:22:12.275 00:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:12.534 [2024-07-16 00:16:59.241178] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:12.534 00:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:12.534 00:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:12.534 00:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:12.534 00:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:12.534 00:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:12.534 00:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:12.534 00:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:12.534 00:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:12.534 00:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:12.534 00:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:12.534 00:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:12.534 00:16:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:12.793 00:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:12.793 "name": "Existed_Raid", 00:22:12.793 "uuid": "7185f416-6801-4f7f-a42b-88cc37770f1f", 00:22:12.793 "strip_size_kb": 0, 00:22:12.793 "state": "configuring", 00:22:12.793 "raid_level": "raid1", 00:22:12.793 "superblock": true, 00:22:12.793 "num_base_bdevs": 4, 00:22:12.793 "num_base_bdevs_discovered": 2, 00:22:12.793 "num_base_bdevs_operational": 4, 00:22:12.793 "base_bdevs_list": [ 00:22:12.793 { 00:22:12.793 "name": null, 00:22:12.793 "uuid": "56fa3d6c-9c91-41f6-80ed-5f2535c48b35", 00:22:12.793 "is_configured": false, 00:22:12.793 "data_offset": 2048, 00:22:12.793 "data_size": 63488 00:22:12.793 }, 00:22:12.793 { 00:22:12.793 "name": null, 00:22:12.793 "uuid": "895e848c-fce0-4331-a9ab-108f58d83952", 00:22:12.793 "is_configured": false, 00:22:12.793 "data_offset": 2048, 00:22:12.793 "data_size": 63488 00:22:12.793 }, 00:22:12.793 { 00:22:12.793 "name": "BaseBdev3", 00:22:12.793 "uuid": "d5b7adb9-9c17-4950-99bf-f96b3ca5e868", 00:22:12.793 "is_configured": true, 00:22:12.793 "data_offset": 2048, 00:22:12.793 "data_size": 63488 00:22:12.793 }, 00:22:12.793 { 00:22:12.793 "name": "BaseBdev4", 00:22:12.793 "uuid": "d8f4fa61-3116-4faa-b678-74b9b767729b", 00:22:12.793 "is_configured": true, 00:22:12.793 "data_offset": 2048, 00:22:12.793 "data_size": 63488 00:22:12.793 } 00:22:12.793 ] 00:22:12.793 }' 00:22:12.793 00:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:12.793 00:16:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:13.360 00:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:13.360 00:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.361 00:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:22:13.361 00:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:22:13.619 [2024-07-16 00:17:00.511288] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:13.619 00:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:13.619 00:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:13.619 00:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:13.619 00:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:13.619 00:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:13.619 00:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:13.619 00:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:13.619 00:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:13.619 00:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:13.619 00:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:13.619 00:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.619 00:17:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:13.878 00:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:13.878 "name": "Existed_Raid", 00:22:13.878 "uuid": "7185f416-6801-4f7f-a42b-88cc37770f1f", 00:22:13.878 "strip_size_kb": 0, 00:22:13.878 "state": "configuring", 00:22:13.878 "raid_level": "raid1", 00:22:13.878 "superblock": true, 00:22:13.878 "num_base_bdevs": 4, 00:22:13.878 "num_base_bdevs_discovered": 3, 00:22:13.878 "num_base_bdevs_operational": 4, 00:22:13.878 "base_bdevs_list": [ 00:22:13.878 { 00:22:13.878 "name": null, 00:22:13.878 "uuid": "56fa3d6c-9c91-41f6-80ed-5f2535c48b35", 00:22:13.878 "is_configured": false, 00:22:13.878 "data_offset": 2048, 00:22:13.878 "data_size": 63488 00:22:13.878 }, 00:22:13.878 { 00:22:13.878 "name": "BaseBdev2", 00:22:13.878 "uuid": "895e848c-fce0-4331-a9ab-108f58d83952", 00:22:13.878 "is_configured": true, 00:22:13.878 "data_offset": 2048, 00:22:13.878 "data_size": 63488 00:22:13.878 }, 00:22:13.878 { 00:22:13.878 "name": "BaseBdev3", 00:22:13.878 "uuid": "d5b7adb9-9c17-4950-99bf-f96b3ca5e868", 00:22:13.878 "is_configured": true, 00:22:13.878 "data_offset": 2048, 00:22:13.878 "data_size": 63488 00:22:13.878 }, 00:22:13.878 { 00:22:13.878 "name": "BaseBdev4", 00:22:13.878 "uuid": "d8f4fa61-3116-4faa-b678-74b9b767729b", 00:22:13.878 "is_configured": true, 00:22:13.878 "data_offset": 2048, 00:22:13.878 "data_size": 63488 00:22:13.878 } 00:22:13.878 ] 00:22:13.878 }' 00:22:13.878 00:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:13.878 00:17:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:14.446 00:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.446 00:17:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:14.704 00:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:22:14.705 00:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.705 00:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:22:14.963 00:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 56fa3d6c-9c91-41f6-80ed-5f2535c48b35 00:22:15.223 [2024-07-16 00:17:02.088083] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:22:15.223 [2024-07-16 00:17:02.088250] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23b6180 00:22:15.223 [2024-07-16 00:17:02.088263] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:15.223 [2024-07-16 00:17:02.088436] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23b6c20 00:22:15.223 [2024-07-16 00:17:02.088564] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23b6180 00:22:15.223 [2024-07-16 00:17:02.088575] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x23b6180 00:22:15.223 [2024-07-16 00:17:02.088670] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:15.223 NewBaseBdev 00:22:15.223 00:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:22:15.223 00:17:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:22:15.223 00:17:02 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:15.223 00:17:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:15.223 00:17:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:15.223 00:17:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:15.223 00:17:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:15.480 00:17:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:22:15.739 [ 00:22:15.739 { 00:22:15.739 "name": "NewBaseBdev", 00:22:15.739 "aliases": [ 00:22:15.739 "56fa3d6c-9c91-41f6-80ed-5f2535c48b35" 00:22:15.739 ], 00:22:15.739 "product_name": "Malloc disk", 00:22:15.739 "block_size": 512, 00:22:15.739 "num_blocks": 65536, 00:22:15.739 "uuid": "56fa3d6c-9c91-41f6-80ed-5f2535c48b35", 00:22:15.739 "assigned_rate_limits": { 00:22:15.739 "rw_ios_per_sec": 0, 00:22:15.739 "rw_mbytes_per_sec": 0, 00:22:15.739 "r_mbytes_per_sec": 0, 00:22:15.739 "w_mbytes_per_sec": 0 00:22:15.739 }, 00:22:15.739 "claimed": true, 00:22:15.739 "claim_type": "exclusive_write", 00:22:15.739 "zoned": false, 00:22:15.739 "supported_io_types": { 00:22:15.739 "read": true, 00:22:15.739 "write": true, 00:22:15.739 "unmap": true, 00:22:15.739 "flush": true, 00:22:15.739 "reset": true, 00:22:15.739 "nvme_admin": false, 00:22:15.739 "nvme_io": false, 00:22:15.739 "nvme_io_md": false, 00:22:15.739 "write_zeroes": true, 00:22:15.739 "zcopy": true, 00:22:15.739 "get_zone_info": false, 00:22:15.739 "zone_management": false, 00:22:15.739 "zone_append": false, 00:22:15.739 "compare": false, 00:22:15.739 
"compare_and_write": false, 00:22:15.739 "abort": true, 00:22:15.739 "seek_hole": false, 00:22:15.739 "seek_data": false, 00:22:15.739 "copy": true, 00:22:15.739 "nvme_iov_md": false 00:22:15.739 }, 00:22:15.739 "memory_domains": [ 00:22:15.739 { 00:22:15.739 "dma_device_id": "system", 00:22:15.739 "dma_device_type": 1 00:22:15.739 }, 00:22:15.739 { 00:22:15.739 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:15.739 "dma_device_type": 2 00:22:15.739 } 00:22:15.739 ], 00:22:15.739 "driver_specific": {} 00:22:15.739 } 00:22:15.739 ] 00:22:15.739 00:17:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:15.739 00:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:15.739 00:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:15.739 00:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:15.739 00:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:15.739 00:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:15.739 00:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:15.739 00:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:15.739 00:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:15.739 00:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:15.739 00:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:15.739 00:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:15.739 00:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:15.998 00:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:15.998 "name": "Existed_Raid", 00:22:15.998 "uuid": "7185f416-6801-4f7f-a42b-88cc37770f1f", 00:22:15.998 "strip_size_kb": 0, 00:22:15.998 "state": "online", 00:22:15.998 "raid_level": "raid1", 00:22:15.998 "superblock": true, 00:22:15.998 "num_base_bdevs": 4, 00:22:15.998 "num_base_bdevs_discovered": 4, 00:22:15.998 "num_base_bdevs_operational": 4, 00:22:15.998 "base_bdevs_list": [ 00:22:15.998 { 00:22:15.998 "name": "NewBaseBdev", 00:22:15.998 "uuid": "56fa3d6c-9c91-41f6-80ed-5f2535c48b35", 00:22:15.998 "is_configured": true, 00:22:15.998 "data_offset": 2048, 00:22:15.998 "data_size": 63488 00:22:15.998 }, 00:22:15.998 { 00:22:15.998 "name": "BaseBdev2", 00:22:15.998 "uuid": "895e848c-fce0-4331-a9ab-108f58d83952", 00:22:15.998 "is_configured": true, 00:22:15.998 "data_offset": 2048, 00:22:15.998 "data_size": 63488 00:22:15.998 }, 00:22:15.998 { 00:22:15.998 "name": "BaseBdev3", 00:22:15.998 "uuid": "d5b7adb9-9c17-4950-99bf-f96b3ca5e868", 00:22:15.998 "is_configured": true, 00:22:15.998 "data_offset": 2048, 00:22:15.998 "data_size": 63488 00:22:15.998 }, 00:22:15.998 { 00:22:15.998 "name": "BaseBdev4", 00:22:15.998 "uuid": "d8f4fa61-3116-4faa-b678-74b9b767729b", 00:22:15.998 "is_configured": true, 00:22:15.998 "data_offset": 2048, 00:22:15.998 "data_size": 63488 00:22:15.998 } 00:22:15.998 ] 00:22:15.998 }' 00:22:15.998 00:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:15.998 00:17:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:16.565 00:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:22:16.565 00:17:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:16.565 00:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:16.565 00:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:16.565 00:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:16.565 00:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:22:16.565 00:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:16.565 00:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:16.824 [2024-07-16 00:17:03.636535] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:16.824 00:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:16.824 "name": "Existed_Raid", 00:22:16.824 "aliases": [ 00:22:16.824 "7185f416-6801-4f7f-a42b-88cc37770f1f" 00:22:16.824 ], 00:22:16.824 "product_name": "Raid Volume", 00:22:16.824 "block_size": 512, 00:22:16.824 "num_blocks": 63488, 00:22:16.824 "uuid": "7185f416-6801-4f7f-a42b-88cc37770f1f", 00:22:16.824 "assigned_rate_limits": { 00:22:16.824 "rw_ios_per_sec": 0, 00:22:16.824 "rw_mbytes_per_sec": 0, 00:22:16.824 "r_mbytes_per_sec": 0, 00:22:16.824 "w_mbytes_per_sec": 0 00:22:16.824 }, 00:22:16.824 "claimed": false, 00:22:16.824 "zoned": false, 00:22:16.824 "supported_io_types": { 00:22:16.824 "read": true, 00:22:16.824 "write": true, 00:22:16.824 "unmap": false, 00:22:16.824 "flush": false, 00:22:16.824 "reset": true, 00:22:16.824 "nvme_admin": false, 00:22:16.824 "nvme_io": false, 00:22:16.824 "nvme_io_md": false, 00:22:16.824 "write_zeroes": true, 00:22:16.824 "zcopy": false, 00:22:16.824 
"get_zone_info": false, 00:22:16.824 "zone_management": false, 00:22:16.824 "zone_append": false, 00:22:16.824 "compare": false, 00:22:16.824 "compare_and_write": false, 00:22:16.824 "abort": false, 00:22:16.824 "seek_hole": false, 00:22:16.824 "seek_data": false, 00:22:16.824 "copy": false, 00:22:16.824 "nvme_iov_md": false 00:22:16.824 }, 00:22:16.824 "memory_domains": [ 00:22:16.824 { 00:22:16.824 "dma_device_id": "system", 00:22:16.824 "dma_device_type": 1 00:22:16.824 }, 00:22:16.824 { 00:22:16.824 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:16.824 "dma_device_type": 2 00:22:16.824 }, 00:22:16.824 { 00:22:16.824 "dma_device_id": "system", 00:22:16.824 "dma_device_type": 1 00:22:16.824 }, 00:22:16.824 { 00:22:16.824 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:16.824 "dma_device_type": 2 00:22:16.824 }, 00:22:16.824 { 00:22:16.824 "dma_device_id": "system", 00:22:16.824 "dma_device_type": 1 00:22:16.824 }, 00:22:16.824 { 00:22:16.824 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:16.824 "dma_device_type": 2 00:22:16.824 }, 00:22:16.824 { 00:22:16.824 "dma_device_id": "system", 00:22:16.824 "dma_device_type": 1 00:22:16.824 }, 00:22:16.824 { 00:22:16.824 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:16.824 "dma_device_type": 2 00:22:16.824 } 00:22:16.824 ], 00:22:16.824 "driver_specific": { 00:22:16.824 "raid": { 00:22:16.824 "uuid": "7185f416-6801-4f7f-a42b-88cc37770f1f", 00:22:16.824 "strip_size_kb": 0, 00:22:16.824 "state": "online", 00:22:16.824 "raid_level": "raid1", 00:22:16.824 "superblock": true, 00:22:16.824 "num_base_bdevs": 4, 00:22:16.824 "num_base_bdevs_discovered": 4, 00:22:16.824 "num_base_bdevs_operational": 4, 00:22:16.824 "base_bdevs_list": [ 00:22:16.824 { 00:22:16.824 "name": "NewBaseBdev", 00:22:16.824 "uuid": "56fa3d6c-9c91-41f6-80ed-5f2535c48b35", 00:22:16.824 "is_configured": true, 00:22:16.824 "data_offset": 2048, 00:22:16.824 "data_size": 63488 00:22:16.824 }, 00:22:16.824 { 00:22:16.824 "name": "BaseBdev2", 00:22:16.824 
"uuid": "895e848c-fce0-4331-a9ab-108f58d83952", 00:22:16.824 "is_configured": true, 00:22:16.824 "data_offset": 2048, 00:22:16.824 "data_size": 63488 00:22:16.824 }, 00:22:16.824 { 00:22:16.824 "name": "BaseBdev3", 00:22:16.824 "uuid": "d5b7adb9-9c17-4950-99bf-f96b3ca5e868", 00:22:16.824 "is_configured": true, 00:22:16.824 "data_offset": 2048, 00:22:16.824 "data_size": 63488 00:22:16.824 }, 00:22:16.824 { 00:22:16.824 "name": "BaseBdev4", 00:22:16.824 "uuid": "d8f4fa61-3116-4faa-b678-74b9b767729b", 00:22:16.824 "is_configured": true, 00:22:16.824 "data_offset": 2048, 00:22:16.824 "data_size": 63488 00:22:16.824 } 00:22:16.824 ] 00:22:16.824 } 00:22:16.824 } 00:22:16.824 }' 00:22:16.824 00:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:16.824 00:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:22:16.824 BaseBdev2 00:22:16.824 BaseBdev3 00:22:16.824 BaseBdev4' 00:22:16.824 00:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:16.824 00:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:22:16.824 00:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:17.083 00:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:17.083 "name": "NewBaseBdev", 00:22:17.083 "aliases": [ 00:22:17.083 "56fa3d6c-9c91-41f6-80ed-5f2535c48b35" 00:22:17.083 ], 00:22:17.083 "product_name": "Malloc disk", 00:22:17.083 "block_size": 512, 00:22:17.083 "num_blocks": 65536, 00:22:17.083 "uuid": "56fa3d6c-9c91-41f6-80ed-5f2535c48b35", 00:22:17.083 "assigned_rate_limits": { 00:22:17.083 "rw_ios_per_sec": 0, 00:22:17.083 "rw_mbytes_per_sec": 0, 
00:22:17.083 "r_mbytes_per_sec": 0, 00:22:17.083 "w_mbytes_per_sec": 0 00:22:17.083 }, 00:22:17.083 "claimed": true, 00:22:17.083 "claim_type": "exclusive_write", 00:22:17.083 "zoned": false, 00:22:17.083 "supported_io_types": { 00:22:17.083 "read": true, 00:22:17.083 "write": true, 00:22:17.083 "unmap": true, 00:22:17.083 "flush": true, 00:22:17.083 "reset": true, 00:22:17.083 "nvme_admin": false, 00:22:17.083 "nvme_io": false, 00:22:17.083 "nvme_io_md": false, 00:22:17.083 "write_zeroes": true, 00:22:17.083 "zcopy": true, 00:22:17.083 "get_zone_info": false, 00:22:17.083 "zone_management": false, 00:22:17.083 "zone_append": false, 00:22:17.083 "compare": false, 00:22:17.083 "compare_and_write": false, 00:22:17.083 "abort": true, 00:22:17.083 "seek_hole": false, 00:22:17.083 "seek_data": false, 00:22:17.083 "copy": true, 00:22:17.083 "nvme_iov_md": false 00:22:17.083 }, 00:22:17.083 "memory_domains": [ 00:22:17.083 { 00:22:17.083 "dma_device_id": "system", 00:22:17.083 "dma_device_type": 1 00:22:17.083 }, 00:22:17.083 { 00:22:17.083 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:17.083 "dma_device_type": 2 00:22:17.083 } 00:22:17.083 ], 00:22:17.083 "driver_specific": {} 00:22:17.083 }' 00:22:17.083 00:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:17.083 00:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:17.083 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:17.083 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:17.341 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:17.341 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:17.341 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:17.341 00:17:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:17.341 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:17.341 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:17.341 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:17.599 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:17.599 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:17.599 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:17.599 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:17.599 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:17.599 "name": "BaseBdev2", 00:22:17.599 "aliases": [ 00:22:17.599 "895e848c-fce0-4331-a9ab-108f58d83952" 00:22:17.599 ], 00:22:17.599 "product_name": "Malloc disk", 00:22:17.599 "block_size": 512, 00:22:17.599 "num_blocks": 65536, 00:22:17.599 "uuid": "895e848c-fce0-4331-a9ab-108f58d83952", 00:22:17.599 "assigned_rate_limits": { 00:22:17.599 "rw_ios_per_sec": 0, 00:22:17.599 "rw_mbytes_per_sec": 0, 00:22:17.599 "r_mbytes_per_sec": 0, 00:22:17.599 "w_mbytes_per_sec": 0 00:22:17.599 }, 00:22:17.599 "claimed": true, 00:22:17.599 "claim_type": "exclusive_write", 00:22:17.599 "zoned": false, 00:22:17.599 "supported_io_types": { 00:22:17.599 "read": true, 00:22:17.599 "write": true, 00:22:17.599 "unmap": true, 00:22:17.599 "flush": true, 00:22:17.600 "reset": true, 00:22:17.600 "nvme_admin": false, 00:22:17.600 "nvme_io": false, 00:22:17.600 "nvme_io_md": false, 00:22:17.600 "write_zeroes": true, 00:22:17.600 "zcopy": true, 00:22:17.600 
"get_zone_info": false, 00:22:17.600 "zone_management": false, 00:22:17.600 "zone_append": false, 00:22:17.600 "compare": false, 00:22:17.600 "compare_and_write": false, 00:22:17.600 "abort": true, 00:22:17.600 "seek_hole": false, 00:22:17.600 "seek_data": false, 00:22:17.600 "copy": true, 00:22:17.600 "nvme_iov_md": false 00:22:17.600 }, 00:22:17.600 "memory_domains": [ 00:22:17.600 { 00:22:17.600 "dma_device_id": "system", 00:22:17.600 "dma_device_type": 1 00:22:17.600 }, 00:22:17.600 { 00:22:17.600 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:17.600 "dma_device_type": 2 00:22:17.600 } 00:22:17.600 ], 00:22:17.600 "driver_specific": {} 00:22:17.600 }' 00:22:17.600 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:17.600 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:17.858 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:17.858 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:17.858 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:17.858 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:17.858 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:17.858 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:17.858 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:17.858 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:17.858 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:18.117 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:18.117 00:17:04 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:18.117 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:18.117 00:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:18.376 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:18.376 "name": "BaseBdev3", 00:22:18.376 "aliases": [ 00:22:18.376 "d5b7adb9-9c17-4950-99bf-f96b3ca5e868" 00:22:18.376 ], 00:22:18.376 "product_name": "Malloc disk", 00:22:18.376 "block_size": 512, 00:22:18.376 "num_blocks": 65536, 00:22:18.376 "uuid": "d5b7adb9-9c17-4950-99bf-f96b3ca5e868", 00:22:18.376 "assigned_rate_limits": { 00:22:18.376 "rw_ios_per_sec": 0, 00:22:18.376 "rw_mbytes_per_sec": 0, 00:22:18.376 "r_mbytes_per_sec": 0, 00:22:18.376 "w_mbytes_per_sec": 0 00:22:18.376 }, 00:22:18.376 "claimed": true, 00:22:18.376 "claim_type": "exclusive_write", 00:22:18.376 "zoned": false, 00:22:18.376 "supported_io_types": { 00:22:18.376 "read": true, 00:22:18.376 "write": true, 00:22:18.376 "unmap": true, 00:22:18.376 "flush": true, 00:22:18.376 "reset": true, 00:22:18.376 "nvme_admin": false, 00:22:18.376 "nvme_io": false, 00:22:18.376 "nvme_io_md": false, 00:22:18.376 "write_zeroes": true, 00:22:18.376 "zcopy": true, 00:22:18.376 "get_zone_info": false, 00:22:18.376 "zone_management": false, 00:22:18.376 "zone_append": false, 00:22:18.376 "compare": false, 00:22:18.376 "compare_and_write": false, 00:22:18.376 "abort": true, 00:22:18.376 "seek_hole": false, 00:22:18.376 "seek_data": false, 00:22:18.376 "copy": true, 00:22:18.376 "nvme_iov_md": false 00:22:18.376 }, 00:22:18.376 "memory_domains": [ 00:22:18.376 { 00:22:18.376 "dma_device_id": "system", 00:22:18.376 "dma_device_type": 1 00:22:18.376 }, 00:22:18.376 { 00:22:18.376 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:18.376 
"dma_device_type": 2 00:22:18.376 } 00:22:18.376 ], 00:22:18.376 "driver_specific": {} 00:22:18.376 }' 00:22:18.376 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:18.376 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:18.376 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:18.376 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:18.376 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:18.376 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:18.376 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:18.376 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:18.635 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:18.635 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:18.635 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:18.635 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:18.635 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:18.635 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:18.635 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:18.894 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:18.894 "name": "BaseBdev4", 00:22:18.894 "aliases": [ 00:22:18.894 
"d8f4fa61-3116-4faa-b678-74b9b767729b" 00:22:18.894 ], 00:22:18.894 "product_name": "Malloc disk", 00:22:18.894 "block_size": 512, 00:22:18.894 "num_blocks": 65536, 00:22:18.894 "uuid": "d8f4fa61-3116-4faa-b678-74b9b767729b", 00:22:18.894 "assigned_rate_limits": { 00:22:18.894 "rw_ios_per_sec": 0, 00:22:18.894 "rw_mbytes_per_sec": 0, 00:22:18.894 "r_mbytes_per_sec": 0, 00:22:18.894 "w_mbytes_per_sec": 0 00:22:18.894 }, 00:22:18.894 "claimed": true, 00:22:18.894 "claim_type": "exclusive_write", 00:22:18.894 "zoned": false, 00:22:18.894 "supported_io_types": { 00:22:18.894 "read": true, 00:22:18.894 "write": true, 00:22:18.894 "unmap": true, 00:22:18.894 "flush": true, 00:22:18.894 "reset": true, 00:22:18.894 "nvme_admin": false, 00:22:18.894 "nvme_io": false, 00:22:18.894 "nvme_io_md": false, 00:22:18.894 "write_zeroes": true, 00:22:18.894 "zcopy": true, 00:22:18.894 "get_zone_info": false, 00:22:18.894 "zone_management": false, 00:22:18.894 "zone_append": false, 00:22:18.894 "compare": false, 00:22:18.894 "compare_and_write": false, 00:22:18.894 "abort": true, 00:22:18.894 "seek_hole": false, 00:22:18.894 "seek_data": false, 00:22:18.894 "copy": true, 00:22:18.894 "nvme_iov_md": false 00:22:18.894 }, 00:22:18.894 "memory_domains": [ 00:22:18.894 { 00:22:18.894 "dma_device_id": "system", 00:22:18.894 "dma_device_type": 1 00:22:18.894 }, 00:22:18.894 { 00:22:18.894 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:18.894 "dma_device_type": 2 00:22:18.894 } 00:22:18.894 ], 00:22:18.894 "driver_specific": {} 00:22:18.894 }' 00:22:18.894 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:18.895 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:18.895 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:18.895 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:18.895 00:17:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:19.153 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:19.153 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:19.153 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:19.153 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:19.153 00:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:19.153 00:17:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:19.153 00:17:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:19.153 00:17:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:19.413 [2024-07-16 00:17:06.291274] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:19.413 [2024-07-16 00:17:06.291304] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:19.413 [2024-07-16 00:17:06.291353] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:19.413 [2024-07-16 00:17:06.291640] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:19.413 [2024-07-16 00:17:06.291653] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23b6180 name Existed_Raid, state offline 00:22:19.413 00:17:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 3586811 00:22:19.413 00:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3586811 ']' 00:22:19.413 00:17:06 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@952 -- # kill -0 3586811 00:22:19.413 00:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:22:19.413 00:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:19.413 00:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3586811 00:22:19.413 00:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:19.413 00:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:19.413 00:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3586811' 00:22:19.413 killing process with pid 3586811 00:22:19.413 00:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 3586811 00:22:19.413 [2024-07-16 00:17:06.363046] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:19.413 00:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 3586811 00:22:19.672 [2024-07-16 00:17:06.405526] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:19.931 00:17:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:22:19.931 00:22:19.931 real 0m33.034s 00:22:19.931 user 1m0.698s 00:22:19.931 sys 0m5.894s 00:22:19.931 00:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:19.931 00:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:19.931 ************************************ 00:22:19.931 END TEST raid_state_function_test_sb 00:22:19.931 ************************************ 00:22:19.931 00:17:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:19.931 00:17:06 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test 
raid_superblock_test raid_superblock_test raid1 4 00:22:19.931 00:17:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:22:19.931 00:17:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:19.931 00:17:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:19.931 ************************************ 00:22:19.931 START TEST raid_superblock_test 00:22:19.931 ************************************ 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 4 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=3591694 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 3591694 /var/tmp/spdk-raid.sock 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 3591694 ']' 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:19.931 00:17:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:19.932 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:19.932 00:17:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:19.932 00:17:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:19.932 [2024-07-16 00:17:06.783229] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:22:19.932 [2024-07-16 00:17:06.783299] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3591694 ] 00:22:20.190 [2024-07-16 00:17:06.912599] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:20.190 [2024-07-16 00:17:07.015328] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:20.190 [2024-07-16 00:17:07.080966] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:20.190 [2024-07-16 00:17:07.080998] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:21.127 00:17:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:21.127 00:17:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:22:21.127 00:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:22:21.127 00:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:21.127 00:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:22:21.127 00:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:22:21.127 00:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:22:21.127 00:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:21.127 00:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:21.127 00:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:21.127 00:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:22:21.127 malloc1 00:22:21.127 00:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:21.386 [2024-07-16 00:17:08.206789] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:21.386 [2024-07-16 00:17:08.206840] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:21.386 [2024-07-16 00:17:08.206859] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd29570 00:22:21.386 [2024-07-16 00:17:08.206871] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:21.386 [2024-07-16 00:17:08.208485] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:21.386 [2024-07-16 00:17:08.208516] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:21.386 pt1 00:22:21.386 00:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:21.386 00:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:21.386 00:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:22:21.386 00:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:22:21.386 00:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:22:21.386 00:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:21.386 00:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:21.386 00:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:21.386 00:17:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:22:21.646 malloc2 00:22:21.646 00:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:21.905 [2024-07-16 00:17:08.709150] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:21.905 [2024-07-16 00:17:08.709194] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:21.905 [2024-07-16 00:17:08.709212] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd2a970 00:22:21.905 [2024-07-16 00:17:08.709224] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:21.905 [2024-07-16 00:17:08.710667] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:21.905 [2024-07-16 00:17:08.710701] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:21.905 pt2 00:22:21.905 00:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:21.905 00:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:21.905 00:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:22:21.905 00:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:22:21.905 00:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:22:21.905 00:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:21.905 00:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:21.905 00:17:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:21.905 00:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:22:22.164 malloc3 00:22:22.164 00:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:22.424 [2024-07-16 00:17:09.211151] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:22.424 [2024-07-16 00:17:09.211197] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:22.424 [2024-07-16 00:17:09.211214] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xec1340 00:22:22.424 [2024-07-16 00:17:09.211226] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:22.424 [2024-07-16 00:17:09.212704] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:22.424 [2024-07-16 00:17:09.212733] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:22.424 pt3 00:22:22.424 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:22.424 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:22.424 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:22:22.424 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:22:22.424 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:22:22.424 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:22.424 
00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:22.424 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:22.424 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:22:22.683 malloc4 00:22:22.683 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:22.979 [2024-07-16 00:17:09.716998] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:22.979 [2024-07-16 00:17:09.717045] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:22.979 [2024-07-16 00:17:09.717063] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xec3c60 00:22:22.979 [2024-07-16 00:17:09.717076] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:22.979 [2024-07-16 00:17:09.718449] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:22.979 [2024-07-16 00:17:09.718477] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:22.979 pt4 00:22:22.979 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:22.979 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:22.979 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:22:23.264 [2024-07-16 00:17:09.969818] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:22:23.264 [2024-07-16 00:17:09.971035] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:23.264 [2024-07-16 00:17:09.971089] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:23.264 [2024-07-16 00:17:09.971133] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:23.264 [2024-07-16 00:17:09.971297] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd21530 00:22:23.264 [2024-07-16 00:17:09.971309] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:23.264 [2024-07-16 00:17:09.971496] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd1f770 00:22:23.264 [2024-07-16 00:17:09.971645] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd21530 00:22:23.265 [2024-07-16 00:17:09.971656] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd21530 00:22:23.265 [2024-07-16 00:17:09.971747] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:23.265 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:23.265 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:23.265 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:23.265 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:23.265 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:23.265 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:23.265 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:23.265 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:22:23.265 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:23.265 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:23.265 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.265 00:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:23.524 00:17:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:23.524 "name": "raid_bdev1", 00:22:23.524 "uuid": "248e0554-c6c6-4f65-8795-5169d5aed9c1", 00:22:23.524 "strip_size_kb": 0, 00:22:23.524 "state": "online", 00:22:23.524 "raid_level": "raid1", 00:22:23.524 "superblock": true, 00:22:23.524 "num_base_bdevs": 4, 00:22:23.524 "num_base_bdevs_discovered": 4, 00:22:23.524 "num_base_bdevs_operational": 4, 00:22:23.524 "base_bdevs_list": [ 00:22:23.524 { 00:22:23.524 "name": "pt1", 00:22:23.524 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:23.524 "is_configured": true, 00:22:23.524 "data_offset": 2048, 00:22:23.524 "data_size": 63488 00:22:23.524 }, 00:22:23.524 { 00:22:23.524 "name": "pt2", 00:22:23.524 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:23.524 "is_configured": true, 00:22:23.524 "data_offset": 2048, 00:22:23.524 "data_size": 63488 00:22:23.524 }, 00:22:23.524 { 00:22:23.524 "name": "pt3", 00:22:23.524 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:23.524 "is_configured": true, 00:22:23.524 "data_offset": 2048, 00:22:23.524 "data_size": 63488 00:22:23.524 }, 00:22:23.524 { 00:22:23.524 "name": "pt4", 00:22:23.524 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:23.524 "is_configured": true, 00:22:23.524 "data_offset": 2048, 00:22:23.524 "data_size": 63488 00:22:23.524 } 00:22:23.524 ] 00:22:23.524 }' 00:22:23.524 00:17:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:23.524 00:17:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:24.092 00:17:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:22:24.092 00:17:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:24.092 00:17:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:24.092 00:17:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:24.092 00:17:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:24.092 00:17:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:24.092 00:17:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:24.092 00:17:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:24.351 [2024-07-16 00:17:11.077035] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:24.351 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:24.351 "name": "raid_bdev1", 00:22:24.351 "aliases": [ 00:22:24.351 "248e0554-c6c6-4f65-8795-5169d5aed9c1" 00:22:24.351 ], 00:22:24.351 "product_name": "Raid Volume", 00:22:24.351 "block_size": 512, 00:22:24.351 "num_blocks": 63488, 00:22:24.351 "uuid": "248e0554-c6c6-4f65-8795-5169d5aed9c1", 00:22:24.351 "assigned_rate_limits": { 00:22:24.351 "rw_ios_per_sec": 0, 00:22:24.351 "rw_mbytes_per_sec": 0, 00:22:24.351 "r_mbytes_per_sec": 0, 00:22:24.351 "w_mbytes_per_sec": 0 00:22:24.351 }, 00:22:24.351 "claimed": false, 00:22:24.351 "zoned": false, 00:22:24.351 "supported_io_types": { 00:22:24.351 "read": true, 00:22:24.351 "write": true, 00:22:24.351 
"unmap": false, 00:22:24.351 "flush": false, 00:22:24.351 "reset": true, 00:22:24.351 "nvme_admin": false, 00:22:24.351 "nvme_io": false, 00:22:24.351 "nvme_io_md": false, 00:22:24.351 "write_zeroes": true, 00:22:24.351 "zcopy": false, 00:22:24.351 "get_zone_info": false, 00:22:24.351 "zone_management": false, 00:22:24.351 "zone_append": false, 00:22:24.351 "compare": false, 00:22:24.351 "compare_and_write": false, 00:22:24.351 "abort": false, 00:22:24.351 "seek_hole": false, 00:22:24.351 "seek_data": false, 00:22:24.351 "copy": false, 00:22:24.351 "nvme_iov_md": false 00:22:24.351 }, 00:22:24.351 "memory_domains": [ 00:22:24.351 { 00:22:24.351 "dma_device_id": "system", 00:22:24.351 "dma_device_type": 1 00:22:24.351 }, 00:22:24.351 { 00:22:24.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:24.351 "dma_device_type": 2 00:22:24.351 }, 00:22:24.351 { 00:22:24.351 "dma_device_id": "system", 00:22:24.351 "dma_device_type": 1 00:22:24.351 }, 00:22:24.351 { 00:22:24.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:24.351 "dma_device_type": 2 00:22:24.351 }, 00:22:24.351 { 00:22:24.351 "dma_device_id": "system", 00:22:24.351 "dma_device_type": 1 00:22:24.351 }, 00:22:24.351 { 00:22:24.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:24.351 "dma_device_type": 2 00:22:24.351 }, 00:22:24.351 { 00:22:24.351 "dma_device_id": "system", 00:22:24.351 "dma_device_type": 1 00:22:24.351 }, 00:22:24.351 { 00:22:24.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:24.351 "dma_device_type": 2 00:22:24.351 } 00:22:24.351 ], 00:22:24.351 "driver_specific": { 00:22:24.351 "raid": { 00:22:24.351 "uuid": "248e0554-c6c6-4f65-8795-5169d5aed9c1", 00:22:24.351 "strip_size_kb": 0, 00:22:24.351 "state": "online", 00:22:24.351 "raid_level": "raid1", 00:22:24.351 "superblock": true, 00:22:24.351 "num_base_bdevs": 4, 00:22:24.351 "num_base_bdevs_discovered": 4, 00:22:24.351 "num_base_bdevs_operational": 4, 00:22:24.351 "base_bdevs_list": [ 00:22:24.351 { 00:22:24.351 "name": "pt1", 
00:22:24.351 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:24.351 "is_configured": true, 00:22:24.351 "data_offset": 2048, 00:22:24.351 "data_size": 63488 00:22:24.351 }, 00:22:24.351 { 00:22:24.351 "name": "pt2", 00:22:24.351 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:24.351 "is_configured": true, 00:22:24.351 "data_offset": 2048, 00:22:24.351 "data_size": 63488 00:22:24.351 }, 00:22:24.351 { 00:22:24.351 "name": "pt3", 00:22:24.351 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:24.351 "is_configured": true, 00:22:24.351 "data_offset": 2048, 00:22:24.351 "data_size": 63488 00:22:24.351 }, 00:22:24.351 { 00:22:24.351 "name": "pt4", 00:22:24.351 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:24.351 "is_configured": true, 00:22:24.351 "data_offset": 2048, 00:22:24.351 "data_size": 63488 00:22:24.351 } 00:22:24.351 ] 00:22:24.351 } 00:22:24.351 } 00:22:24.351 }' 00:22:24.352 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:24.352 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:24.352 pt2 00:22:24.352 pt3 00:22:24.352 pt4' 00:22:24.352 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:24.352 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:24.352 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:24.611 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:24.611 "name": "pt1", 00:22:24.611 "aliases": [ 00:22:24.611 "00000000-0000-0000-0000-000000000001" 00:22:24.611 ], 00:22:24.611 "product_name": "passthru", 00:22:24.611 "block_size": 512, 00:22:24.611 "num_blocks": 65536, 00:22:24.611 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:22:24.611 "assigned_rate_limits": { 00:22:24.611 "rw_ios_per_sec": 0, 00:22:24.611 "rw_mbytes_per_sec": 0, 00:22:24.611 "r_mbytes_per_sec": 0, 00:22:24.611 "w_mbytes_per_sec": 0 00:22:24.611 }, 00:22:24.611 "claimed": true, 00:22:24.611 "claim_type": "exclusive_write", 00:22:24.611 "zoned": false, 00:22:24.611 "supported_io_types": { 00:22:24.612 "read": true, 00:22:24.612 "write": true, 00:22:24.612 "unmap": true, 00:22:24.612 "flush": true, 00:22:24.612 "reset": true, 00:22:24.612 "nvme_admin": false, 00:22:24.612 "nvme_io": false, 00:22:24.612 "nvme_io_md": false, 00:22:24.612 "write_zeroes": true, 00:22:24.612 "zcopy": true, 00:22:24.612 "get_zone_info": false, 00:22:24.612 "zone_management": false, 00:22:24.612 "zone_append": false, 00:22:24.612 "compare": false, 00:22:24.612 "compare_and_write": false, 00:22:24.612 "abort": true, 00:22:24.612 "seek_hole": false, 00:22:24.612 "seek_data": false, 00:22:24.612 "copy": true, 00:22:24.612 "nvme_iov_md": false 00:22:24.612 }, 00:22:24.612 "memory_domains": [ 00:22:24.612 { 00:22:24.612 "dma_device_id": "system", 00:22:24.612 "dma_device_type": 1 00:22:24.612 }, 00:22:24.612 { 00:22:24.612 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:24.612 "dma_device_type": 2 00:22:24.612 } 00:22:24.612 ], 00:22:24.612 "driver_specific": { 00:22:24.612 "passthru": { 00:22:24.612 "name": "pt1", 00:22:24.612 "base_bdev_name": "malloc1" 00:22:24.612 } 00:22:24.612 } 00:22:24.612 }' 00:22:24.612 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:24.612 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:24.612 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:24.612 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:24.612 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:24.870 00:17:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:24.870 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:24.870 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:24.870 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:24.870 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:24.870 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:24.870 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:24.870 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:24.870 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:24.870 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:25.128 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:25.128 "name": "pt2", 00:22:25.128 "aliases": [ 00:22:25.128 "00000000-0000-0000-0000-000000000002" 00:22:25.128 ], 00:22:25.128 "product_name": "passthru", 00:22:25.128 "block_size": 512, 00:22:25.128 "num_blocks": 65536, 00:22:25.128 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:25.128 "assigned_rate_limits": { 00:22:25.128 "rw_ios_per_sec": 0, 00:22:25.128 "rw_mbytes_per_sec": 0, 00:22:25.128 "r_mbytes_per_sec": 0, 00:22:25.128 "w_mbytes_per_sec": 0 00:22:25.128 }, 00:22:25.128 "claimed": true, 00:22:25.128 "claim_type": "exclusive_write", 00:22:25.128 "zoned": false, 00:22:25.128 "supported_io_types": { 00:22:25.128 "read": true, 00:22:25.128 "write": true, 00:22:25.128 "unmap": true, 00:22:25.128 "flush": true, 00:22:25.128 "reset": true, 00:22:25.128 "nvme_admin": false, 00:22:25.128 
"nvme_io": false, 00:22:25.128 "nvme_io_md": false, 00:22:25.128 "write_zeroes": true, 00:22:25.128 "zcopy": true, 00:22:25.128 "get_zone_info": false, 00:22:25.128 "zone_management": false, 00:22:25.128 "zone_append": false, 00:22:25.128 "compare": false, 00:22:25.128 "compare_and_write": false, 00:22:25.128 "abort": true, 00:22:25.128 "seek_hole": false, 00:22:25.128 "seek_data": false, 00:22:25.128 "copy": true, 00:22:25.128 "nvme_iov_md": false 00:22:25.128 }, 00:22:25.128 "memory_domains": [ 00:22:25.128 { 00:22:25.128 "dma_device_id": "system", 00:22:25.128 "dma_device_type": 1 00:22:25.128 }, 00:22:25.128 { 00:22:25.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:25.128 "dma_device_type": 2 00:22:25.128 } 00:22:25.128 ], 00:22:25.128 "driver_specific": { 00:22:25.128 "passthru": { 00:22:25.128 "name": "pt2", 00:22:25.128 "base_bdev_name": "malloc2" 00:22:25.128 } 00:22:25.128 } 00:22:25.128 }' 00:22:25.128 00:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:25.128 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:25.128 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:25.128 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.386 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.386 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:25.386 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:25.386 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:25.386 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:25.386 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:25.386 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:22:25.646 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:25.646 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:25.646 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:22:25.646 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:25.646 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:25.646 "name": "pt3", 00:22:25.646 "aliases": [ 00:22:25.646 "00000000-0000-0000-0000-000000000003" 00:22:25.646 ], 00:22:25.646 "product_name": "passthru", 00:22:25.646 "block_size": 512, 00:22:25.646 "num_blocks": 65536, 00:22:25.646 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:25.646 "assigned_rate_limits": { 00:22:25.646 "rw_ios_per_sec": 0, 00:22:25.646 "rw_mbytes_per_sec": 0, 00:22:25.646 "r_mbytes_per_sec": 0, 00:22:25.646 "w_mbytes_per_sec": 0 00:22:25.646 }, 00:22:25.646 "claimed": true, 00:22:25.646 "claim_type": "exclusive_write", 00:22:25.646 "zoned": false, 00:22:25.646 "supported_io_types": { 00:22:25.646 "read": true, 00:22:25.646 "write": true, 00:22:25.646 "unmap": true, 00:22:25.646 "flush": true, 00:22:25.646 "reset": true, 00:22:25.646 "nvme_admin": false, 00:22:25.646 "nvme_io": false, 00:22:25.646 "nvme_io_md": false, 00:22:25.646 "write_zeroes": true, 00:22:25.646 "zcopy": true, 00:22:25.646 "get_zone_info": false, 00:22:25.646 "zone_management": false, 00:22:25.646 "zone_append": false, 00:22:25.646 "compare": false, 00:22:25.646 "compare_and_write": false, 00:22:25.646 "abort": true, 00:22:25.646 "seek_hole": false, 00:22:25.646 "seek_data": false, 00:22:25.646 "copy": true, 00:22:25.646 "nvme_iov_md": false 00:22:25.646 }, 00:22:25.646 "memory_domains": [ 00:22:25.646 { 00:22:25.646 "dma_device_id": "system", 00:22:25.646 
"dma_device_type": 1 00:22:25.646 }, 00:22:25.646 { 00:22:25.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:25.646 "dma_device_type": 2 00:22:25.646 } 00:22:25.646 ], 00:22:25.646 "driver_specific": { 00:22:25.646 "passthru": { 00:22:25.646 "name": "pt3", 00:22:25.646 "base_bdev_name": "malloc3" 00:22:25.646 } 00:22:25.646 } 00:22:25.646 }' 00:22:25.905 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:25.905 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:25.905 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:25.905 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.905 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.905 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:25.905 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:25.905 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:26.164 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:26.164 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:26.164 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:26.164 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:26.164 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:26.164 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:26.164 00:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:26.424 00:17:13 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:26.424 "name": "pt4", 00:22:26.424 "aliases": [ 00:22:26.424 "00000000-0000-0000-0000-000000000004" 00:22:26.424 ], 00:22:26.424 "product_name": "passthru", 00:22:26.424 "block_size": 512, 00:22:26.424 "num_blocks": 65536, 00:22:26.424 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:26.424 "assigned_rate_limits": { 00:22:26.424 "rw_ios_per_sec": 0, 00:22:26.424 "rw_mbytes_per_sec": 0, 00:22:26.424 "r_mbytes_per_sec": 0, 00:22:26.424 "w_mbytes_per_sec": 0 00:22:26.424 }, 00:22:26.424 "claimed": true, 00:22:26.424 "claim_type": "exclusive_write", 00:22:26.424 "zoned": false, 00:22:26.424 "supported_io_types": { 00:22:26.424 "read": true, 00:22:26.424 "write": true, 00:22:26.424 "unmap": true, 00:22:26.424 "flush": true, 00:22:26.424 "reset": true, 00:22:26.424 "nvme_admin": false, 00:22:26.424 "nvme_io": false, 00:22:26.424 "nvme_io_md": false, 00:22:26.424 "write_zeroes": true, 00:22:26.424 "zcopy": true, 00:22:26.424 "get_zone_info": false, 00:22:26.424 "zone_management": false, 00:22:26.424 "zone_append": false, 00:22:26.424 "compare": false, 00:22:26.424 "compare_and_write": false, 00:22:26.424 "abort": true, 00:22:26.424 "seek_hole": false, 00:22:26.424 "seek_data": false, 00:22:26.424 "copy": true, 00:22:26.424 "nvme_iov_md": false 00:22:26.424 }, 00:22:26.424 "memory_domains": [ 00:22:26.424 { 00:22:26.424 "dma_device_id": "system", 00:22:26.424 "dma_device_type": 1 00:22:26.424 }, 00:22:26.424 { 00:22:26.424 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:26.424 "dma_device_type": 2 00:22:26.424 } 00:22:26.424 ], 00:22:26.424 "driver_specific": { 00:22:26.424 "passthru": { 00:22:26.424 "name": "pt4", 00:22:26.424 "base_bdev_name": "malloc4" 00:22:26.424 } 00:22:26.424 } 00:22:26.424 }' 00:22:26.424 00:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:26.424 00:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:26.424 00:17:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:26.424 00:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:26.424 00:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:26.424 00:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:26.424 00:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:26.683 00:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:26.683 00:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:26.683 00:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:26.683 00:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:26.683 00:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:26.683 00:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:26.683 00:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:22:26.941 [2024-07-16 00:17:13.776172] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:26.941 00:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=248e0554-c6c6-4f65-8795-5169d5aed9c1 00:22:26.941 00:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 248e0554-c6c6-4f65-8795-5169d5aed9c1 ']' 00:22:26.941 00:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:27.200 [2024-07-16 00:17:14.028552] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:27.200 
[2024-07-16 00:17:14.028578] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:27.200 [2024-07-16 00:17:14.028634] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:27.200 [2024-07-16 00:17:14.028717] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:27.200 [2024-07-16 00:17:14.028729] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd21530 name raid_bdev1, state offline 00:22:27.200 00:17:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.200 00:17:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:22:27.459 00:17:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:22:27.459 00:17:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:22:27.459 00:17:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:27.459 00:17:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:27.718 00:17:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:27.718 00:17:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:27.977 00:17:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:27.977 00:17:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:22:28.237 00:17:15 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:28.237 00:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:28.496 00:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:22:28.496 00:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:22:28.754 00:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:22:28.755 00:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:28.755 00:17:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:22:28.755 00:17:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:28.755 00:17:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:28.755 00:17:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:28.755 00:17:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:28.755 00:17:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:28.755 00:17:15 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:28.755 00:17:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:28.755 00:17:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:28.755 00:17:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:28.755 00:17:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:29.013 [2024-07-16 00:17:15.745010] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:22:29.013 [2024-07-16 00:17:15.746353] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:22:29.013 [2024-07-16 00:17:15.746397] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:22:29.013 [2024-07-16 00:17:15.746430] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:22:29.013 [2024-07-16 00:17:15.746476] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:22:29.013 [2024-07-16 00:17:15.746522] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:22:29.013 [2024-07-16 00:17:15.746546] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:22:29.013 [2024-07-16 00:17:15.746568] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:22:29.013 [2024-07-16 
00:17:15.746585] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:29.013 [2024-07-16 00:17:15.746595] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xeccff0 name raid_bdev1, state configuring 00:22:29.013 request: 00:22:29.013 { 00:22:29.013 "name": "raid_bdev1", 00:22:29.013 "raid_level": "raid1", 00:22:29.013 "base_bdevs": [ 00:22:29.013 "malloc1", 00:22:29.013 "malloc2", 00:22:29.013 "malloc3", 00:22:29.013 "malloc4" 00:22:29.013 ], 00:22:29.013 "superblock": false, 00:22:29.013 "method": "bdev_raid_create", 00:22:29.013 "req_id": 1 00:22:29.013 } 00:22:29.013 Got JSON-RPC error response 00:22:29.013 response: 00:22:29.013 { 00:22:29.013 "code": -17, 00:22:29.013 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:22:29.013 } 00:22:29.013 00:17:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:22:29.013 00:17:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:29.013 00:17:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:29.013 00:17:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:29.013 00:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:29.013 00:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:22:29.271 00:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:22:29.271 00:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:22:29.271 00:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:29.530 [2024-07-16 00:17:16.246270] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:29.530 [2024-07-16 00:17:16.246321] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:29.530 [2024-07-16 00:17:16.246343] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd297a0 00:22:29.530 [2024-07-16 00:17:16.246356] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:29.530 [2024-07-16 00:17:16.248034] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:29.530 [2024-07-16 00:17:16.248064] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:29.530 [2024-07-16 00:17:16.248139] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:29.530 [2024-07-16 00:17:16.248167] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:29.530 pt1 00:22:29.530 00:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:22:29.530 00:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:29.530 00:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:29.530 00:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:29.530 00:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:29.530 00:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:29.530 00:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:29.530 00:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:29.530 00:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:29.530 00:17:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:29.530 00:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:29.530 00:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:29.788 00:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:29.788 "name": "raid_bdev1", 00:22:29.788 "uuid": "248e0554-c6c6-4f65-8795-5169d5aed9c1", 00:22:29.788 "strip_size_kb": 0, 00:22:29.788 "state": "configuring", 00:22:29.788 "raid_level": "raid1", 00:22:29.788 "superblock": true, 00:22:29.788 "num_base_bdevs": 4, 00:22:29.788 "num_base_bdevs_discovered": 1, 00:22:29.788 "num_base_bdevs_operational": 4, 00:22:29.788 "base_bdevs_list": [ 00:22:29.788 { 00:22:29.788 "name": "pt1", 00:22:29.788 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:29.788 "is_configured": true, 00:22:29.788 "data_offset": 2048, 00:22:29.788 "data_size": 63488 00:22:29.788 }, 00:22:29.788 { 00:22:29.788 "name": null, 00:22:29.788 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:29.788 "is_configured": false, 00:22:29.788 "data_offset": 2048, 00:22:29.788 "data_size": 63488 00:22:29.788 }, 00:22:29.788 { 00:22:29.788 "name": null, 00:22:29.788 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:29.788 "is_configured": false, 00:22:29.788 "data_offset": 2048, 00:22:29.788 "data_size": 63488 00:22:29.788 }, 00:22:29.788 { 00:22:29.788 "name": null, 00:22:29.788 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:29.788 "is_configured": false, 00:22:29.788 "data_offset": 2048, 00:22:29.788 "data_size": 63488 00:22:29.788 } 00:22:29.788 ] 00:22:29.788 }' 00:22:29.788 00:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:29.788 00:17:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:22:30.353 00:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:22:30.353 00:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:30.611 [2024-07-16 00:17:17.333181] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:30.611 [2024-07-16 00:17:17.333231] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:30.611 [2024-07-16 00:17:17.333250] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xec2940 00:22:30.611 [2024-07-16 00:17:17.333262] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:30.611 [2024-07-16 00:17:17.333609] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:30.611 [2024-07-16 00:17:17.333626] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:30.611 [2024-07-16 00:17:17.333690] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:30.611 [2024-07-16 00:17:17.333708] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:30.611 pt2 00:22:30.611 00:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:30.868 [2024-07-16 00:17:17.577850] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:22:30.869 00:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:22:30.869 00:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:30.869 00:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:22:30.869 00:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:30.869 00:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:30.869 00:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:30.869 00:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:30.869 00:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:30.869 00:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:30.869 00:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:30.869 00:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:30.869 00:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:31.126 00:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:31.126 "name": "raid_bdev1", 00:22:31.126 "uuid": "248e0554-c6c6-4f65-8795-5169d5aed9c1", 00:22:31.126 "strip_size_kb": 0, 00:22:31.126 "state": "configuring", 00:22:31.126 "raid_level": "raid1", 00:22:31.126 "superblock": true, 00:22:31.126 "num_base_bdevs": 4, 00:22:31.126 "num_base_bdevs_discovered": 1, 00:22:31.126 "num_base_bdevs_operational": 4, 00:22:31.126 "base_bdevs_list": [ 00:22:31.126 { 00:22:31.126 "name": "pt1", 00:22:31.126 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:31.126 "is_configured": true, 00:22:31.126 "data_offset": 2048, 00:22:31.126 "data_size": 63488 00:22:31.126 }, 00:22:31.126 { 00:22:31.126 "name": null, 00:22:31.126 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:31.126 "is_configured": false, 00:22:31.126 "data_offset": 2048, 00:22:31.126 
"data_size": 63488 00:22:31.126 }, 00:22:31.126 { 00:22:31.126 "name": null, 00:22:31.126 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:31.126 "is_configured": false, 00:22:31.126 "data_offset": 2048, 00:22:31.126 "data_size": 63488 00:22:31.126 }, 00:22:31.126 { 00:22:31.126 "name": null, 00:22:31.126 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:31.126 "is_configured": false, 00:22:31.126 "data_offset": 2048, 00:22:31.126 "data_size": 63488 00:22:31.126 } 00:22:31.126 ] 00:22:31.126 }' 00:22:31.126 00:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:31.126 00:17:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:31.689 00:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:22:31.689 00:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:31.689 00:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:31.947 [2024-07-16 00:17:18.656709] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:31.947 [2024-07-16 00:17:18.656755] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:31.947 [2024-07-16 00:17:18.656773] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd20060 00:22:31.947 [2024-07-16 00:17:18.656786] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:31.947 [2024-07-16 00:17:18.657133] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:31.947 [2024-07-16 00:17:18.657152] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:31.947 [2024-07-16 00:17:18.657215] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 
pt2 00:22:31.947 [2024-07-16 00:17:18.657235] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:31.947 pt2 00:22:31.947 00:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:31.947 00:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:31.947 00:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:32.206 [2024-07-16 00:17:18.905387] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:32.206 [2024-07-16 00:17:18.905425] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:32.206 [2024-07-16 00:17:18.905443] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd228d0 00:22:32.206 [2024-07-16 00:17:18.905455] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:32.206 [2024-07-16 00:17:18.905757] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:32.206 [2024-07-16 00:17:18.905773] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:32.206 [2024-07-16 00:17:18.905828] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:22:32.206 [2024-07-16 00:17:18.905846] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:32.206 pt3 00:22:32.206 00:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:32.206 00:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:32.206 00:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 
00000000-0000-0000-0000-000000000004 00:22:32.466 [2024-07-16 00:17:19.158053] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:32.466 [2024-07-16 00:17:19.158084] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:32.466 [2024-07-16 00:17:19.158099] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd23b80 00:22:32.466 [2024-07-16 00:17:19.158110] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:32.466 [2024-07-16 00:17:19.158385] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:32.466 [2024-07-16 00:17:19.158403] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:32.466 [2024-07-16 00:17:19.158451] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:32.466 [2024-07-16 00:17:19.158468] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:32.466 [2024-07-16 00:17:19.158585] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd20780 00:22:32.466 [2024-07-16 00:17:19.158596] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:32.466 [2024-07-16 00:17:19.158765] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd25fa0 00:22:32.466 [2024-07-16 00:17:19.158898] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd20780 00:22:32.466 [2024-07-16 00:17:19.158908] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd20780 00:22:32.466 [2024-07-16 00:17:19.159011] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:32.466 pt4 00:22:32.466 00:17:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:32.466 00:17:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:32.466 00:17:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:32.466 00:17:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:32.466 00:17:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:32.466 00:17:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:32.466 00:17:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:32.466 00:17:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:32.466 00:17:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:32.466 00:17:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:32.466 00:17:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:32.466 00:17:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:32.466 00:17:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:32.466 00:17:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.725 00:17:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:32.725 "name": "raid_bdev1", 00:22:32.725 "uuid": "248e0554-c6c6-4f65-8795-5169d5aed9c1", 00:22:32.725 "strip_size_kb": 0, 00:22:32.725 "state": "online", 00:22:32.725 "raid_level": "raid1", 00:22:32.725 "superblock": true, 00:22:32.725 "num_base_bdevs": 4, 00:22:32.725 "num_base_bdevs_discovered": 4, 00:22:32.725 "num_base_bdevs_operational": 4, 00:22:32.725 "base_bdevs_list": [ 00:22:32.725 { 00:22:32.725 "name": "pt1", 00:22:32.725 "uuid": "00000000-0000-0000-0000-000000000001", 
00:22:32.725 "is_configured": true, 00:22:32.725 "data_offset": 2048, 00:22:32.725 "data_size": 63488 00:22:32.725 }, 00:22:32.725 { 00:22:32.725 "name": "pt2", 00:22:32.725 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:32.725 "is_configured": true, 00:22:32.725 "data_offset": 2048, 00:22:32.725 "data_size": 63488 00:22:32.725 }, 00:22:32.725 { 00:22:32.725 "name": "pt3", 00:22:32.725 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:32.725 "is_configured": true, 00:22:32.725 "data_offset": 2048, 00:22:32.725 "data_size": 63488 00:22:32.725 }, 00:22:32.725 { 00:22:32.725 "name": "pt4", 00:22:32.725 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:32.725 "is_configured": true, 00:22:32.725 "data_offset": 2048, 00:22:32.725 "data_size": 63488 00:22:32.725 } 00:22:32.725 ] 00:22:32.725 }' 00:22:32.725 00:17:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:32.725 00:17:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:33.293 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:22:33.293 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:33.293 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:33.293 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:33.293 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:33.293 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:33.293 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:33.293 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:33.551 [2024-07-16 00:17:20.245265] 
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:33.551 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:33.551 "name": "raid_bdev1", 00:22:33.551 "aliases": [ 00:22:33.551 "248e0554-c6c6-4f65-8795-5169d5aed9c1" 00:22:33.551 ], 00:22:33.551 "product_name": "Raid Volume", 00:22:33.551 "block_size": 512, 00:22:33.551 "num_blocks": 63488, 00:22:33.551 "uuid": "248e0554-c6c6-4f65-8795-5169d5aed9c1", 00:22:33.551 "assigned_rate_limits": { 00:22:33.551 "rw_ios_per_sec": 0, 00:22:33.551 "rw_mbytes_per_sec": 0, 00:22:33.551 "r_mbytes_per_sec": 0, 00:22:33.551 "w_mbytes_per_sec": 0 00:22:33.551 }, 00:22:33.551 "claimed": false, 00:22:33.551 "zoned": false, 00:22:33.551 "supported_io_types": { 00:22:33.551 "read": true, 00:22:33.551 "write": true, 00:22:33.551 "unmap": false, 00:22:33.551 "flush": false, 00:22:33.551 "reset": true, 00:22:33.551 "nvme_admin": false, 00:22:33.551 "nvme_io": false, 00:22:33.551 "nvme_io_md": false, 00:22:33.551 "write_zeroes": true, 00:22:33.551 "zcopy": false, 00:22:33.551 "get_zone_info": false, 00:22:33.551 "zone_management": false, 00:22:33.551 "zone_append": false, 00:22:33.551 "compare": false, 00:22:33.551 "compare_and_write": false, 00:22:33.551 "abort": false, 00:22:33.551 "seek_hole": false, 00:22:33.551 "seek_data": false, 00:22:33.551 "copy": false, 00:22:33.551 "nvme_iov_md": false 00:22:33.551 }, 00:22:33.551 "memory_domains": [ 00:22:33.551 { 00:22:33.551 "dma_device_id": "system", 00:22:33.551 "dma_device_type": 1 00:22:33.551 }, 00:22:33.551 { 00:22:33.551 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:33.551 "dma_device_type": 2 00:22:33.551 }, 00:22:33.551 { 00:22:33.551 "dma_device_id": "system", 00:22:33.552 "dma_device_type": 1 00:22:33.552 }, 00:22:33.552 { 00:22:33.552 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:33.552 "dma_device_type": 2 00:22:33.552 }, 00:22:33.552 { 00:22:33.552 "dma_device_id": "system", 00:22:33.552 
"dma_device_type": 1 00:22:33.552 }, 00:22:33.552 { 00:22:33.552 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:33.552 "dma_device_type": 2 00:22:33.552 }, 00:22:33.552 { 00:22:33.552 "dma_device_id": "system", 00:22:33.552 "dma_device_type": 1 00:22:33.552 }, 00:22:33.552 { 00:22:33.552 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:33.552 "dma_device_type": 2 00:22:33.552 } 00:22:33.552 ], 00:22:33.552 "driver_specific": { 00:22:33.552 "raid": { 00:22:33.552 "uuid": "248e0554-c6c6-4f65-8795-5169d5aed9c1", 00:22:33.552 "strip_size_kb": 0, 00:22:33.552 "state": "online", 00:22:33.552 "raid_level": "raid1", 00:22:33.552 "superblock": true, 00:22:33.552 "num_base_bdevs": 4, 00:22:33.552 "num_base_bdevs_discovered": 4, 00:22:33.552 "num_base_bdevs_operational": 4, 00:22:33.552 "base_bdevs_list": [ 00:22:33.552 { 00:22:33.552 "name": "pt1", 00:22:33.552 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:33.552 "is_configured": true, 00:22:33.552 "data_offset": 2048, 00:22:33.552 "data_size": 63488 00:22:33.552 }, 00:22:33.552 { 00:22:33.552 "name": "pt2", 00:22:33.552 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:33.552 "is_configured": true, 00:22:33.552 "data_offset": 2048, 00:22:33.552 "data_size": 63488 00:22:33.552 }, 00:22:33.552 { 00:22:33.552 "name": "pt3", 00:22:33.552 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:33.552 "is_configured": true, 00:22:33.552 "data_offset": 2048, 00:22:33.552 "data_size": 63488 00:22:33.552 }, 00:22:33.552 { 00:22:33.552 "name": "pt4", 00:22:33.552 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:33.552 "is_configured": true, 00:22:33.552 "data_offset": 2048, 00:22:33.552 "data_size": 63488 00:22:33.552 } 00:22:33.552 ] 00:22:33.552 } 00:22:33.552 } 00:22:33.552 }' 00:22:33.552 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:33.552 00:17:20 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:33.552 pt2 00:22:33.552 pt3 00:22:33.552 pt4' 00:22:33.552 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:33.552 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:33.552 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:33.810 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:33.810 "name": "pt1", 00:22:33.810 "aliases": [ 00:22:33.810 "00000000-0000-0000-0000-000000000001" 00:22:33.810 ], 00:22:33.810 "product_name": "passthru", 00:22:33.810 "block_size": 512, 00:22:33.810 "num_blocks": 65536, 00:22:33.810 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:33.810 "assigned_rate_limits": { 00:22:33.810 "rw_ios_per_sec": 0, 00:22:33.810 "rw_mbytes_per_sec": 0, 00:22:33.810 "r_mbytes_per_sec": 0, 00:22:33.810 "w_mbytes_per_sec": 0 00:22:33.810 }, 00:22:33.810 "claimed": true, 00:22:33.810 "claim_type": "exclusive_write", 00:22:33.810 "zoned": false, 00:22:33.810 "supported_io_types": { 00:22:33.810 "read": true, 00:22:33.810 "write": true, 00:22:33.810 "unmap": true, 00:22:33.810 "flush": true, 00:22:33.810 "reset": true, 00:22:33.810 "nvme_admin": false, 00:22:33.810 "nvme_io": false, 00:22:33.810 "nvme_io_md": false, 00:22:33.810 "write_zeroes": true, 00:22:33.810 "zcopy": true, 00:22:33.810 "get_zone_info": false, 00:22:33.810 "zone_management": false, 00:22:33.810 "zone_append": false, 00:22:33.810 "compare": false, 00:22:33.810 "compare_and_write": false, 00:22:33.810 "abort": true, 00:22:33.810 "seek_hole": false, 00:22:33.810 "seek_data": false, 00:22:33.810 "copy": true, 00:22:33.810 "nvme_iov_md": false 00:22:33.810 }, 00:22:33.810 "memory_domains": [ 00:22:33.810 { 00:22:33.810 "dma_device_id": "system", 00:22:33.810 
"dma_device_type": 1 00:22:33.810 }, 00:22:33.810 { 00:22:33.810 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:33.810 "dma_device_type": 2 00:22:33.810 } 00:22:33.810 ], 00:22:33.810 "driver_specific": { 00:22:33.810 "passthru": { 00:22:33.810 "name": "pt1", 00:22:33.810 "base_bdev_name": "malloc1" 00:22:33.810 } 00:22:33.810 } 00:22:33.810 }' 00:22:33.810 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:33.810 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:33.810 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:33.810 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:33.810 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:33.810 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:33.810 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:34.068 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:34.068 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:34.068 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:34.068 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:34.068 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:34.068 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:34.068 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:34.068 00:17:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:34.327 00:17:21 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:34.327 "name": "pt2", 00:22:34.327 "aliases": [ 00:22:34.327 "00000000-0000-0000-0000-000000000002" 00:22:34.327 ], 00:22:34.327 "product_name": "passthru", 00:22:34.327 "block_size": 512, 00:22:34.327 "num_blocks": 65536, 00:22:34.327 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:34.327 "assigned_rate_limits": { 00:22:34.327 "rw_ios_per_sec": 0, 00:22:34.327 "rw_mbytes_per_sec": 0, 00:22:34.327 "r_mbytes_per_sec": 0, 00:22:34.327 "w_mbytes_per_sec": 0 00:22:34.327 }, 00:22:34.327 "claimed": true, 00:22:34.327 "claim_type": "exclusive_write", 00:22:34.327 "zoned": false, 00:22:34.327 "supported_io_types": { 00:22:34.327 "read": true, 00:22:34.327 "write": true, 00:22:34.327 "unmap": true, 00:22:34.327 "flush": true, 00:22:34.327 "reset": true, 00:22:34.327 "nvme_admin": false, 00:22:34.327 "nvme_io": false, 00:22:34.327 "nvme_io_md": false, 00:22:34.327 "write_zeroes": true, 00:22:34.327 "zcopy": true, 00:22:34.327 "get_zone_info": false, 00:22:34.327 "zone_management": false, 00:22:34.327 "zone_append": false, 00:22:34.327 "compare": false, 00:22:34.327 "compare_and_write": false, 00:22:34.327 "abort": true, 00:22:34.327 "seek_hole": false, 00:22:34.327 "seek_data": false, 00:22:34.327 "copy": true, 00:22:34.327 "nvme_iov_md": false 00:22:34.327 }, 00:22:34.327 "memory_domains": [ 00:22:34.327 { 00:22:34.327 "dma_device_id": "system", 00:22:34.327 "dma_device_type": 1 00:22:34.327 }, 00:22:34.327 { 00:22:34.327 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:34.327 "dma_device_type": 2 00:22:34.327 } 00:22:34.327 ], 00:22:34.327 "driver_specific": { 00:22:34.327 "passthru": { 00:22:34.327 "name": "pt2", 00:22:34.327 "base_bdev_name": "malloc2" 00:22:34.327 } 00:22:34.327 } 00:22:34.327 }' 00:22:34.327 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:34.327 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:34.327 00:17:21 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:34.327 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:34.585 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:34.585 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:34.585 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:34.585 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:34.585 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:34.585 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:34.585 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:34.585 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:34.585 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:34.585 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:22:34.585 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:34.844 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:34.844 "name": "pt3", 00:22:34.844 "aliases": [ 00:22:34.844 "00000000-0000-0000-0000-000000000003" 00:22:34.844 ], 00:22:34.844 "product_name": "passthru", 00:22:34.844 "block_size": 512, 00:22:34.844 "num_blocks": 65536, 00:22:34.844 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:34.844 "assigned_rate_limits": { 00:22:34.844 "rw_ios_per_sec": 0, 00:22:34.844 "rw_mbytes_per_sec": 0, 00:22:34.844 "r_mbytes_per_sec": 0, 00:22:34.844 "w_mbytes_per_sec": 0 00:22:34.844 }, 00:22:34.844 "claimed": true, 00:22:34.844 
"claim_type": "exclusive_write", 00:22:34.844 "zoned": false, 00:22:34.844 "supported_io_types": { 00:22:34.844 "read": true, 00:22:34.844 "write": true, 00:22:34.844 "unmap": true, 00:22:34.844 "flush": true, 00:22:34.844 "reset": true, 00:22:34.844 "nvme_admin": false, 00:22:34.844 "nvme_io": false, 00:22:34.844 "nvme_io_md": false, 00:22:34.844 "write_zeroes": true, 00:22:34.844 "zcopy": true, 00:22:34.844 "get_zone_info": false, 00:22:34.844 "zone_management": false, 00:22:34.844 "zone_append": false, 00:22:34.844 "compare": false, 00:22:34.844 "compare_and_write": false, 00:22:34.844 "abort": true, 00:22:34.844 "seek_hole": false, 00:22:34.844 "seek_data": false, 00:22:34.844 "copy": true, 00:22:34.844 "nvme_iov_md": false 00:22:34.844 }, 00:22:34.844 "memory_domains": [ 00:22:34.844 { 00:22:34.844 "dma_device_id": "system", 00:22:34.844 "dma_device_type": 1 00:22:34.844 }, 00:22:34.844 { 00:22:34.844 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:34.844 "dma_device_type": 2 00:22:34.844 } 00:22:34.844 ], 00:22:34.844 "driver_specific": { 00:22:34.844 "passthru": { 00:22:34.844 "name": "pt3", 00:22:34.844 "base_bdev_name": "malloc3" 00:22:34.844 } 00:22:34.844 } 00:22:34.844 }' 00:22:34.844 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:35.103 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:35.103 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:35.103 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:35.103 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:35.103 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:35.103 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:35.103 00:17:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:22:35.103 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:35.103 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:35.361 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:35.361 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:35.361 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:35.361 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:35.361 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:35.619 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:35.619 "name": "pt4", 00:22:35.619 "aliases": [ 00:22:35.619 "00000000-0000-0000-0000-000000000004" 00:22:35.619 ], 00:22:35.619 "product_name": "passthru", 00:22:35.619 "block_size": 512, 00:22:35.619 "num_blocks": 65536, 00:22:35.619 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:35.619 "assigned_rate_limits": { 00:22:35.619 "rw_ios_per_sec": 0, 00:22:35.619 "rw_mbytes_per_sec": 0, 00:22:35.619 "r_mbytes_per_sec": 0, 00:22:35.619 "w_mbytes_per_sec": 0 00:22:35.619 }, 00:22:35.619 "claimed": true, 00:22:35.619 "claim_type": "exclusive_write", 00:22:35.619 "zoned": false, 00:22:35.619 "supported_io_types": { 00:22:35.619 "read": true, 00:22:35.619 "write": true, 00:22:35.620 "unmap": true, 00:22:35.620 "flush": true, 00:22:35.620 "reset": true, 00:22:35.620 "nvme_admin": false, 00:22:35.620 "nvme_io": false, 00:22:35.620 "nvme_io_md": false, 00:22:35.620 "write_zeroes": true, 00:22:35.620 "zcopy": true, 00:22:35.620 "get_zone_info": false, 00:22:35.620 "zone_management": false, 00:22:35.620 "zone_append": false, 00:22:35.620 "compare": false, 00:22:35.620 
"compare_and_write": false, 00:22:35.620 "abort": true, 00:22:35.620 "seek_hole": false, 00:22:35.620 "seek_data": false, 00:22:35.620 "copy": true, 00:22:35.620 "nvme_iov_md": false 00:22:35.620 }, 00:22:35.620 "memory_domains": [ 00:22:35.620 { 00:22:35.620 "dma_device_id": "system", 00:22:35.620 "dma_device_type": 1 00:22:35.620 }, 00:22:35.620 { 00:22:35.620 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:35.620 "dma_device_type": 2 00:22:35.620 } 00:22:35.620 ], 00:22:35.620 "driver_specific": { 00:22:35.620 "passthru": { 00:22:35.620 "name": "pt4", 00:22:35.620 "base_bdev_name": "malloc4" 00:22:35.620 } 00:22:35.620 } 00:22:35.620 }' 00:22:35.620 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:35.620 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:35.620 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:35.620 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:35.620 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:35.620 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:35.620 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:35.879 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:35.879 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:35.879 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:35.879 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:35.879 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:35.879 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:35.879 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:22:36.137 [2024-07-16 00:17:22.888278] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:36.138 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 248e0554-c6c6-4f65-8795-5169d5aed9c1 '!=' 248e0554-c6c6-4f65-8795-5169d5aed9c1 ']' 00:22:36.138 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:22:36.138 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:36.138 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:36.138 00:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:36.421 [2024-07-16 00:17:23.140664] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:22:36.421 00:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:36.421 00:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:36.421 00:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:36.421 00:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:36.421 00:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:36.421 00:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:36.421 00:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:36.421 00:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:36.421 00:17:23 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:36.421 00:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:36.421 00:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.421 00:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:36.697 00:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:36.697 "name": "raid_bdev1", 00:22:36.697 "uuid": "248e0554-c6c6-4f65-8795-5169d5aed9c1", 00:22:36.697 "strip_size_kb": 0, 00:22:36.697 "state": "online", 00:22:36.697 "raid_level": "raid1", 00:22:36.697 "superblock": true, 00:22:36.697 "num_base_bdevs": 4, 00:22:36.697 "num_base_bdevs_discovered": 3, 00:22:36.697 "num_base_bdevs_operational": 3, 00:22:36.697 "base_bdevs_list": [ 00:22:36.697 { 00:22:36.697 "name": null, 00:22:36.697 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:36.697 "is_configured": false, 00:22:36.697 "data_offset": 2048, 00:22:36.697 "data_size": 63488 00:22:36.697 }, 00:22:36.697 { 00:22:36.697 "name": "pt2", 00:22:36.697 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:36.697 "is_configured": true, 00:22:36.697 "data_offset": 2048, 00:22:36.697 "data_size": 63488 00:22:36.697 }, 00:22:36.697 { 00:22:36.697 "name": "pt3", 00:22:36.697 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:36.697 "is_configured": true, 00:22:36.697 "data_offset": 2048, 00:22:36.697 "data_size": 63488 00:22:36.697 }, 00:22:36.697 { 00:22:36.697 "name": "pt4", 00:22:36.697 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:36.697 "is_configured": true, 00:22:36.697 "data_offset": 2048, 00:22:36.697 "data_size": 63488 00:22:36.697 } 00:22:36.697 ] 00:22:36.697 }' 00:22:36.697 00:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:36.697 
00:17:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:37.264 00:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:37.523 [2024-07-16 00:17:24.251582] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:37.523 [2024-07-16 00:17:24.251611] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:37.523 [2024-07-16 00:17:24.251665] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:37.523 [2024-07-16 00:17:24.251730] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:37.523 [2024-07-16 00:17:24.251742] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd20780 name raid_bdev1, state offline 00:22:37.523 00:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.523 00:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:22:37.782 00:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:22:37.782 00:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:22:37.782 00:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:22:37.782 00:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:37.782 00:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:38.041 00:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:38.041 00:17:24 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:38.041 00:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:22:38.300 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:38.300 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:38.300 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:38.559 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:38.559 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:38.559 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:22:38.559 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:38.559 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:38.559 [2024-07-16 00:17:25.490824] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:38.559 [2024-07-16 00:17:25.490870] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:38.559 [2024-07-16 00:17:25.490889] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xec3700 00:22:38.559 [2024-07-16 00:17:25.490902] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:38.559 [2024-07-16 00:17:25.492515] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:38.559 [2024-07-16 00:17:25.492545] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: pt2 00:22:38.559 [2024-07-16 00:17:25.492609] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:38.559 [2024-07-16 00:17:25.492636] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:38.559 pt2 00:22:38.818 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:22:38.818 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:38.818 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:38.818 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:38.818 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:38.818 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:38.818 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:38.818 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:38.818 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:38.818 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:38.818 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.818 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:38.818 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:38.818 "name": "raid_bdev1", 00:22:38.818 "uuid": "248e0554-c6c6-4f65-8795-5169d5aed9c1", 00:22:38.818 "strip_size_kb": 0, 00:22:38.818 "state": "configuring", 
00:22:38.818 "raid_level": "raid1", 00:22:38.818 "superblock": true, 00:22:38.818 "num_base_bdevs": 4, 00:22:38.819 "num_base_bdevs_discovered": 1, 00:22:38.819 "num_base_bdevs_operational": 3, 00:22:38.819 "base_bdevs_list": [ 00:22:38.819 { 00:22:38.819 "name": null, 00:22:38.819 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:38.819 "is_configured": false, 00:22:38.819 "data_offset": 2048, 00:22:38.819 "data_size": 63488 00:22:38.819 }, 00:22:38.819 { 00:22:38.819 "name": "pt2", 00:22:38.819 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:38.819 "is_configured": true, 00:22:38.819 "data_offset": 2048, 00:22:38.819 "data_size": 63488 00:22:38.819 }, 00:22:38.819 { 00:22:38.819 "name": null, 00:22:38.819 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:38.819 "is_configured": false, 00:22:38.819 "data_offset": 2048, 00:22:38.819 "data_size": 63488 00:22:38.819 }, 00:22:38.819 { 00:22:38.819 "name": null, 00:22:38.819 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:38.819 "is_configured": false, 00:22:38.819 "data_offset": 2048, 00:22:38.819 "data_size": 63488 00:22:38.819 } 00:22:38.819 ] 00:22:38.819 }' 00:22:38.819 00:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:38.819 00:17:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:39.756 00:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:22:39.756 00:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:39.756 00:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:39.756 [2024-07-16 00:17:26.529602] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:39.756 [2024-07-16 00:17:26.529652] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:39.756 [2024-07-16 00:17:26.529675] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd29a10 00:22:39.756 [2024-07-16 00:17:26.529687] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:39.756 [2024-07-16 00:17:26.530045] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:39.756 [2024-07-16 00:17:26.530062] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:39.756 [2024-07-16 00:17:26.530126] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:22:39.756 [2024-07-16 00:17:26.530144] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:39.756 pt3 00:22:39.756 00:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:22:39.756 00:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:39.756 00:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:39.756 00:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:39.756 00:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:39.756 00:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:39.756 00:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:39.756 00:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:39.756 00:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:39.756 00:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:39.756 00:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.756 00:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:40.015 00:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:40.015 "name": "raid_bdev1", 00:22:40.015 "uuid": "248e0554-c6c6-4f65-8795-5169d5aed9c1", 00:22:40.015 "strip_size_kb": 0, 00:22:40.015 "state": "configuring", 00:22:40.015 "raid_level": "raid1", 00:22:40.015 "superblock": true, 00:22:40.015 "num_base_bdevs": 4, 00:22:40.015 "num_base_bdevs_discovered": 2, 00:22:40.015 "num_base_bdevs_operational": 3, 00:22:40.015 "base_bdevs_list": [ 00:22:40.015 { 00:22:40.015 "name": null, 00:22:40.015 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:40.015 "is_configured": false, 00:22:40.015 "data_offset": 2048, 00:22:40.015 "data_size": 63488 00:22:40.015 }, 00:22:40.015 { 00:22:40.015 "name": "pt2", 00:22:40.015 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:40.015 "is_configured": true, 00:22:40.015 "data_offset": 2048, 00:22:40.015 "data_size": 63488 00:22:40.015 }, 00:22:40.015 { 00:22:40.015 "name": "pt3", 00:22:40.015 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:40.015 "is_configured": true, 00:22:40.015 "data_offset": 2048, 00:22:40.015 "data_size": 63488 00:22:40.015 }, 00:22:40.015 { 00:22:40.015 "name": null, 00:22:40.015 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:40.015 "is_configured": false, 00:22:40.015 "data_offset": 2048, 00:22:40.015 "data_size": 63488 00:22:40.015 } 00:22:40.015 ] 00:22:40.015 }' 00:22:40.015 00:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:40.015 00:17:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:40.582 00:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:22:40.582 00:17:27 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:40.582 00:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:22:40.583 00:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:40.841 [2024-07-16 00:17:27.636616] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:40.841 [2024-07-16 00:17:27.636665] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:40.841 [2024-07-16 00:17:27.636684] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xecc520 00:22:40.841 [2024-07-16 00:17:27.636696] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:40.841 [2024-07-16 00:17:27.637051] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:40.841 [2024-07-16 00:17:27.637069] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:40.841 [2024-07-16 00:17:27.637131] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:40.841 [2024-07-16 00:17:27.637151] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:40.841 [2024-07-16 00:17:27.637262] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd20ea0 00:22:40.841 [2024-07-16 00:17:27.637272] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:40.841 [2024-07-16 00:17:27.637439] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd25600 00:22:40.841 [2024-07-16 00:17:27.637568] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd20ea0 00:22:40.841 [2024-07-16 00:17:27.637578] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd20ea0 
00:22:40.841 [2024-07-16 00:17:27.637672] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:40.841 pt4 00:22:40.841 00:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:40.841 00:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:40.842 00:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:40.842 00:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:40.842 00:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:40.842 00:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:40.842 00:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:40.842 00:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:40.842 00:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:40.842 00:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:40.842 00:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:40.842 00:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:41.100 00:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:41.101 "name": "raid_bdev1", 00:22:41.101 "uuid": "248e0554-c6c6-4f65-8795-5169d5aed9c1", 00:22:41.101 "strip_size_kb": 0, 00:22:41.101 "state": "online", 00:22:41.101 "raid_level": "raid1", 00:22:41.101 "superblock": true, 00:22:41.101 "num_base_bdevs": 4, 00:22:41.101 "num_base_bdevs_discovered": 3, 00:22:41.101 
"num_base_bdevs_operational": 3, 00:22:41.101 "base_bdevs_list": [ 00:22:41.101 { 00:22:41.101 "name": null, 00:22:41.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:41.101 "is_configured": false, 00:22:41.101 "data_offset": 2048, 00:22:41.101 "data_size": 63488 00:22:41.101 }, 00:22:41.101 { 00:22:41.101 "name": "pt2", 00:22:41.101 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:41.101 "is_configured": true, 00:22:41.101 "data_offset": 2048, 00:22:41.101 "data_size": 63488 00:22:41.101 }, 00:22:41.101 { 00:22:41.101 "name": "pt3", 00:22:41.101 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:41.101 "is_configured": true, 00:22:41.101 "data_offset": 2048, 00:22:41.101 "data_size": 63488 00:22:41.101 }, 00:22:41.101 { 00:22:41.101 "name": "pt4", 00:22:41.101 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:41.101 "is_configured": true, 00:22:41.101 "data_offset": 2048, 00:22:41.101 "data_size": 63488 00:22:41.101 } 00:22:41.101 ] 00:22:41.101 }' 00:22:41.101 00:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:41.101 00:17:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:41.669 00:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:41.928 [2024-07-16 00:17:28.747564] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:41.929 [2024-07-16 00:17:28.747591] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:41.929 [2024-07-16 00:17:28.747641] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:41.929 [2024-07-16 00:17:28.747707] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:41.929 [2024-07-16 00:17:28.747718] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0xd20ea0 name raid_bdev1, state offline 00:22:41.929 00:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.929 00:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:22:42.188 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:22:42.188 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:22:42.188 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:22:42.188 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:22:42.188 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:42.448 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:42.707 [2024-07-16 00:17:29.493501] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:42.707 [2024-07-16 00:17:29.493550] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:42.707 [2024-07-16 00:17:29.493568] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xecc520 00:22:42.707 [2024-07-16 00:17:29.493581] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:42.707 [2024-07-16 00:17:29.495204] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:42.707 [2024-07-16 00:17:29.495236] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:42.707 [2024-07-16 00:17:29.495304] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt1 00:22:42.707 [2024-07-16 00:17:29.495330] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:42.707 [2024-07-16 00:17:29.495433] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:22:42.707 [2024-07-16 00:17:29.495446] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:42.707 [2024-07-16 00:17:29.495460] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd20060 name raid_bdev1, state configuring 00:22:42.707 [2024-07-16 00:17:29.495483] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:42.707 [2024-07-16 00:17:29.495559] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:42.707 pt1 00:22:42.707 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:22:42.707 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:22:42.707 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:42.707 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:42.707 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:42.707 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:42.707 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:42.707 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:42.707 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:42.707 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:42.707 00:17:29 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:42.707 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.707 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:42.966 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:42.966 "name": "raid_bdev1", 00:22:42.966 "uuid": "248e0554-c6c6-4f65-8795-5169d5aed9c1", 00:22:42.966 "strip_size_kb": 0, 00:22:42.966 "state": "configuring", 00:22:42.966 "raid_level": "raid1", 00:22:42.966 "superblock": true, 00:22:42.966 "num_base_bdevs": 4, 00:22:42.966 "num_base_bdevs_discovered": 2, 00:22:42.966 "num_base_bdevs_operational": 3, 00:22:42.966 "base_bdevs_list": [ 00:22:42.966 { 00:22:42.966 "name": null, 00:22:42.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:42.966 "is_configured": false, 00:22:42.966 "data_offset": 2048, 00:22:42.966 "data_size": 63488 00:22:42.966 }, 00:22:42.966 { 00:22:42.966 "name": "pt2", 00:22:42.966 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:42.966 "is_configured": true, 00:22:42.966 "data_offset": 2048, 00:22:42.966 "data_size": 63488 00:22:42.966 }, 00:22:42.966 { 00:22:42.966 "name": "pt3", 00:22:42.966 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:42.966 "is_configured": true, 00:22:42.966 "data_offset": 2048, 00:22:42.966 "data_size": 63488 00:22:42.966 }, 00:22:42.966 { 00:22:42.966 "name": null, 00:22:42.966 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:42.966 "is_configured": false, 00:22:42.966 "data_offset": 2048, 00:22:42.966 "data_size": 63488 00:22:42.966 } 00:22:42.966 ] 00:22:42.966 }' 00:22:42.966 00:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:42.966 00:17:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:22:43.534 00:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:22:43.534 00:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:43.793 00:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:22:43.793 00:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:44.053 [2024-07-16 00:17:30.841147] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:44.053 [2024-07-16 00:17:30.841203] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:44.053 [2024-07-16 00:17:30.841222] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd20310 00:22:44.053 [2024-07-16 00:17:30.841235] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:44.053 [2024-07-16 00:17:30.841592] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:44.053 [2024-07-16 00:17:30.841610] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:44.053 [2024-07-16 00:17:30.841675] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:44.053 [2024-07-16 00:17:30.841695] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:44.053 [2024-07-16 00:17:30.841809] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd23b40 00:22:44.053 [2024-07-16 00:17:30.841820] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:44.053 [2024-07-16 00:17:30.842007] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0xec3990 00:22:44.053 [2024-07-16 00:17:30.842149] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd23b40 00:22:44.053 [2024-07-16 00:17:30.842159] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd23b40 00:22:44.053 [2024-07-16 00:17:30.842254] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:44.053 pt4 00:22:44.053 00:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:44.053 00:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:44.053 00:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:44.053 00:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:44.053 00:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:44.053 00:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:44.053 00:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:44.053 00:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:44.053 00:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:44.053 00:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:44.053 00:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.053 00:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:44.313 00:17:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:44.313 "name": "raid_bdev1", 
00:22:44.313 "uuid": "248e0554-c6c6-4f65-8795-5169d5aed9c1", 00:22:44.313 "strip_size_kb": 0, 00:22:44.313 "state": "online", 00:22:44.313 "raid_level": "raid1", 00:22:44.313 "superblock": true, 00:22:44.313 "num_base_bdevs": 4, 00:22:44.313 "num_base_bdevs_discovered": 3, 00:22:44.313 "num_base_bdevs_operational": 3, 00:22:44.313 "base_bdevs_list": [ 00:22:44.313 { 00:22:44.313 "name": null, 00:22:44.313 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:44.313 "is_configured": false, 00:22:44.313 "data_offset": 2048, 00:22:44.313 "data_size": 63488 00:22:44.313 }, 00:22:44.313 { 00:22:44.313 "name": "pt2", 00:22:44.313 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:44.313 "is_configured": true, 00:22:44.313 "data_offset": 2048, 00:22:44.313 "data_size": 63488 00:22:44.313 }, 00:22:44.313 { 00:22:44.313 "name": "pt3", 00:22:44.313 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:44.313 "is_configured": true, 00:22:44.313 "data_offset": 2048, 00:22:44.313 "data_size": 63488 00:22:44.313 }, 00:22:44.313 { 00:22:44.313 "name": "pt4", 00:22:44.313 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:44.313 "is_configured": true, 00:22:44.313 "data_offset": 2048, 00:22:44.313 "data_size": 63488 00:22:44.313 } 00:22:44.313 ] 00:22:44.313 }' 00:22:44.313 00:17:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:44.313 00:17:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:44.881 00:17:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:22:44.881 00:17:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:45.140 00:17:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:22:45.140 00:17:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:45.140 00:17:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:22:45.397 [2024-07-16 00:17:32.197042] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:45.397 00:17:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 248e0554-c6c6-4f65-8795-5169d5aed9c1 '!=' 248e0554-c6c6-4f65-8795-5169d5aed9c1 ']' 00:22:45.397 00:17:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 3591694 00:22:45.397 00:17:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 3591694 ']' 00:22:45.397 00:17:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 3591694 00:22:45.397 00:17:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:22:45.397 00:17:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:45.397 00:17:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3591694 00:22:45.397 00:17:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:45.397 00:17:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:45.397 00:17:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3591694' 00:22:45.397 killing process with pid 3591694 00:22:45.397 00:17:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 3591694 00:22:45.397 [2024-07-16 00:17:32.268138] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:45.397 [2024-07-16 00:17:32.268194] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:45.397 [2024-07-16 00:17:32.268259] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: 
raid bdev base bdevs is 0, going to free all in destruct 00:22:45.397 [2024-07-16 00:17:32.268271] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd23b40 name raid_bdev1, state offline 00:22:45.397 00:17:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 3591694 00:22:45.397 [2024-07-16 00:17:32.310909] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:45.655 00:17:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:22:45.655 00:22:45.655 real 0m25.820s 00:22:45.655 user 0m47.189s 00:22:45.655 sys 0m4.686s 00:22:45.655 00:17:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:45.655 00:17:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:45.655 ************************************ 00:22:45.655 END TEST raid_superblock_test 00:22:45.655 ************************************ 00:22:45.655 00:17:32 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:45.655 00:17:32 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:22:45.655 00:17:32 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:45.655 00:17:32 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:45.655 00:17:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:45.914 ************************************ 00:22:45.914 START TEST raid_read_error_test 00:22:45.914 ************************************ 00:22:45.914 00:17:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read 00:22:45.914 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:22:45.914 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:22:45.914 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:22:45.915 
00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:45.915 00:17:32 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.yCKgXbRmMe 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3595544 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3595544 /var/tmp/spdk-raid.sock 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 3595544 ']' 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:45.915 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:45.915 00:17:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:45.915 [2024-07-16 00:17:32.711670] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:22:45.915 [2024-07-16 00:17:32.711739] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3595544 ] 00:22:45.915 [2024-07-16 00:17:32.841388] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:46.173 [2024-07-16 00:17:32.944514] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:46.173 [2024-07-16 00:17:33.002901] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:46.173 [2024-07-16 00:17:33.002948] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:46.739 00:17:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:46.739 00:17:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:22:46.739 00:17:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:46.739 00:17:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:46.997 BaseBdev1_malloc 00:22:46.997 00:17:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:47.255 true 00:22:47.255 00:17:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:47.514 [2024-07-16 00:17:34.356367] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:47.514 [2024-07-16 00:17:34.356410] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:47.514 [2024-07-16 00:17:34.356436] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9a40d0 00:22:47.514 [2024-07-16 00:17:34.356449] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:47.514 [2024-07-16 00:17:34.358303] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:47.514 [2024-07-16 00:17:34.358332] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:47.514 BaseBdev1 00:22:47.514 00:17:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:47.514 00:17:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:47.772 BaseBdev2_malloc 00:22:47.772 00:17:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:48.030 true 00:22:48.030 00:17:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:48.288 [2024-07-16 00:17:35.094903] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:48.288 [2024-07-16 00:17:35.094956] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:48.288 [2024-07-16 00:17:35.094977] vbdev_passthru.c: 680:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x9a8910 00:22:48.288 [2024-07-16 00:17:35.094989] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:48.288 [2024-07-16 00:17:35.096618] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:48.288 [2024-07-16 00:17:35.096647] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:48.288 BaseBdev2 00:22:48.288 00:17:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:48.288 00:17:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:48.546 BaseBdev3_malloc 00:22:48.546 00:17:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:48.804 true 00:22:48.804 00:17:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:49.063 [2024-07-16 00:17:35.846699] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:22:49.063 [2024-07-16 00:17:35.846744] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:49.063 [2024-07-16 00:17:35.846765] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9aabd0 00:22:49.063 [2024-07-16 00:17:35.846777] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:49.063 [2024-07-16 00:17:35.848391] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:49.063 [2024-07-16 00:17:35.848433] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:49.063 
BaseBdev3 00:22:49.063 00:17:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:49.063 00:17:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:49.322 BaseBdev4_malloc 00:22:49.322 00:17:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:22:49.580 true 00:22:49.580 00:17:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:22:49.839 [2024-07-16 00:17:36.542381] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:22:49.839 [2024-07-16 00:17:36.542427] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:49.839 [2024-07-16 00:17:36.542447] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9abaa0 00:22:49.839 [2024-07-16 00:17:36.542460] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:49.839 [2024-07-16 00:17:36.543903] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:49.839 [2024-07-16 00:17:36.543939] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:49.839 BaseBdev4 00:22:49.839 00:17:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:22:49.839 [2024-07-16 00:17:36.722886] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:49.839 [2024-07-16 
00:17:36.724124] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:49.839 [2024-07-16 00:17:36.724191] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:49.839 [2024-07-16 00:17:36.724253] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:49.839 [2024-07-16 00:17:36.724490] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9a5c20 00:22:49.839 [2024-07-16 00:17:36.724501] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:49.839 [2024-07-16 00:17:36.724682] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7fa260 00:22:49.839 [2024-07-16 00:17:36.724831] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9a5c20 00:22:49.839 [2024-07-16 00:17:36.724841] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x9a5c20 00:22:49.839 [2024-07-16 00:17:36.724949] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:49.839 00:17:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:49.839 00:17:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:49.839 00:17:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:49.839 00:17:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:49.839 00:17:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:49.839 00:17:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:49.839 00:17:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:49.839 00:17:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:22:49.839 00:17:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:49.839 00:17:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:49.839 00:17:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.839 00:17:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:50.098 00:17:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:50.098 "name": "raid_bdev1", 00:22:50.098 "uuid": "0647f82f-00e9-4134-8628-9489d25bf577", 00:22:50.098 "strip_size_kb": 0, 00:22:50.098 "state": "online", 00:22:50.098 "raid_level": "raid1", 00:22:50.098 "superblock": true, 00:22:50.098 "num_base_bdevs": 4, 00:22:50.098 "num_base_bdevs_discovered": 4, 00:22:50.098 "num_base_bdevs_operational": 4, 00:22:50.098 "base_bdevs_list": [ 00:22:50.098 { 00:22:50.098 "name": "BaseBdev1", 00:22:50.098 "uuid": "e25e112c-7349-51a1-be4f-865d5f1fbb50", 00:22:50.098 "is_configured": true, 00:22:50.098 "data_offset": 2048, 00:22:50.098 "data_size": 63488 00:22:50.098 }, 00:22:50.098 { 00:22:50.098 "name": "BaseBdev2", 00:22:50.098 "uuid": "9a80943c-f794-5600-a8d4-943b36a43164", 00:22:50.098 "is_configured": true, 00:22:50.098 "data_offset": 2048, 00:22:50.098 "data_size": 63488 00:22:50.098 }, 00:22:50.098 { 00:22:50.098 "name": "BaseBdev3", 00:22:50.098 "uuid": "9b78de23-b75d-5054-b428-f2c8301e1d9f", 00:22:50.098 "is_configured": true, 00:22:50.098 "data_offset": 2048, 00:22:50.098 "data_size": 63488 00:22:50.098 }, 00:22:50.098 { 00:22:50.098 "name": "BaseBdev4", 00:22:50.098 "uuid": "7815259d-2ab5-5a0c-b168-ba268f9c7317", 00:22:50.098 "is_configured": true, 00:22:50.098 "data_offset": 2048, 00:22:50.098 "data_size": 63488 00:22:50.098 } 00:22:50.098 ] 00:22:50.098 }' 00:22:50.098 00:17:36 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:50.098 00:17:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:50.687 00:17:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:22:50.687 00:17:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:50.687 [2024-07-16 00:17:37.569399] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7f9c60 00:22:51.676 00:17:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:22:51.934 00:17:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:51.934 00:17:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:22:51.934 00:17:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:22:51.934 00:17:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:22:51.934 00:17:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:51.934 00:17:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:51.934 00:17:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:51.934 00:17:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:51.934 00:17:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:51.934 00:17:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:51.934 00:17:38 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:51.934 00:17:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:51.934 00:17:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:51.934 00:17:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:51.934 00:17:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.934 00:17:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:52.194 00:17:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:52.194 "name": "raid_bdev1", 00:22:52.194 "uuid": "0647f82f-00e9-4134-8628-9489d25bf577", 00:22:52.194 "strip_size_kb": 0, 00:22:52.194 "state": "online", 00:22:52.194 "raid_level": "raid1", 00:22:52.194 "superblock": true, 00:22:52.194 "num_base_bdevs": 4, 00:22:52.194 "num_base_bdevs_discovered": 4, 00:22:52.194 "num_base_bdevs_operational": 4, 00:22:52.194 "base_bdevs_list": [ 00:22:52.194 { 00:22:52.194 "name": "BaseBdev1", 00:22:52.194 "uuid": "e25e112c-7349-51a1-be4f-865d5f1fbb50", 00:22:52.194 "is_configured": true, 00:22:52.194 "data_offset": 2048, 00:22:52.194 "data_size": 63488 00:22:52.194 }, 00:22:52.194 { 00:22:52.194 "name": "BaseBdev2", 00:22:52.194 "uuid": "9a80943c-f794-5600-a8d4-943b36a43164", 00:22:52.194 "is_configured": true, 00:22:52.194 "data_offset": 2048, 00:22:52.194 "data_size": 63488 00:22:52.194 }, 00:22:52.194 { 00:22:52.194 "name": "BaseBdev3", 00:22:52.194 "uuid": "9b78de23-b75d-5054-b428-f2c8301e1d9f", 00:22:52.194 "is_configured": true, 00:22:52.194 "data_offset": 2048, 00:22:52.194 "data_size": 63488 00:22:52.194 }, 00:22:52.194 { 00:22:52.194 "name": "BaseBdev4", 00:22:52.194 "uuid": "7815259d-2ab5-5a0c-b168-ba268f9c7317", 00:22:52.194 "is_configured": 
true, 00:22:52.194 "data_offset": 2048, 00:22:52.194 "data_size": 63488 00:22:52.194 } 00:22:52.194 ] 00:22:52.194 }' 00:22:52.194 00:17:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:52.194 00:17:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:52.762 00:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:53.022 [2024-07-16 00:17:39.788146] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:53.022 [2024-07-16 00:17:39.788186] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:53.022 [2024-07-16 00:17:39.791421] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:53.022 [2024-07-16 00:17:39.791459] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:53.022 [2024-07-16 00:17:39.791578] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:53.022 [2024-07-16 00:17:39.791590] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9a5c20 name raid_bdev1, state offline 00:22:53.022 0 00:22:53.022 00:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3595544 00:22:53.022 00:17:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 3595544 ']' 00:22:53.022 00:17:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 3595544 00:22:53.022 00:17:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:22:53.022 00:17:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:53.022 00:17:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3595544 00:22:53.022 00:17:39 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:53.022 00:17:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:53.022 00:17:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3595544' 00:22:53.022 killing process with pid 3595544 00:22:53.022 00:17:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 3595544 00:22:53.022 [2024-07-16 00:17:39.874316] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:53.022 00:17:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 3595544 00:22:53.022 [2024-07-16 00:17:39.906451] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:53.282 00:17:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.yCKgXbRmMe 00:22:53.282 00:17:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:53.282 00:17:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:53.282 00:17:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:22:53.282 00:17:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:22:53.282 00:17:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:53.282 00:17:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:53.282 00:17:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:22:53.282 00:22:53.282 real 0m7.513s 00:22:53.282 user 0m11.972s 00:22:53.282 sys 0m1.358s 00:22:53.282 00:17:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:53.282 00:17:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:53.282 ************************************ 00:22:53.282 END TEST 
raid_read_error_test 00:22:53.282 ************************************ 00:22:53.282 00:17:40 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:53.282 00:17:40 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:22:53.282 00:17:40 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:53.282 00:17:40 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:53.282 00:17:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:53.542 ************************************ 00:22:53.542 START TEST raid_write_error_test 00:22:53.542 ************************************ 00:22:53.542 00:17:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write 00:22:53.542 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:22:53.542 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:22:53.542 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:22:53.542 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:53.542 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:53.542 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:22:53.542 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:53.542 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:53.542 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:22:53.542 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:53.542 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:53.542 00:17:40 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:22:53.542 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:53.542 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.kQ5Chh1bGI 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3596528 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3596528 /var/tmp/spdk-raid.sock 00:22:53.543 
00:17:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 3596528 ']' 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:53.543 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:53.543 00:17:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:53.543 [2024-07-16 00:17:40.316611] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:22:53.543 [2024-07-16 00:17:40.316680] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3596528 ] 00:22:53.543 [2024-07-16 00:17:40.448378] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:53.802 [2024-07-16 00:17:40.553939] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:53.802 [2024-07-16 00:17:40.625165] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:53.802 [2024-07-16 00:17:40.625214] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:54.371 00:17:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:54.371 00:17:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:22:54.371 00:17:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:54.371 00:17:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:54.630 BaseBdev1_malloc 00:22:54.630 00:17:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:54.890 true 00:22:54.890 00:17:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:55.148 [2024-07-16 00:17:41.920180] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:55.148 [2024-07-16 00:17:41.920224] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:22:55.148 [2024-07-16 00:17:41.920245] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa450d0 00:22:55.148 [2024-07-16 00:17:41.920258] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:55.148 [2024-07-16 00:17:41.922054] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:55.148 [2024-07-16 00:17:41.922084] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:55.148 BaseBdev1 00:22:55.148 00:17:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:55.148 00:17:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:55.407 BaseBdev2_malloc 00:22:55.407 00:17:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:55.665 true 00:22:55.665 00:17:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:55.924 [2024-07-16 00:17:42.670830] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:55.924 [2024-07-16 00:17:42.670877] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:55.924 [2024-07-16 00:17:42.670896] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa49910 00:22:55.924 [2024-07-16 00:17:42.670909] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:55.924 [2024-07-16 00:17:42.672338] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:55.924 [2024-07-16 00:17:42.672366] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:55.924 BaseBdev2 00:22:55.924 00:17:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:55.924 00:17:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:56.183 BaseBdev3_malloc 00:22:56.183 00:17:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:56.441 true 00:22:56.441 00:17:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:56.699 [2024-07-16 00:17:43.401331] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:22:56.700 [2024-07-16 00:17:43.401375] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:56.700 [2024-07-16 00:17:43.401399] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa4bbd0 00:22:56.700 [2024-07-16 00:17:43.401412] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:56.700 [2024-07-16 00:17:43.402807] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:56.700 [2024-07-16 00:17:43.402834] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:56.700 BaseBdev3 00:22:56.700 00:17:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:56.700 00:17:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:56.958 BaseBdev4_malloc 00:22:56.958 00:17:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:22:56.958 true 00:22:57.217 00:17:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:22:57.217 [2024-07-16 00:17:44.151875] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:22:57.217 [2024-07-16 00:17:44.151919] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:57.217 [2024-07-16 00:17:44.151945] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa4caa0 00:22:57.217 [2024-07-16 00:17:44.151958] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:57.217 [2024-07-16 00:17:44.153325] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:57.217 [2024-07-16 00:17:44.153350] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:57.217 BaseBdev4 00:22:57.476 00:17:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:22:57.476 [2024-07-16 00:17:44.392540] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:57.476 [2024-07-16 00:17:44.393711] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:57.476 [2024-07-16 00:17:44.393775] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:57.476 [2024-07-16 00:17:44.393835] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:57.476 [2024-07-16 00:17:44.394069] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa46c20 00:22:57.476 [2024-07-16 00:17:44.394081] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:57.476 [2024-07-16 00:17:44.394251] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x89b260 00:22:57.476 [2024-07-16 00:17:44.394397] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa46c20 00:22:57.476 [2024-07-16 00:17:44.394407] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa46c20 00:22:57.476 [2024-07-16 00:17:44.394504] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:57.476 00:17:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:57.476 00:17:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:57.476 00:17:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:57.476 00:17:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:57.476 00:17:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:57.476 00:17:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:57.476 00:17:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:57.476 00:17:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:57.476 00:17:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:57.476 00:17:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:57.476 00:17:44 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.476 00:17:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:57.734 00:17:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:57.734 "name": "raid_bdev1", 00:22:57.734 "uuid": "86683b45-98ba-4233-be88-99e192f68dba", 00:22:57.734 "strip_size_kb": 0, 00:22:57.734 "state": "online", 00:22:57.734 "raid_level": "raid1", 00:22:57.734 "superblock": true, 00:22:57.734 "num_base_bdevs": 4, 00:22:57.734 "num_base_bdevs_discovered": 4, 00:22:57.734 "num_base_bdevs_operational": 4, 00:22:57.734 "base_bdevs_list": [ 00:22:57.734 { 00:22:57.734 "name": "BaseBdev1", 00:22:57.734 "uuid": "f668ad1b-8b44-5f64-be86-0f51c00a7ac8", 00:22:57.734 "is_configured": true, 00:22:57.734 "data_offset": 2048, 00:22:57.734 "data_size": 63488 00:22:57.734 }, 00:22:57.734 { 00:22:57.734 "name": "BaseBdev2", 00:22:57.734 "uuid": "f0ace5bc-10af-5a9a-be8a-b7246cf2600a", 00:22:57.734 "is_configured": true, 00:22:57.734 "data_offset": 2048, 00:22:57.734 "data_size": 63488 00:22:57.734 }, 00:22:57.734 { 00:22:57.734 "name": "BaseBdev3", 00:22:57.734 "uuid": "f4a2a300-8671-5db0-b2f2-d16045e06367", 00:22:57.734 "is_configured": true, 00:22:57.734 "data_offset": 2048, 00:22:57.734 "data_size": 63488 00:22:57.734 }, 00:22:57.734 { 00:22:57.734 "name": "BaseBdev4", 00:22:57.734 "uuid": "894571d6-b18b-5366-9a8e-379ea58561f0", 00:22:57.734 "is_configured": true, 00:22:57.734 "data_offset": 2048, 00:22:57.734 "data_size": 63488 00:22:57.734 } 00:22:57.734 ] 00:22:57.734 }' 00:22:57.734 00:17:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:57.734 00:17:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:58.672 00:17:45 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:22:58.672 00:17:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:58.672 [2024-07-16 00:17:45.383471] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x89ac60 00:22:59.609 00:17:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:22:59.609 [2024-07-16 00:17:46.508251] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:22:59.609 [2024-07-16 00:17:46.508315] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:59.609 [2024-07-16 00:17:46.508527] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x89ac60 00:22:59.609 00:17:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:59.609 00:17:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:22:59.609 00:17:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:22:59.610 00:17:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:22:59.610 00:17:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:59.610 00:17:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:59.610 00:17:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:59.610 00:17:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:59.610 00:17:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:22:59.610 00:17:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:59.610 00:17:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:59.610 00:17:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:59.610 00:17:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:59.610 00:17:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:59.610 00:17:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.610 00:17:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:59.868 00:17:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:59.868 "name": "raid_bdev1", 00:22:59.868 "uuid": "86683b45-98ba-4233-be88-99e192f68dba", 00:22:59.868 "strip_size_kb": 0, 00:22:59.868 "state": "online", 00:22:59.868 "raid_level": "raid1", 00:22:59.868 "superblock": true, 00:22:59.868 "num_base_bdevs": 4, 00:22:59.868 "num_base_bdevs_discovered": 3, 00:22:59.868 "num_base_bdevs_operational": 3, 00:22:59.868 "base_bdevs_list": [ 00:22:59.868 { 00:22:59.868 "name": null, 00:22:59.868 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:59.868 "is_configured": false, 00:22:59.868 "data_offset": 2048, 00:22:59.868 "data_size": 63488 00:22:59.868 }, 00:22:59.868 { 00:22:59.868 "name": "BaseBdev2", 00:22:59.868 "uuid": "f0ace5bc-10af-5a9a-be8a-b7246cf2600a", 00:22:59.868 "is_configured": true, 00:22:59.868 "data_offset": 2048, 00:22:59.868 "data_size": 63488 00:22:59.868 }, 00:22:59.868 { 00:22:59.868 "name": "BaseBdev3", 00:22:59.868 "uuid": "f4a2a300-8671-5db0-b2f2-d16045e06367", 00:22:59.868 "is_configured": true, 00:22:59.868 "data_offset": 2048, 
00:22:59.868 "data_size": 63488 00:22:59.868 }, 00:22:59.868 { 00:22:59.868 "name": "BaseBdev4", 00:22:59.868 "uuid": "894571d6-b18b-5366-9a8e-379ea58561f0", 00:22:59.868 "is_configured": true, 00:22:59.868 "data_offset": 2048, 00:22:59.868 "data_size": 63488 00:22:59.868 } 00:22:59.868 ] 00:22:59.868 }' 00:22:59.868 00:17:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:59.868 00:17:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:00.435 00:17:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:01.004 [2024-07-16 00:17:47.803216] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:01.004 [2024-07-16 00:17:47.803257] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:01.004 [2024-07-16 00:17:47.806498] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:01.004 [2024-07-16 00:17:47.806534] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:01.004 [2024-07-16 00:17:47.806630] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:01.004 [2024-07-16 00:17:47.806642] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa46c20 name raid_bdev1, state offline 00:23:01.004 0 00:23:01.004 00:17:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3596528 00:23:01.004 00:17:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 3596528 ']' 00:23:01.004 00:17:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 3596528 00:23:01.004 00:17:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:23:01.004 00:17:47 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:01.004 00:17:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3596528 00:23:01.004 00:17:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:01.004 00:17:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:01.004 00:17:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3596528' 00:23:01.004 killing process with pid 3596528 00:23:01.004 00:17:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 3596528 00:23:01.004 [2024-07-16 00:17:47.887608] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:01.004 00:17:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 3596528 00:23:01.004 [2024-07-16 00:17:47.923900] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:01.263 00:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.kQ5Chh1bGI 00:23:01.263 00:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:23:01.263 00:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:23:01.263 00:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:23:01.263 00:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:23:01.264 00:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:01.264 00:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:01.264 00:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:23:01.264 00:23:01.264 real 0m7.934s 00:23:01.264 user 0m12.730s 00:23:01.264 sys 0m1.440s 00:23:01.264 00:17:48 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:23:01.264 00:17:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:01.264 ************************************ 00:23:01.264 END TEST raid_write_error_test 00:23:01.264 ************************************ 00:23:01.522 00:17:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:01.522 00:17:48 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:23:01.522 00:17:48 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:23:01.522 00:17:48 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:23:01.522 00:17:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:01.522 00:17:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:01.522 00:17:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:01.522 ************************************ 00:23:01.522 START TEST raid_rebuild_test 00:23:01.522 ************************************ 00:23:01.522 00:17:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 
00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=3597675 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 3597675 /var/tmp/spdk-raid.sock 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:01.523 00:17:48 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 3597675 ']' 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:01.523 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:01.523 00:17:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:01.523 [2024-07-16 00:17:48.325982] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:23:01.523 [2024-07-16 00:17:48.326052] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3597675 ] 00:23:01.523 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:01.523 Zero copy mechanism will not be used. 
00:23:01.523 [2024-07-16 00:17:48.456206] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:01.781 [2024-07-16 00:17:48.558288] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:01.782 [2024-07-16 00:17:48.625347] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:01.782 [2024-07-16 00:17:48.625385] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:02.349 00:17:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:02.349 00:17:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:23:02.349 00:17:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:02.349 00:17:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:02.608 BaseBdev1_malloc 00:23:02.608 00:17:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:02.608 [2024-07-16 00:17:49.528378] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:02.608 [2024-07-16 00:17:49.528429] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:02.608 [2024-07-16 00:17:49.528449] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11e5d40 00:23:02.608 [2024-07-16 00:17:49.528462] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:02.608 [2024-07-16 00:17:49.530014] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:02.608 [2024-07-16 00:17:49.530040] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:02.608 BaseBdev1 00:23:02.608 00:17:49 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:02.608 00:17:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:02.867 BaseBdev2_malloc 00:23:02.867 00:17:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:03.126 [2024-07-16 00:17:49.886063] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:03.126 [2024-07-16 00:17:49.886102] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:03.126 [2024-07-16 00:17:49.886123] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11e6860 00:23:03.126 [2024-07-16 00:17:49.886136] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:03.126 [2024-07-16 00:17:49.887489] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:03.126 [2024-07-16 00:17:49.887517] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:03.126 BaseBdev2 00:23:03.126 00:17:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:03.385 spare_malloc 00:23:03.385 00:17:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:03.385 spare_delay 00:23:03.643 00:17:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b spare_delay -p spare 00:23:03.643 [2024-07-16 00:17:50.508437] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:03.643 [2024-07-16 00:17:50.508483] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:03.643 [2024-07-16 00:17:50.508502] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1394ec0 00:23:03.643 [2024-07-16 00:17:50.508514] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:03.643 [2024-07-16 00:17:50.509917] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:03.643 [2024-07-16 00:17:50.509953] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:03.643 spare 00:23:03.643 00:17:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:03.902 [2024-07-16 00:17:50.688935] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:03.902 [2024-07-16 00:17:50.690108] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:03.902 [2024-07-16 00:17:50.690179] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1396070 00:23:03.902 [2024-07-16 00:17:50.690190] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:03.902 [2024-07-16 00:17:50.690381] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x138f490 00:23:03.902 [2024-07-16 00:17:50.690516] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1396070 00:23:03.902 [2024-07-16 00:17:50.690526] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1396070 00:23:03.902 [2024-07-16 00:17:50.690631] bdev_raid.c: 331:raid_bdev_destroy_cb: 
*DEBUG*: raid_bdev_destroy_cb 00:23:03.902 00:17:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:03.902 00:17:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:03.902 00:17:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:03.902 00:17:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:03.902 00:17:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:03.902 00:17:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:03.902 00:17:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:03.902 00:17:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:03.902 00:17:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:03.902 00:17:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:03.902 00:17:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:03.902 00:17:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.160 00:17:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:04.160 "name": "raid_bdev1", 00:23:04.160 "uuid": "6749b172-ac8f-4124-ab57-c054ba5f7ffd", 00:23:04.160 "strip_size_kb": 0, 00:23:04.160 "state": "online", 00:23:04.160 "raid_level": "raid1", 00:23:04.160 "superblock": false, 00:23:04.160 "num_base_bdevs": 2, 00:23:04.160 "num_base_bdevs_discovered": 2, 00:23:04.160 "num_base_bdevs_operational": 2, 00:23:04.160 "base_bdevs_list": [ 00:23:04.160 { 00:23:04.160 "name": "BaseBdev1", 00:23:04.160 "uuid": 
"2de4b935-eba3-5b0d-add2-5f1f0d9229b7", 00:23:04.160 "is_configured": true, 00:23:04.160 "data_offset": 0, 00:23:04.160 "data_size": 65536 00:23:04.160 }, 00:23:04.160 { 00:23:04.160 "name": "BaseBdev2", 00:23:04.160 "uuid": "56c03b63-1f00-5976-bfba-5f15e9a09752", 00:23:04.160 "is_configured": true, 00:23:04.160 "data_offset": 0, 00:23:04.160 "data_size": 65536 00:23:04.160 } 00:23:04.160 ] 00:23:04.160 }' 00:23:04.160 00:17:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:04.160 00:17:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:04.735 00:17:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:04.735 00:17:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:04.998 [2024-07-16 00:17:51.715873] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:04.998 00:17:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:23:04.998 00:17:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.998 00:17:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:04.998 00:17:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:23:04.998 00:17:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:04.998 00:17:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:23:04.998 00:17:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:23:04.998 00:17:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:04.998 00:17:51 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:23:04.998 00:17:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1')
00:23:04.998 00:17:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list
00:23:04.998 00:17:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0')
00:23:04.998 00:17:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list
00:23:04.998 00:17:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i
00:23:04.998 00:17:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:23:04.998 00:17:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:23:04.998 00:17:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0
00:23:05.257 [2024-07-16 00:17:52.152848] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x138f490
00:23:05.257 /dev/nbd0
00:23:05.257 00:17:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:23:05.257 00:17:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:23:05.257 00:17:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:23:05.257 00:17:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i
00:23:05.257 00:17:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:23:05.257 00:17:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:23:05.257 00:17:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:23:05.257 00:17:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break
00:23:05.257 00:17:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:23:05.257 00:17:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:23:05.257 00:17:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:23:05.257 1+0 records in
00:23:05.257 1+0 records out
00:23:05.257 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284735 s, 14.4 MB/s
00:23:05.257 00:17:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:23:05.578 00:17:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096
00:23:05.578 00:17:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:23:05.578 00:17:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:23:05.578 00:17:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0
00:23:05.578 00:17:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:23:05.578 00:17:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:23:05.578 00:17:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']'
00:23:05.578 00:17:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1
00:23:05.578 00:17:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct
00:23:10.846 65536+0 records in
00:23:10.846 65536+0 records out
00:23:10.846 33554432 bytes (34 MB, 32 MiB) copied, 5.56204 s, 6.0 MB/s
00:23:10.846 00:17:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0
00:23:10.846 00:17:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:23:10.846 00:17:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0')
00:23:10.846 00:17:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list
00:23:10.846 00:17:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i
00:23:10.846 00:17:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:23:10.846 00:17:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0
00:23:11.105 00:17:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:23:11.105 [2024-07-16 00:17:57.981487] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:23:11.105 00:17:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:23:11.105 00:17:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:23:11.105 00:17:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:23:11.105 00:17:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:23:11.105 00:17:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:23:11.105 00:17:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break
00:23:11.105 00:17:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0
00:23:11.105 00:17:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1
00:23:11.363 [2024-07-16 00:17:58.206133] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:23:11.364 00:17:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:23:11.364 00:17:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:23:11.364 00:17:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:23:11.364 00:17:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:23:11.364 00:17:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:23:11.364 00:17:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:23:11.364 00:17:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:23:11.364 00:17:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:23:11.364 00:17:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:23:11.364 00:17:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:23:11.364 00:17:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:11.364 00:17:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:23:11.622 00:17:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:23:11.622 "name": "raid_bdev1",
00:23:11.622 "uuid": "6749b172-ac8f-4124-ab57-c054ba5f7ffd",
00:23:11.622 "strip_size_kb": 0,
00:23:11.622 "state": "online",
00:23:11.622 "raid_level": "raid1",
00:23:11.622 "superblock": false,
00:23:11.622 "num_base_bdevs": 2,
00:23:11.622 "num_base_bdevs_discovered": 1,
00:23:11.622 "num_base_bdevs_operational": 1,
00:23:11.622 "base_bdevs_list": [
00:23:11.622 {
00:23:11.622 "name": null,
00:23:11.622 "uuid": "00000000-0000-0000-0000-000000000000",
00:23:11.622 "is_configured": false,
00:23:11.622 "data_offset": 0,
00:23:11.622 "data_size": 65536
00:23:11.622 },
00:23:11.622 {
00:23:11.622 "name": "BaseBdev2",
00:23:11.622 "uuid": "56c03b63-1f00-5976-bfba-5f15e9a09752",
00:23:11.622 "is_configured": true,
00:23:11.622 "data_offset": 0,
00:23:11.622 "data_size": 65536
00:23:11.622 }
00:23:11.622 ]
00:23:11.622 }'
00:23:11.622 00:17:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:23:11.622 00:17:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x
00:23:12.189 00:17:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:23:12.447 [2024-07-16 00:17:59.289009] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:23:12.447 [2024-07-16 00:17:59.294083] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1396880
00:23:12.447 [2024-07-16 00:17:59.296332] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:23:12.447 00:17:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1
00:23:13.384 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:23:13.384 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:23:13.384 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:23:13.384 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare
00:23:13.384 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:23:13.384 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:23:13.384 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:13.643 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:23:13.643 "name": "raid_bdev1",
00:23:13.643 "uuid": "6749b172-ac8f-4124-ab57-c054ba5f7ffd",
00:23:13.643 "strip_size_kb": 0,
00:23:13.643 "state": "online",
00:23:13.643 "raid_level": "raid1",
00:23:13.643 "superblock": false,
00:23:13.643 "num_base_bdevs": 2,
00:23:13.643 "num_base_bdevs_discovered": 2,
00:23:13.643 "num_base_bdevs_operational": 2,
00:23:13.643 "process": {
00:23:13.643 "type": "rebuild",
00:23:13.643 "target": "spare",
00:23:13.643 "progress": {
00:23:13.643 "blocks": 24576,
00:23:13.643 "percent": 37
00:23:13.643 }
00:23:13.643 },
00:23:13.643 "base_bdevs_list": [
00:23:13.643 {
00:23:13.643 "name": "spare",
00:23:13.643 "uuid": "4319b221-e58a-53d8-b045-04b940cdc98f",
00:23:13.643 "is_configured": true,
00:23:13.643 "data_offset": 0,
00:23:13.643 "data_size": 65536
00:23:13.643 },
00:23:13.643 {
00:23:13.643 "name": "BaseBdev2",
00:23:13.643 "uuid": "56c03b63-1f00-5976-bfba-5f15e9a09752",
00:23:13.643 "is_configured": true,
00:23:13.643 "data_offset": 0,
00:23:13.643 "data_size": 65536
00:23:13.643 }
00:23:13.643 ]
00:23:13.643 }'
00:23:13.902 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:23:13.902 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:23:13.902 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:23:13.902 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:23:13.902 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare
00:23:14.161 [2024-07-16 00:18:00.886223] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:23:14.161 [2024-07-16 00:18:00.909056] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device
00:23:14.161 [2024-07-16 00:18:00.909105] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:23:14.161 [2024-07-16 00:18:00.909120] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:23:14.161 [2024-07-16 00:18:00.909128] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device
00:23:14.161 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:23:14.161 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:23:14.161 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:23:14.161 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:23:14.161 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:23:14.161 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:23:14.161 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:23:14.161 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:23:14.161 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:23:14.161 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:23:14.161 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:14.161 00:18:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:23:14.420 00:18:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:23:14.420 "name": "raid_bdev1",
00:23:14.420 "uuid": "6749b172-ac8f-4124-ab57-c054ba5f7ffd",
00:23:14.420 "strip_size_kb": 0,
00:23:14.420 "state": "online",
00:23:14.420 "raid_level": "raid1",
00:23:14.420 "superblock": false,
00:23:14.420 "num_base_bdevs": 2,
00:23:14.420 "num_base_bdevs_discovered": 1,
00:23:14.420 "num_base_bdevs_operational": 1,
00:23:14.420 "base_bdevs_list": [
00:23:14.420 {
00:23:14.420 "name": null,
00:23:14.420 "uuid": "00000000-0000-0000-0000-000000000000",
00:23:14.420 "is_configured": false,
00:23:14.420 "data_offset": 0,
00:23:14.420 "data_size": 65536
00:23:14.420 },
00:23:14.420 {
00:23:14.420 "name": "BaseBdev2",
00:23:14.420 "uuid": "56c03b63-1f00-5976-bfba-5f15e9a09752",
00:23:14.420 "is_configured": true,
00:23:14.420 "data_offset": 0,
00:23:14.420 "data_size": 65536
00:23:14.420 }
00:23:14.420 ]
00:23:14.420 }'
00:23:14.420 00:18:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:23:14.420 00:18:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x
00:23:14.987 00:18:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none
00:23:14.987 00:18:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:23:14.987 00:18:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:23:14.987 00:18:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none
00:23:14.987 00:18:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:23:14.987 00:18:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:14.987 00:18:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:23:15.246 00:18:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:23:15.246 "name": "raid_bdev1",
00:23:15.246 "uuid": "6749b172-ac8f-4124-ab57-c054ba5f7ffd",
00:23:15.246 "strip_size_kb": 0,
00:23:15.246 "state": "online",
00:23:15.246 "raid_level": "raid1",
00:23:15.246 "superblock": false,
00:23:15.246 "num_base_bdevs": 2,
00:23:15.246 "num_base_bdevs_discovered": 1,
00:23:15.246 "num_base_bdevs_operational": 1,
00:23:15.246 "base_bdevs_list": [
00:23:15.246 {
00:23:15.246 "name": null,
00:23:15.246 "uuid": "00000000-0000-0000-0000-000000000000",
00:23:15.246 "is_configured": false,
00:23:15.246 "data_offset": 0,
00:23:15.246 "data_size": 65536
00:23:15.246 },
00:23:15.246 {
00:23:15.246 "name": "BaseBdev2",
00:23:15.246 "uuid": "56c03b63-1f00-5976-bfba-5f15e9a09752",
00:23:15.246 "is_configured": true,
00:23:15.246 "data_offset": 0,
00:23:15.246 "data_size": 65536
00:23:15.246 }
00:23:15.246 ]
00:23:15.246 }'
00:23:15.246 00:18:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:23:15.246 00:18:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:23:15.246 00:18:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:23:15.246 00:18:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:23:15.246 00:18:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:23:15.505 [2024-07-16 00:18:02.377515] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:23:15.505 [2024-07-16 00:18:02.383145] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x138f490
00:23:15.505 [2024-07-16 00:18:02.384692] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:23:15.505 00:18:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:23:16.892 "name": "raid_bdev1",
00:23:16.892 "uuid": "6749b172-ac8f-4124-ab57-c054ba5f7ffd",
00:23:16.892 "strip_size_kb": 0,
00:23:16.892 "state": "online",
00:23:16.892 "raid_level": "raid1",
00:23:16.892 "superblock": false,
00:23:16.892 "num_base_bdevs": 2,
00:23:16.892 "num_base_bdevs_discovered": 2,
00:23:16.892 "num_base_bdevs_operational": 2,
00:23:16.892 "process": {
00:23:16.892 "type": "rebuild",
00:23:16.892 "target": "spare",
00:23:16.892 "progress": {
00:23:16.892 "blocks": 24576,
00:23:16.892 "percent": 37
00:23:16.892 }
00:23:16.892 },
00:23:16.892 "base_bdevs_list": [
00:23:16.892 {
00:23:16.892 "name": "spare",
00:23:16.892 "uuid": "4319b221-e58a-53d8-b045-04b940cdc98f",
00:23:16.892 "is_configured": true,
00:23:16.892 "data_offset": 0,
00:23:16.892 "data_size": 65536
00:23:16.892 },
00:23:16.892 {
00:23:16.892 "name": "BaseBdev2",
00:23:16.892 "uuid": "56c03b63-1f00-5976-bfba-5f15e9a09752",
00:23:16.892 "is_configured": true,
00:23:16.892 "data_offset": 0,
00:23:16.892 "data_size": 65536
00:23:16.892 }
00:23:16.892 ]
00:23:16.892 }'
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']'
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']'
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']'
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=793
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:16.892 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:23:17.151 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:23:17.151 "name": "raid_bdev1", "uuid": "6749b172-ac8f-4124-ab57-c054ba5f7ffd",
00:23:17.151 "strip_size_kb": 0,
00:23:17.151 "state": "online",
00:23:17.151 "raid_level": "raid1",
00:23:17.151 "superblock": false,
00:23:17.151 "num_base_bdevs": 2,
00:23:17.151 "num_base_bdevs_discovered": 2,
00:23:17.151 "num_base_bdevs_operational": 2,
00:23:17.151 "process": {
00:23:17.151 "type": "rebuild",
00:23:17.151 "target": "spare",
00:23:17.151 "progress": {
00:23:17.151 "blocks": 30720,
00:23:17.151 "percent": 46
00:23:17.151 }
00:23:17.151 },
00:23:17.151 "base_bdevs_list": [
00:23:17.151 {
00:23:17.151 "name": "spare",
00:23:17.151 "uuid": "4319b221-e58a-53d8-b045-04b940cdc98f",
00:23:17.151 "is_configured": true,
00:23:17.151 "data_offset": 0,
00:23:17.151 "data_size": 65536
00:23:17.151 },
00:23:17.151 {
00:23:17.151 "name": "BaseBdev2",
00:23:17.151 "uuid": "56c03b63-1f00-5976-bfba-5f15e9a09752",
00:23:17.151 "is_configured": true,
00:23:17.151 "data_offset": 0,
00:23:17.151 "data_size": 65536
00:23:17.151 }
00:23:17.151 ]
00:23:17.151 }'
00:23:17.151 00:18:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:23:17.151 00:18:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:23:17.151 00:18:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:23:17.151 00:18:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:23:17.151 00:18:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1
00:23:18.528 00:18:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:23:18.528 00:18:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:23:18.528 00:18:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:23:18.528 00:18:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:23:18.528 00:18:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare
00:23:18.528 00:18:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:23:18.528 00:18:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:18.528 00:18:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:23:18.528 00:18:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:23:18.528 "name": "raid_bdev1",
00:23:18.528 "uuid": "6749b172-ac8f-4124-ab57-c054ba5f7ffd",
00:23:18.528 "strip_size_kb": 0,
00:23:18.528 "state": "online",
00:23:18.528 "raid_level": "raid1",
00:23:18.528 "superblock": false,
00:23:18.528 "num_base_bdevs": 2,
00:23:18.528 "num_base_bdevs_discovered": 2,
00:23:18.528 "num_base_bdevs_operational": 2,
00:23:18.528 "process": {
00:23:18.528 "type": "rebuild",
00:23:18.528 "target": "spare",
00:23:18.528 "progress": {
00:23:18.528 "blocks": 59392,
00:23:18.528 "percent": 90
00:23:18.528 }
00:23:18.528 },
00:23:18.528 "base_bdevs_list": [
00:23:18.528 {
00:23:18.528 "name": "spare",
00:23:18.528 "uuid": "4319b221-e58a-53d8-b045-04b940cdc98f",
00:23:18.528 "is_configured": true,
00:23:18.528 "data_offset": 0,
00:23:18.528 "data_size": 65536
00:23:18.528 },
00:23:18.528 {
00:23:18.528 "name": "BaseBdev2",
00:23:18.528 "uuid": "56c03b63-1f00-5976-bfba-5f15e9a09752",
00:23:18.528 "is_configured": true,
00:23:18.528 "data_offset": 0,
00:23:18.528 "data_size": 65536
00:23:18.528 }
00:23:18.528 ]
00:23:18.528 }'
00:23:18.528 00:18:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:23:18.528 00:18:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:23:18.528 00:18:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:23:18.528 00:18:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:23:18.528 00:18:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1
00:23:18.788 [2024-07-16 00:18:05.609434] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1
00:23:18.788 [2024-07-16 00:18:05.609496] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1
00:23:18.788 [2024-07-16 00:18:05.609532] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:23:19.725 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:23:19.725 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:23:19.725 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:23:19.725 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:23:19.725 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare
00:23:19.725 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:23:19.725 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:19.725 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:23:19.985 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:23:19.985 "name": "raid_bdev1",
00:23:19.985 "uuid": "6749b172-ac8f-4124-ab57-c054ba5f7ffd",
00:23:19.985 "strip_size_kb": 0,
00:23:19.985 "state": "online",
00:23:19.985 "raid_level": "raid1",
00:23:19.985 "superblock": false,
00:23:19.985 "num_base_bdevs": 2,
00:23:19.985 "num_base_bdevs_discovered": 2,
00:23:19.985 "num_base_bdevs_operational": 2,
00:23:19.985 "base_bdevs_list": [
00:23:19.985 {
00:23:19.985 "name": "spare",
00:23:19.985 "uuid": "4319b221-e58a-53d8-b045-04b940cdc98f",
00:23:19.985 "is_configured": true,
00:23:19.985 "data_offset": 0,
00:23:19.985 "data_size": 65536
00:23:19.985 },
00:23:19.985 {
00:23:19.985 "name": "BaseBdev2",
00:23:19.985 "uuid": "56c03b63-1f00-5976-bfba-5f15e9a09752",
00:23:19.985 "is_configured": true,
00:23:19.985 "data_offset": 0,
00:23:19.985 "data_size": 65536
00:23:19.985 }
00:23:19.985 ]
00:23:19.985 }'
00:23:19.985 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:23:19.985 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]]
00:23:19.985 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:23:19.985 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]]
00:23:19.985 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break
00:23:19.985 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none
00:23:19.985 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:23:19.985 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:23:19.985 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none
00:23:19.985 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:23:19.985 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:19.985 00:18:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:23:20.245 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:23:20.245 "name": "raid_bdev1",
00:23:20.245 "uuid": "6749b172-ac8f-4124-ab57-c054ba5f7ffd",
00:23:20.245 "strip_size_kb": 0,
00:23:20.245 "state": "online",
00:23:20.245 "raid_level": "raid1",
00:23:20.245 "superblock": false,
00:23:20.245 "num_base_bdevs": 2,
00:23:20.245 "num_base_bdevs_discovered": 2,
00:23:20.245 "num_base_bdevs_operational": 2,
00:23:20.245 "base_bdevs_list": [
00:23:20.245 {
00:23:20.245 "name": "spare",
00:23:20.245 "uuid": "4319b221-e58a-53d8-b045-04b940cdc98f",
00:23:20.245 "is_configured": true,
00:23:20.245 "data_offset": 0,
00:23:20.245 "data_size": 65536
00:23:20.245 },
00:23:20.245 {
00:23:20.245 "name": "BaseBdev2",
00:23:20.245 "uuid": "56c03b63-1f00-5976-bfba-5f15e9a09752",
00:23:20.245 "is_configured": true,
00:23:20.245 "data_offset": 0,
00:23:20.245 "data_size": 65536
00:23:20.245 }
00:23:20.245 ]
00:23:20.245 }'
00:23:20.245 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:23:20.245 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:23:20.245 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:23:20.245 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:23:20.245 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2
00:23:20.245 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:23:20.245 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:23:20.245 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:23:20.245 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:23:20.245 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:23:20.245 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:23:20.245 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:23:20.245 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:23:20.245 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:23:20.245 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:23:20.245 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:20.505 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:23:20.505 "name": "raid_bdev1",
00:23:20.505 "uuid": "6749b172-ac8f-4124-ab57-c054ba5f7ffd",
00:23:20.505 "strip_size_kb": 0,
00:23:20.505 "state": "online",
00:23:20.505 "raid_level": "raid1",
00:23:20.505 "superblock": false,
00:23:20.505 "num_base_bdevs": 2,
00:23:20.505 "num_base_bdevs_discovered": 2,
00:23:20.505 "num_base_bdevs_operational": 2,
00:23:20.505 "base_bdevs_list": [
00:23:20.505 {
00:23:20.505 "name": "spare",
00:23:20.505 "uuid": "4319b221-e58a-53d8-b045-04b940cdc98f",
00:23:20.505 "is_configured": true,
00:23:20.505 "data_offset": 0,
00:23:20.505 "data_size": 65536
00:23:20.505 },
00:23:20.505 {
00:23:20.505 "name": "BaseBdev2",
00:23:20.505 "uuid": "56c03b63-1f00-5976-bfba-5f15e9a09752",
00:23:20.505 "is_configured": true,
00:23:20.505 "data_offset": 0,
00:23:20.505 "data_size": 65536
00:23:20.505 }
00:23:20.505 ]
00:23:20.505 }'
00:23:20.505 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:23:20.505 00:18:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x
00:23:21.074 00:18:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:23:21.334 [2024-07-16 00:18:08.137184] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:23:21.334 [2024-07-16 00:18:08.137216] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:23:21.334 [2024-07-16 00:18:08.137273] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:23:21.334 [2024-07-16 00:18:08.137329] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:23:21.334 [2024-07-16 00:18:08.137341] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1396070 name raid_bdev1, state offline
00:23:21.334 00:18:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:21.334 00:18:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length
00:23:21.594 00:18:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]]
00:23:21.594 00:18:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']'
00:23:21.594 00:18:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']'
00:23:21.594 00:18:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1'
00:23:21.594 00:18:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:23:21.594 00:18:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare')
00:23:21.594 00:18:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list
00:23:21.594 00:18:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:23:21.594 00:18:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list
00:23:21.594 00:18:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i
00:23:21.594 00:18:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:23:21.594 00:18:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:23:21.594 00:18:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0
00:23:21.854 /dev/nbd0
00:23:21.854 00:18:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:23:21.854 00:18:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:23:21.854 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:23:21.854 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i
00:23:21.854 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:23:21.854 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:23:21.854 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:23:21.854 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break
00:23:21.854 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:23:21.854 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:23:21.854 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:23:21.854 1+0 records in
00:23:21.854 1+0 records out
00:23:21.854 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000249034 s, 16.4 MB/s
00:23:21.854 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:23:21.854 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096
00:23:21.854 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:23:21.854 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:23:21.854 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0
00:23:21.854 00:18:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:23:21.854 00:18:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:23:21.854 00:18:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1
00:23:22.114 /dev/nbd1
00:23:22.114 00:18:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:23:22.114 00:18:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:23:22.114 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:23:22.114 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i
00:23:22.114 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:23:22.114 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:23:22.114 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:23:22.114 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break
00:23:22.114 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:23:22.114 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:23:22.114 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:22.114 1+0 records in 00:23:22.114 1+0 records out 00:23:22.114 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000343259 s, 11.9 MB/s 00:23:22.114 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:22.114 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:23:22.114 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:22.114 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:22.114 00:18:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:23:22.114 00:18:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:22.114 00:18:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:22.114 00:18:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:22.374 00:18:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:22.375 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:22.375 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:22.375 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:22.375 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:23:22.375 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:22.375 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:22.634 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:22.634 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:22.634 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:22.634 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:22.634 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:22.634 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:22.634 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:22.634 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:22.634 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:22.634 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:22.908 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:22.908 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:22.908 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:22.908 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:22.908 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:22.908 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:22.908 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:22.908 00:18:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:22.908 00:18:09 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:23:22.908 00:18:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 3597675 00:23:22.908 00:18:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 3597675 ']' 00:23:22.908 00:18:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 3597675 00:23:22.908 00:18:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:23:22.908 00:18:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:22.908 00:18:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3597675 00:23:22.908 00:18:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:22.908 00:18:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:22.908 00:18:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3597675' 00:23:22.908 killing process with pid 3597675 00:23:22.908 00:18:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 3597675 00:23:22.908 Received shutdown signal, test time was about 60.000000 seconds 00:23:22.908 00:23:22.908 Latency(us) 00:23:22.908 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:22.908 =================================================================================================================== 00:23:22.908 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:22.908 [2024-07-16 00:18:09.670107] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:22.908 00:18:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 3597675 00:23:22.908 [2024-07-16 00:18:09.696591] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@784 -- # return 0 00:23:23.167 00:23:23.167 real 0m21.648s 00:23:23.167 user 0m28.538s 00:23:23.167 sys 0m5.001s 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:23.167 ************************************ 00:23:23.167 END TEST raid_rebuild_test 00:23:23.167 ************************************ 00:23:23.167 00:18:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:23.167 00:18:09 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:23:23.167 00:18:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:23.167 00:18:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:23.167 00:18:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:23.167 ************************************ 00:23:23.167 START TEST raid_rebuild_test_sb 00:23:23.167 ************************************ 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:23.167 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:23.168 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:23.168 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:23.168 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:23.168 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:23.168 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:23.168 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:23.168 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:23:23.168 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:23:23.168 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=3601214 00:23:23.168 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 3601214 /var/tmp/spdk-raid.sock 00:23:23.168 00:18:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:23.168 00:18:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3601214 ']' 00:23:23.168 00:18:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:23.168 00:18:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:23.168 00:18:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:23.168 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:23.168 00:18:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:23.168 00:18:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:23.168 [2024-07-16 00:18:10.056373] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:23:23.168 [2024-07-16 00:18:10.056441] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3601214 ] 00:23:23.168 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:23.168 Zero copy mechanism will not be used. 
00:23:23.429 [2024-07-16 00:18:10.186580] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:23.429 [2024-07-16 00:18:10.289123] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:23.429 [2024-07-16 00:18:10.344110] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:23.429 [2024-07-16 00:18:10.344135] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:23.998 00:18:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:23.998 00:18:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:23:23.998 00:18:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:23.998 00:18:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:24.258 BaseBdev1_malloc 00:23:24.258 00:18:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:24.517 [2024-07-16 00:18:11.418902] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:24.517 [2024-07-16 00:18:11.418956] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:24.517 [2024-07-16 00:18:11.418981] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1731d40 00:23:24.517 [2024-07-16 00:18:11.418995] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:24.517 [2024-07-16 00:18:11.420726] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:24.517 [2024-07-16 00:18:11.420754] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:24.517 BaseBdev1 
00:23:24.517 00:18:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:24.517 00:18:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:24.775 BaseBdev2_malloc 00:23:24.775 00:18:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:25.033 [2024-07-16 00:18:11.913072] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:25.033 [2024-07-16 00:18:11.913120] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:25.033 [2024-07-16 00:18:11.913145] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1732860 00:23:25.033 [2024-07-16 00:18:11.913159] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:25.033 [2024-07-16 00:18:11.914680] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:25.033 [2024-07-16 00:18:11.914708] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:25.033 BaseBdev2 00:23:25.033 00:18:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:25.291 spare_malloc 00:23:25.291 00:18:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:25.549 spare_delay 00:23:25.549 00:18:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:25.807 [2024-07-16 00:18:12.651623] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:25.807 [2024-07-16 00:18:12.651671] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:25.807 [2024-07-16 00:18:12.651694] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18e0ec0 00:23:25.807 [2024-07-16 00:18:12.651706] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:25.807 [2024-07-16 00:18:12.653346] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:25.807 [2024-07-16 00:18:12.653373] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:25.807 spare 00:23:25.807 00:18:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:26.065 [2024-07-16 00:18:12.832124] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:26.065 [2024-07-16 00:18:12.833296] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:26.065 [2024-07-16 00:18:12.833457] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18e2070 00:23:26.065 [2024-07-16 00:18:12.833469] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:26.065 [2024-07-16 00:18:12.833648] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18db490 00:23:26.065 [2024-07-16 00:18:12.833781] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18e2070 00:23:26.065 [2024-07-16 00:18:12.833791] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x18e2070 00:23:26.065 [2024-07-16 00:18:12.833888] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:26.065 00:18:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:26.065 00:18:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:26.065 00:18:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:26.065 00:18:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:26.065 00:18:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:26.065 00:18:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:26.065 00:18:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:26.065 00:18:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:26.065 00:18:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:26.065 00:18:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:26.065 00:18:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:26.065 00:18:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:26.355 00:18:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:26.355 "name": "raid_bdev1", 00:23:26.355 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:26.355 "strip_size_kb": 0, 00:23:26.355 "state": "online", 00:23:26.355 "raid_level": "raid1", 00:23:26.355 "superblock": true, 00:23:26.355 "num_base_bdevs": 2, 00:23:26.355 "num_base_bdevs_discovered": 2, 00:23:26.355 
"num_base_bdevs_operational": 2, 00:23:26.355 "base_bdevs_list": [ 00:23:26.355 { 00:23:26.355 "name": "BaseBdev1", 00:23:26.355 "uuid": "86d807af-6173-585f-b534-0dfa09660e80", 00:23:26.355 "is_configured": true, 00:23:26.355 "data_offset": 2048, 00:23:26.355 "data_size": 63488 00:23:26.355 }, 00:23:26.355 { 00:23:26.355 "name": "BaseBdev2", 00:23:26.355 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:26.355 "is_configured": true, 00:23:26.355 "data_offset": 2048, 00:23:26.355 "data_size": 63488 00:23:26.355 } 00:23:26.355 ] 00:23:26.355 }' 00:23:26.355 00:18:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:26.355 00:18:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:26.921 00:18:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:26.921 00:18:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:26.922 [2024-07-16 00:18:13.847040] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:27.180 00:18:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:23:27.180 00:18:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.180 00:18:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:27.180 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:23:27.180 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:27.180 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:23:27.180 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local 
write_unit_size 00:23:27.180 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:27.180 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:27.180 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:27.180 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:27.180 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:27.180 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:27.180 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:23:27.180 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:27.180 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:27.180 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:27.438 [2024-07-16 00:18:14.215803] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18db490 00:23:27.438 /dev/nbd0 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:27.438 1+0 records in 00:23:27.438 1+0 records out 00:23:27.438 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00022327 s, 18.3 MB/s 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:23:27.438 00:18:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:23:34.007 63488+0 records in 00:23:34.007 63488+0 records out 00:23:34.007 32505856 bytes (33 MB, 
31 MiB) copied, 6.18217 s, 5.3 MB/s 00:23:34.007 00:18:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:34.007 00:18:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:34.007 00:18:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:34.007 00:18:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:34.007 00:18:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:23:34.007 00:18:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:34.007 00:18:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:34.007 [2024-07-16 00:18:20.729567] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:34.007 00:18:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:34.007 00:18:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:34.007 00:18:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:34.007 00:18:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:34.007 00:18:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:34.007 00:18:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:34.007 00:18:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:34.007 00:18:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:34.007 00:18:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev1 00:23:34.265 [2024-07-16 00:18:20.974264] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:34.265 00:18:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:34.265 00:18:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:34.265 00:18:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:34.265 00:18:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:34.265 00:18:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:34.265 00:18:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:34.266 00:18:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:34.266 00:18:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:34.266 00:18:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:34.266 00:18:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:34.266 00:18:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.266 00:18:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:34.525 00:18:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:34.525 "name": "raid_bdev1", 00:23:34.525 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:34.525 "strip_size_kb": 0, 00:23:34.525 "state": "online", 00:23:34.525 "raid_level": "raid1", 00:23:34.525 "superblock": true, 00:23:34.525 "num_base_bdevs": 2, 00:23:34.525 "num_base_bdevs_discovered": 1, 00:23:34.525 
"num_base_bdevs_operational": 1, 00:23:34.525 "base_bdevs_list": [ 00:23:34.525 { 00:23:34.525 "name": null, 00:23:34.525 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:34.525 "is_configured": false, 00:23:34.525 "data_offset": 2048, 00:23:34.525 "data_size": 63488 00:23:34.525 }, 00:23:34.525 { 00:23:34.525 "name": "BaseBdev2", 00:23:34.525 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:34.525 "is_configured": true, 00:23:34.525 "data_offset": 2048, 00:23:34.525 "data_size": 63488 00:23:34.525 } 00:23:34.525 ] 00:23:34.525 }' 00:23:34.525 00:18:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:34.525 00:18:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:35.092 00:18:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:35.352 [2024-07-16 00:18:22.117300] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:35.352 [2024-07-16 00:18:22.122273] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18e1ce0 00:23:35.352 [2024-07-16 00:18:22.124495] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:35.352 00:18:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:36.288 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:36.289 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:36.289 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:36.289 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:36.289 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:23:36.289 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.289 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.546 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:36.546 "name": "raid_bdev1", 00:23:36.546 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:36.546 "strip_size_kb": 0, 00:23:36.546 "state": "online", 00:23:36.546 "raid_level": "raid1", 00:23:36.546 "superblock": true, 00:23:36.546 "num_base_bdevs": 2, 00:23:36.546 "num_base_bdevs_discovered": 2, 00:23:36.546 "num_base_bdevs_operational": 2, 00:23:36.546 "process": { 00:23:36.546 "type": "rebuild", 00:23:36.546 "target": "spare", 00:23:36.546 "progress": { 00:23:36.546 "blocks": 24576, 00:23:36.546 "percent": 38 00:23:36.546 } 00:23:36.546 }, 00:23:36.546 "base_bdevs_list": [ 00:23:36.546 { 00:23:36.546 "name": "spare", 00:23:36.546 "uuid": "9570b63f-51fa-5d19-9f37-53252af3193a", 00:23:36.546 "is_configured": true, 00:23:36.546 "data_offset": 2048, 00:23:36.546 "data_size": 63488 00:23:36.546 }, 00:23:36.546 { 00:23:36.546 "name": "BaseBdev2", 00:23:36.546 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:36.546 "is_configured": true, 00:23:36.546 "data_offset": 2048, 00:23:36.546 "data_size": 63488 00:23:36.546 } 00:23:36.546 ] 00:23:36.546 }' 00:23:36.546 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:36.546 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:36.546 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:36.546 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:36.546 00:18:23 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:36.804 [2024-07-16 00:18:23.710888] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:36.804 [2024-07-16 00:18:23.737193] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:36.804 [2024-07-16 00:18:23.737239] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:36.804 [2024-07-16 00:18:23.737256] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:36.804 [2024-07-16 00:18:23.737268] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:37.062 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:37.062 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:37.063 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:37.063 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:37.063 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:37.063 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:37.063 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:37.063 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:37.063 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:37.063 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:37.063 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:37.063 00:18:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:37.321 00:18:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:37.321 "name": "raid_bdev1", 00:23:37.321 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:37.321 "strip_size_kb": 0, 00:23:37.321 "state": "online", 00:23:37.321 "raid_level": "raid1", 00:23:37.321 "superblock": true, 00:23:37.321 "num_base_bdevs": 2, 00:23:37.321 "num_base_bdevs_discovered": 1, 00:23:37.321 "num_base_bdevs_operational": 1, 00:23:37.321 "base_bdevs_list": [ 00:23:37.321 { 00:23:37.321 "name": null, 00:23:37.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:37.321 "is_configured": false, 00:23:37.321 "data_offset": 2048, 00:23:37.321 "data_size": 63488 00:23:37.321 }, 00:23:37.321 { 00:23:37.321 "name": "BaseBdev2", 00:23:37.321 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:37.321 "is_configured": true, 00:23:37.321 "data_offset": 2048, 00:23:37.321 "data_size": 63488 00:23:37.321 } 00:23:37.321 ] 00:23:37.321 }' 00:23:37.321 00:18:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:37.321 00:18:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:37.889 00:18:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:37.889 00:18:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:37.889 00:18:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:37.889 00:18:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:37.889 00:18:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:37.889 00:18:24 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:37.889 00:18:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:38.149 00:18:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:38.149 "name": "raid_bdev1", 00:23:38.149 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:38.149 "strip_size_kb": 0, 00:23:38.149 "state": "online", 00:23:38.149 "raid_level": "raid1", 00:23:38.149 "superblock": true, 00:23:38.149 "num_base_bdevs": 2, 00:23:38.149 "num_base_bdevs_discovered": 1, 00:23:38.149 "num_base_bdevs_operational": 1, 00:23:38.149 "base_bdevs_list": [ 00:23:38.149 { 00:23:38.149 "name": null, 00:23:38.149 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:38.149 "is_configured": false, 00:23:38.149 "data_offset": 2048, 00:23:38.149 "data_size": 63488 00:23:38.149 }, 00:23:38.149 { 00:23:38.149 "name": "BaseBdev2", 00:23:38.149 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:38.149 "is_configured": true, 00:23:38.149 "data_offset": 2048, 00:23:38.149 "data_size": 63488 00:23:38.149 } 00:23:38.149 ] 00:23:38.149 }' 00:23:38.149 00:18:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:38.149 00:18:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:38.149 00:18:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:38.149 00:18:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:38.149 00:18:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:38.409 [2024-07-16 00:18:25.213545] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:38.409 [2024-07-16 00:18:25.218477] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18e1ce0 00:23:38.409 [2024-07-16 00:18:25.219936] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:38.409 00:18:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:39.348 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:39.348 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:39.348 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:39.348 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:39.348 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:39.348 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.348 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.606 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:39.606 "name": "raid_bdev1", 00:23:39.606 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:39.606 "strip_size_kb": 0, 00:23:39.606 "state": "online", 00:23:39.606 "raid_level": "raid1", 00:23:39.606 "superblock": true, 00:23:39.606 "num_base_bdevs": 2, 00:23:39.606 "num_base_bdevs_discovered": 2, 00:23:39.606 "num_base_bdevs_operational": 2, 00:23:39.606 "process": { 00:23:39.606 "type": "rebuild", 00:23:39.606 "target": "spare", 00:23:39.606 "progress": { 00:23:39.606 "blocks": 24576, 00:23:39.606 "percent": 38 00:23:39.606 } 00:23:39.606 }, 00:23:39.606 
"base_bdevs_list": [ 00:23:39.606 { 00:23:39.606 "name": "spare", 00:23:39.606 "uuid": "9570b63f-51fa-5d19-9f37-53252af3193a", 00:23:39.606 "is_configured": true, 00:23:39.606 "data_offset": 2048, 00:23:39.606 "data_size": 63488 00:23:39.606 }, 00:23:39.606 { 00:23:39.606 "name": "BaseBdev2", 00:23:39.606 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:39.606 "is_configured": true, 00:23:39.606 "data_offset": 2048, 00:23:39.606 "data_size": 63488 00:23:39.606 } 00:23:39.606 ] 00:23:39.606 }' 00:23:39.606 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:39.606 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:39.606 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:39.866 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:39.866 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:39.866 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:39.866 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:39.866 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:39.866 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:39.866 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:39.866 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=816 00:23:39.866 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:39.866 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:39.866 00:18:26 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:39.866 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:39.866 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:39.866 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:39.866 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.866 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:40.125 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:40.125 "name": "raid_bdev1", 00:23:40.125 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:40.125 "strip_size_kb": 0, 00:23:40.125 "state": "online", 00:23:40.125 "raid_level": "raid1", 00:23:40.125 "superblock": true, 00:23:40.125 "num_base_bdevs": 2, 00:23:40.125 "num_base_bdevs_discovered": 2, 00:23:40.125 "num_base_bdevs_operational": 2, 00:23:40.125 "process": { 00:23:40.125 "type": "rebuild", 00:23:40.125 "target": "spare", 00:23:40.125 "progress": { 00:23:40.125 "blocks": 30720, 00:23:40.125 "percent": 48 00:23:40.125 } 00:23:40.125 }, 00:23:40.125 "base_bdevs_list": [ 00:23:40.125 { 00:23:40.125 "name": "spare", 00:23:40.125 "uuid": "9570b63f-51fa-5d19-9f37-53252af3193a", 00:23:40.125 "is_configured": true, 00:23:40.125 "data_offset": 2048, 00:23:40.125 "data_size": 63488 00:23:40.125 }, 00:23:40.125 { 00:23:40.126 "name": "BaseBdev2", 00:23:40.126 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:40.126 "is_configured": true, 00:23:40.126 "data_offset": 2048, 00:23:40.126 "data_size": 63488 00:23:40.126 } 00:23:40.126 ] 00:23:40.126 }' 00:23:40.126 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:23:40.126 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:40.126 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:40.126 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:40.126 00:18:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:41.064 00:18:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:41.064 00:18:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:41.064 00:18:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:41.064 00:18:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:41.064 00:18:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:41.064 00:18:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:41.064 00:18:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:41.064 00:18:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:41.323 00:18:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:41.323 "name": "raid_bdev1", 00:23:41.323 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:41.323 "strip_size_kb": 0, 00:23:41.323 "state": "online", 00:23:41.323 "raid_level": "raid1", 00:23:41.323 "superblock": true, 00:23:41.323 "num_base_bdevs": 2, 00:23:41.323 "num_base_bdevs_discovered": 2, 00:23:41.323 "num_base_bdevs_operational": 2, 00:23:41.323 "process": { 00:23:41.323 "type": "rebuild", 00:23:41.323 "target": "spare", 
00:23:41.323 "progress": { 00:23:41.323 "blocks": 59392, 00:23:41.323 "percent": 93 00:23:41.323 } 00:23:41.323 }, 00:23:41.323 "base_bdevs_list": [ 00:23:41.323 { 00:23:41.323 "name": "spare", 00:23:41.323 "uuid": "9570b63f-51fa-5d19-9f37-53252af3193a", 00:23:41.323 "is_configured": true, 00:23:41.323 "data_offset": 2048, 00:23:41.323 "data_size": 63488 00:23:41.323 }, 00:23:41.323 { 00:23:41.323 "name": "BaseBdev2", 00:23:41.323 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:41.323 "is_configured": true, 00:23:41.323 "data_offset": 2048, 00:23:41.323 "data_size": 63488 00:23:41.323 } 00:23:41.323 ] 00:23:41.323 }' 00:23:41.323 00:18:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:41.323 00:18:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:41.323 00:18:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:41.581 00:18:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:41.581 00:18:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:41.581 [2024-07-16 00:18:28.344401] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:41.581 [2024-07-16 00:18:28.344458] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:41.581 [2024-07-16 00:18:28.344548] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:42.555 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:42.555 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:42.555 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:42.555 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # 
local process_type=rebuild 00:23:42.555 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:42.555 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:42.555 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:42.555 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:42.814 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:42.814 "name": "raid_bdev1", 00:23:42.814 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:42.814 "strip_size_kb": 0, 00:23:42.814 "state": "online", 00:23:42.814 "raid_level": "raid1", 00:23:42.814 "superblock": true, 00:23:42.814 "num_base_bdevs": 2, 00:23:42.814 "num_base_bdevs_discovered": 2, 00:23:42.814 "num_base_bdevs_operational": 2, 00:23:42.814 "base_bdevs_list": [ 00:23:42.814 { 00:23:42.814 "name": "spare", 00:23:42.814 "uuid": "9570b63f-51fa-5d19-9f37-53252af3193a", 00:23:42.814 "is_configured": true, 00:23:42.814 "data_offset": 2048, 00:23:42.814 "data_size": 63488 00:23:42.814 }, 00:23:42.814 { 00:23:42.814 "name": "BaseBdev2", 00:23:42.814 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:42.814 "is_configured": true, 00:23:42.814 "data_offset": 2048, 00:23:42.814 "data_size": 63488 00:23:42.814 } 00:23:42.814 ] 00:23:42.814 }' 00:23:42.814 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:42.814 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:42.814 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:42.814 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:42.814 
00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:23:42.814 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:42.814 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:42.814 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:42.814 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:42.814 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:42.814 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:42.814 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:43.076 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:43.076 "name": "raid_bdev1", 00:23:43.076 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:43.076 "strip_size_kb": 0, 00:23:43.076 "state": "online", 00:23:43.076 "raid_level": "raid1", 00:23:43.076 "superblock": true, 00:23:43.076 "num_base_bdevs": 2, 00:23:43.076 "num_base_bdevs_discovered": 2, 00:23:43.076 "num_base_bdevs_operational": 2, 00:23:43.076 "base_bdevs_list": [ 00:23:43.076 { 00:23:43.076 "name": "spare", 00:23:43.076 "uuid": "9570b63f-51fa-5d19-9f37-53252af3193a", 00:23:43.076 "is_configured": true, 00:23:43.076 "data_offset": 2048, 00:23:43.076 "data_size": 63488 00:23:43.076 }, 00:23:43.076 { 00:23:43.076 "name": "BaseBdev2", 00:23:43.076 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:43.076 "is_configured": true, 00:23:43.076 "data_offset": 2048, 00:23:43.076 "data_size": 63488 00:23:43.076 } 00:23:43.076 ] 00:23:43.076 }' 00:23:43.076 00:18:29 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:43.076 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:43.076 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:43.076 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:43.076 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:43.076 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:43.076 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:43.076 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:43.076 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:43.076 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:43.076 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:43.076 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:43.076 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:43.076 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:43.076 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:43.076 00:18:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:43.335 00:18:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:43.335 "name": "raid_bdev1", 00:23:43.335 "uuid": 
"1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:43.335 "strip_size_kb": 0, 00:23:43.335 "state": "online", 00:23:43.335 "raid_level": "raid1", 00:23:43.335 "superblock": true, 00:23:43.335 "num_base_bdevs": 2, 00:23:43.335 "num_base_bdevs_discovered": 2, 00:23:43.335 "num_base_bdevs_operational": 2, 00:23:43.335 "base_bdevs_list": [ 00:23:43.335 { 00:23:43.335 "name": "spare", 00:23:43.335 "uuid": "9570b63f-51fa-5d19-9f37-53252af3193a", 00:23:43.335 "is_configured": true, 00:23:43.335 "data_offset": 2048, 00:23:43.335 "data_size": 63488 00:23:43.335 }, 00:23:43.335 { 00:23:43.335 "name": "BaseBdev2", 00:23:43.335 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:43.335 "is_configured": true, 00:23:43.335 "data_offset": 2048, 00:23:43.335 "data_size": 63488 00:23:43.335 } 00:23:43.335 ] 00:23:43.335 }' 00:23:43.335 00:18:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:43.335 00:18:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:43.901 00:18:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:44.160 [2024-07-16 00:18:31.056485] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:44.160 [2024-07-16 00:18:31.056510] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:44.160 [2024-07-16 00:18:31.056566] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:44.160 [2024-07-16 00:18:31.056618] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:44.160 [2024-07-16 00:18:31.056630] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18e2070 name raid_bdev1, state offline 00:23:44.160 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:44.160 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:23:44.418 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:44.418 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:44.418 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:23:44.418 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:23:44.418 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:44.418 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:44.418 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:44.418 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:44.418 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:44.418 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:23:44.418 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:44.418 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:44.418 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:44.677 /dev/nbd0 00:23:44.677 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:44.677 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:44.677 00:18:31 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:44.677 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:44.677 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:44.677 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:44.677 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:44.677 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:44.677 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:44.677 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:44.677 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:44.677 1+0 records in 00:23:44.677 1+0 records out 00:23:44.677 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256517 s, 16.0 MB/s 00:23:44.677 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:44.677 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:44.677 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:44.677 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:44.677 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:44.677 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:44.677 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:44.677 00:18:31 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:45.035 /dev/nbd1 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:45.035 1+0 records in 00:23:45.035 1+0 records out 00:23:45.035 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000337256 s, 12.1 MB/s 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:45.035 00:18:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:45.294 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:45.294 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:45.294 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:45.294 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:45.294 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:45.294 00:18:32 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:45.294 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:45.294 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:45.294 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:45.294 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:45.553 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:45.553 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:45.553 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:45.554 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:45.554 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:45.554 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:45.554 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:45.554 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:45.554 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:23:45.554 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:45.813 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:46.073 [2024-07-16 00:18:32.968366] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: 
Match on spare_delay 00:23:46.073 [2024-07-16 00:18:32.968414] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:46.073 [2024-07-16 00:18:32.968436] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18e1500 00:23:46.073 [2024-07-16 00:18:32.968449] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:46.073 [2024-07-16 00:18:32.970093] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:46.073 [2024-07-16 00:18:32.970122] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:46.073 [2024-07-16 00:18:32.970203] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:46.073 [2024-07-16 00:18:32.970229] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:46.073 [2024-07-16 00:18:32.970326] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:46.073 spare 00:23:46.073 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:46.073 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:46.073 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:46.073 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:46.073 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:46.073 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:46.073 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:46.073 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:46.073 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:23:46.073 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:46.073 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:46.073 00:18:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:46.333 [2024-07-16 00:18:33.070639] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18e0260 00:23:46.333 [2024-07-16 00:18:33.070655] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:46.333 [2024-07-16 00:18:33.070847] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18daf50 00:23:46.333 [2024-07-16 00:18:33.070998] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18e0260 00:23:46.333 [2024-07-16 00:18:33.071009] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18e0260 00:23:46.333 [2024-07-16 00:18:33.071110] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:46.333 00:18:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:46.333 "name": "raid_bdev1", 00:23:46.333 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:46.333 "strip_size_kb": 0, 00:23:46.333 "state": "online", 00:23:46.333 "raid_level": "raid1", 00:23:46.333 "superblock": true, 00:23:46.333 "num_base_bdevs": 2, 00:23:46.333 "num_base_bdevs_discovered": 2, 00:23:46.333 "num_base_bdevs_operational": 2, 00:23:46.333 "base_bdevs_list": [ 00:23:46.333 { 00:23:46.333 "name": "spare", 00:23:46.333 "uuid": "9570b63f-51fa-5d19-9f37-53252af3193a", 00:23:46.333 "is_configured": true, 00:23:46.333 "data_offset": 2048, 00:23:46.333 "data_size": 63488 00:23:46.333 }, 00:23:46.333 { 00:23:46.333 "name": "BaseBdev2", 00:23:46.333 "uuid": 
"2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:46.333 "is_configured": true, 00:23:46.333 "data_offset": 2048, 00:23:46.333 "data_size": 63488 00:23:46.333 } 00:23:46.333 ] 00:23:46.333 }' 00:23:46.333 00:18:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:46.333 00:18:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:47.269 00:18:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:47.269 00:18:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:47.269 00:18:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:47.269 00:18:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:47.269 00:18:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:47.269 00:18:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.269 00:18:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:47.269 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:47.269 "name": "raid_bdev1", 00:23:47.269 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:47.269 "strip_size_kb": 0, 00:23:47.269 "state": "online", 00:23:47.269 "raid_level": "raid1", 00:23:47.269 "superblock": true, 00:23:47.269 "num_base_bdevs": 2, 00:23:47.269 "num_base_bdevs_discovered": 2, 00:23:47.269 "num_base_bdevs_operational": 2, 00:23:47.269 "base_bdevs_list": [ 00:23:47.269 { 00:23:47.269 "name": "spare", 00:23:47.269 "uuid": "9570b63f-51fa-5d19-9f37-53252af3193a", 00:23:47.269 "is_configured": true, 00:23:47.269 "data_offset": 2048, 00:23:47.269 "data_size": 63488 00:23:47.269 }, 00:23:47.269 { 
00:23:47.269 "name": "BaseBdev2", 00:23:47.269 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:47.269 "is_configured": true, 00:23:47.269 "data_offset": 2048, 00:23:47.269 "data_size": 63488 00:23:47.269 } 00:23:47.269 ] 00:23:47.269 }' 00:23:47.269 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:47.269 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:47.269 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:47.270 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:47.270 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.270 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:47.529 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:23:47.529 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:47.788 [2024-07-16 00:18:34.685058] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:47.788 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:47.788 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:47.788 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:47.788 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:47.788 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:23:47.788 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:47.788 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:47.788 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:47.788 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:47.788 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:47.788 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.788 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:48.047 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:48.047 "name": "raid_bdev1", 00:23:48.047 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:48.047 "strip_size_kb": 0, 00:23:48.047 "state": "online", 00:23:48.047 "raid_level": "raid1", 00:23:48.047 "superblock": true, 00:23:48.047 "num_base_bdevs": 2, 00:23:48.047 "num_base_bdevs_discovered": 1, 00:23:48.047 "num_base_bdevs_operational": 1, 00:23:48.047 "base_bdevs_list": [ 00:23:48.047 { 00:23:48.047 "name": null, 00:23:48.047 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:48.047 "is_configured": false, 00:23:48.047 "data_offset": 2048, 00:23:48.047 "data_size": 63488 00:23:48.047 }, 00:23:48.047 { 00:23:48.047 "name": "BaseBdev2", 00:23:48.047 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:48.047 "is_configured": true, 00:23:48.047 "data_offset": 2048, 00:23:48.047 "data_size": 63488 00:23:48.047 } 00:23:48.047 ] 00:23:48.047 }' 00:23:48.047 00:18:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:48.047 00:18:34 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:23:48.615 00:18:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:48.874 [2024-07-16 00:18:35.767943] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:48.874 [2024-07-16 00:18:35.768087] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:48.874 [2024-07-16 00:18:35.768104] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:48.874 [2024-07-16 00:18:35.768132] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:48.874 [2024-07-16 00:18:35.772940] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18daf50 00:23:48.874 [2024-07-16 00:18:35.775251] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:48.874 00:18:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:23:50.253 00:18:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:50.253 00:18:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:50.253 00:18:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:50.253 00:18:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:50.253 00:18:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:50.253 00:18:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.253 00:18:36 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:50.253 00:18:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:50.253 "name": "raid_bdev1", 00:23:50.253 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:50.253 "strip_size_kb": 0, 00:23:50.253 "state": "online", 00:23:50.253 "raid_level": "raid1", 00:23:50.253 "superblock": true, 00:23:50.253 "num_base_bdevs": 2, 00:23:50.253 "num_base_bdevs_discovered": 2, 00:23:50.253 "num_base_bdevs_operational": 2, 00:23:50.253 "process": { 00:23:50.253 "type": "rebuild", 00:23:50.253 "target": "spare", 00:23:50.253 "progress": { 00:23:50.253 "blocks": 24576, 00:23:50.253 "percent": 38 00:23:50.253 } 00:23:50.253 }, 00:23:50.253 "base_bdevs_list": [ 00:23:50.253 { 00:23:50.253 "name": "spare", 00:23:50.253 "uuid": "9570b63f-51fa-5d19-9f37-53252af3193a", 00:23:50.253 "is_configured": true, 00:23:50.253 "data_offset": 2048, 00:23:50.253 "data_size": 63488 00:23:50.253 }, 00:23:50.253 { 00:23:50.253 "name": "BaseBdev2", 00:23:50.253 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:50.253 "is_configured": true, 00:23:50.253 "data_offset": 2048, 00:23:50.253 "data_size": 63488 00:23:50.253 } 00:23:50.253 ] 00:23:50.253 }' 00:23:50.253 00:18:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:50.253 00:18:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:50.253 00:18:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:50.253 00:18:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:50.253 00:18:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:50.512 [2024-07-16 00:18:37.349291] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: 
*DEBUG*: spare 00:23:50.512 [2024-07-16 00:18:37.387823] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:50.512 [2024-07-16 00:18:37.387869] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:50.512 [2024-07-16 00:18:37.387884] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:50.512 [2024-07-16 00:18:37.387893] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:50.512 00:18:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:50.512 00:18:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:50.512 00:18:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:50.512 00:18:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:50.512 00:18:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:50.512 00:18:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:50.512 00:18:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:50.512 00:18:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:50.512 00:18:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:50.512 00:18:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:50.512 00:18:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:50.512 00:18:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.771 00:18:37 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:50.771 "name": "raid_bdev1", 00:23:50.771 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:50.771 "strip_size_kb": 0, 00:23:50.771 "state": "online", 00:23:50.771 "raid_level": "raid1", 00:23:50.771 "superblock": true, 00:23:50.771 "num_base_bdevs": 2, 00:23:50.771 "num_base_bdevs_discovered": 1, 00:23:50.771 "num_base_bdevs_operational": 1, 00:23:50.771 "base_bdevs_list": [ 00:23:50.771 { 00:23:50.771 "name": null, 00:23:50.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:50.771 "is_configured": false, 00:23:50.771 "data_offset": 2048, 00:23:50.771 "data_size": 63488 00:23:50.771 }, 00:23:50.771 { 00:23:50.771 "name": "BaseBdev2", 00:23:50.771 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:50.771 "is_configured": true, 00:23:50.771 "data_offset": 2048, 00:23:50.771 "data_size": 63488 00:23:50.771 } 00:23:50.771 ] 00:23:50.771 }' 00:23:50.771 00:18:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:50.771 00:18:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:51.706 00:18:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:51.965 [2024-07-16 00:18:38.784273] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:51.965 [2024-07-16 00:18:38.784322] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:51.965 [2024-07-16 00:18:38.784345] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18e1730 00:23:51.965 [2024-07-16 00:18:38.784358] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:51.965 [2024-07-16 00:18:38.784722] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:51.965 [2024-07-16 
00:18:38.784740] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:51.965 [2024-07-16 00:18:38.784820] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:51.965 [2024-07-16 00:18:38.784832] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:51.965 [2024-07-16 00:18:38.784842] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:51.965 [2024-07-16 00:18:38.784860] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:51.965 [2024-07-16 00:18:38.789705] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18e2aa0 00:23:51.965 spare 00:23:51.965 [2024-07-16 00:18:38.791156] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:51.965 00:18:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:23:52.899 00:18:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:52.899 00:18:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:52.899 00:18:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:52.899 00:18:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:52.899 00:18:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:52.899 00:18:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:52.899 00:18:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:53.158 00:18:40 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:53.158 "name": "raid_bdev1", 00:23:53.158 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:53.158 "strip_size_kb": 0, 00:23:53.158 "state": "online", 00:23:53.158 "raid_level": "raid1", 00:23:53.158 "superblock": true, 00:23:53.158 "num_base_bdevs": 2, 00:23:53.158 "num_base_bdevs_discovered": 2, 00:23:53.158 "num_base_bdevs_operational": 2, 00:23:53.158 "process": { 00:23:53.158 "type": "rebuild", 00:23:53.158 "target": "spare", 00:23:53.158 "progress": { 00:23:53.158 "blocks": 24576, 00:23:53.158 "percent": 38 00:23:53.158 } 00:23:53.158 }, 00:23:53.158 "base_bdevs_list": [ 00:23:53.158 { 00:23:53.158 "name": "spare", 00:23:53.158 "uuid": "9570b63f-51fa-5d19-9f37-53252af3193a", 00:23:53.158 "is_configured": true, 00:23:53.158 "data_offset": 2048, 00:23:53.158 "data_size": 63488 00:23:53.158 }, 00:23:53.158 { 00:23:53.158 "name": "BaseBdev2", 00:23:53.158 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:53.158 "is_configured": true, 00:23:53.158 "data_offset": 2048, 00:23:53.158 "data_size": 63488 00:23:53.158 } 00:23:53.158 ] 00:23:53.158 }' 00:23:53.158 00:18:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:53.416 00:18:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:53.416 00:18:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:53.416 00:18:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:53.416 00:18:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:53.674 [2024-07-16 00:18:40.394150] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:53.674 [2024-07-16 00:18:40.403533] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild 
on raid bdev raid_bdev1: No such device 00:23:53.674 [2024-07-16 00:18:40.403581] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:53.674 [2024-07-16 00:18:40.403597] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:53.674 [2024-07-16 00:18:40.403606] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:53.674 00:18:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:53.674 00:18:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:53.674 00:18:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:53.674 00:18:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:53.674 00:18:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:53.674 00:18:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:53.674 00:18:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:53.674 00:18:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:53.674 00:18:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:53.674 00:18:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:53.674 00:18:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.674 00:18:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:54.241 00:18:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:54.241 "name": "raid_bdev1", 00:23:54.241 "uuid": 
"1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:54.241 "strip_size_kb": 0, 00:23:54.241 "state": "online", 00:23:54.241 "raid_level": "raid1", 00:23:54.241 "superblock": true, 00:23:54.241 "num_base_bdevs": 2, 00:23:54.241 "num_base_bdevs_discovered": 1, 00:23:54.241 "num_base_bdevs_operational": 1, 00:23:54.241 "base_bdevs_list": [ 00:23:54.241 { 00:23:54.241 "name": null, 00:23:54.241 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:54.241 "is_configured": false, 00:23:54.241 "data_offset": 2048, 00:23:54.241 "data_size": 63488 00:23:54.241 }, 00:23:54.241 { 00:23:54.241 "name": "BaseBdev2", 00:23:54.241 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:54.241 "is_configured": true, 00:23:54.241 "data_offset": 2048, 00:23:54.241 "data_size": 63488 00:23:54.241 } 00:23:54.241 ] 00:23:54.241 }' 00:23:54.241 00:18:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:54.241 00:18:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:54.807 00:18:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:54.807 00:18:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:54.807 00:18:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:54.807 00:18:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:54.807 00:18:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:54.807 00:18:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:54.807 00:18:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:55.065 00:18:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:23:55.065 "name": "raid_bdev1", 00:23:55.065 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:55.065 "strip_size_kb": 0, 00:23:55.065 "state": "online", 00:23:55.065 "raid_level": "raid1", 00:23:55.065 "superblock": true, 00:23:55.065 "num_base_bdevs": 2, 00:23:55.065 "num_base_bdevs_discovered": 1, 00:23:55.065 "num_base_bdevs_operational": 1, 00:23:55.065 "base_bdevs_list": [ 00:23:55.065 { 00:23:55.065 "name": null, 00:23:55.065 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:55.065 "is_configured": false, 00:23:55.065 "data_offset": 2048, 00:23:55.065 "data_size": 63488 00:23:55.065 }, 00:23:55.065 { 00:23:55.065 "name": "BaseBdev2", 00:23:55.065 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:55.065 "is_configured": true, 00:23:55.065 "data_offset": 2048, 00:23:55.065 "data_size": 63488 00:23:55.065 } 00:23:55.065 ] 00:23:55.065 }' 00:23:55.065 00:18:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:55.065 00:18:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:55.065 00:18:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:55.065 00:18:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:55.065 00:18:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:55.324 00:18:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:55.582 [2024-07-16 00:18:42.289188] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:55.582 [2024-07-16 00:18:42.289234] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:55.582 
[2024-07-16 00:18:42.289256] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18dc650 00:23:55.582 [2024-07-16 00:18:42.289269] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:55.582 [2024-07-16 00:18:42.289606] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:55.582 [2024-07-16 00:18:42.289624] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:55.582 [2024-07-16 00:18:42.289685] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:55.582 [2024-07-16 00:18:42.289696] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:55.582 [2024-07-16 00:18:42.289706] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:55.582 BaseBdev1 00:23:55.582 00:18:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:23:56.527 00:18:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:56.527 00:18:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:56.527 00:18:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:56.527 00:18:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:56.527 00:18:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:56.527 00:18:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:56.527 00:18:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:56.527 00:18:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:56.527 00:18:43 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:56.527 00:18:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:56.527 00:18:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.527 00:18:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:56.787 00:18:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:56.787 "name": "raid_bdev1", 00:23:56.787 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:56.787 "strip_size_kb": 0, 00:23:56.787 "state": "online", 00:23:56.787 "raid_level": "raid1", 00:23:56.787 "superblock": true, 00:23:56.787 "num_base_bdevs": 2, 00:23:56.787 "num_base_bdevs_discovered": 1, 00:23:56.787 "num_base_bdevs_operational": 1, 00:23:56.787 "base_bdevs_list": [ 00:23:56.787 { 00:23:56.787 "name": null, 00:23:56.787 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:56.787 "is_configured": false, 00:23:56.787 "data_offset": 2048, 00:23:56.787 "data_size": 63488 00:23:56.787 }, 00:23:56.787 { 00:23:56.787 "name": "BaseBdev2", 00:23:56.787 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:56.787 "is_configured": true, 00:23:56.787 "data_offset": 2048, 00:23:56.787 "data_size": 63488 00:23:56.787 } 00:23:56.787 ] 00:23:56.787 }' 00:23:56.787 00:18:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:56.787 00:18:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:57.352 00:18:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:57.352 00:18:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:57.352 00:18:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 
00:23:57.352 00:18:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:57.352 00:18:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:57.352 00:18:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:57.352 00:18:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:57.919 00:18:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:57.919 "name": "raid_bdev1", 00:23:57.919 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:57.919 "strip_size_kb": 0, 00:23:57.919 "state": "online", 00:23:57.920 "raid_level": "raid1", 00:23:57.920 "superblock": true, 00:23:57.920 "num_base_bdevs": 2, 00:23:57.920 "num_base_bdevs_discovered": 1, 00:23:57.920 "num_base_bdevs_operational": 1, 00:23:57.920 "base_bdevs_list": [ 00:23:57.920 { 00:23:57.920 "name": null, 00:23:57.920 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:57.920 "is_configured": false, 00:23:57.920 "data_offset": 2048, 00:23:57.920 "data_size": 63488 00:23:57.920 }, 00:23:57.920 { 00:23:57.920 "name": "BaseBdev2", 00:23:57.920 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:57.920 "is_configured": true, 00:23:57.920 "data_offset": 2048, 00:23:57.920 "data_size": 63488 00:23:57.920 } 00:23:57.920 ] 00:23:57.920 }' 00:23:57.920 00:18:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:57.920 00:18:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:57.920 00:18:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:57.920 00:18:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:57.920 00:18:44 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:57.920 00:18:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:23:57.920 00:18:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:57.920 00:18:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:57.920 00:18:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:57.920 00:18:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:57.920 00:18:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:57.920 00:18:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:57.920 00:18:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:57.920 00:18:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:57.920 00:18:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:57.920 00:18:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:58.485 [2024-07-16 00:18:45.325260] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev1 is claimed 00:23:58.485 [2024-07-16 00:18:45.325384] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:58.485 [2024-07-16 00:18:45.325399] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:58.485 request: 00:23:58.485 { 00:23:58.485 "base_bdev": "BaseBdev1", 00:23:58.485 "raid_bdev": "raid_bdev1", 00:23:58.485 "method": "bdev_raid_add_base_bdev", 00:23:58.485 "req_id": 1 00:23:58.485 } 00:23:58.485 Got JSON-RPC error response 00:23:58.485 response: 00:23:58.485 { 00:23:58.485 "code": -22, 00:23:58.485 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:58.485 } 00:23:58.485 00:18:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:23:58.485 00:18:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:58.485 00:18:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:58.486 00:18:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:58.486 00:18:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:23:59.420 00:18:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:59.420 00:18:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:59.420 00:18:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:59.420 00:18:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:59.420 00:18:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:59.420 00:18:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:59.420 00:18:46 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:59.420 00:18:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:59.420 00:18:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:59.420 00:18:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:59.420 00:18:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.420 00:18:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:59.678 00:18:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:59.678 "name": "raid_bdev1", 00:23:59.678 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:23:59.678 "strip_size_kb": 0, 00:23:59.678 "state": "online", 00:23:59.678 "raid_level": "raid1", 00:23:59.678 "superblock": true, 00:23:59.678 "num_base_bdevs": 2, 00:23:59.679 "num_base_bdevs_discovered": 1, 00:23:59.679 "num_base_bdevs_operational": 1, 00:23:59.679 "base_bdevs_list": [ 00:23:59.679 { 00:23:59.679 "name": null, 00:23:59.679 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:59.679 "is_configured": false, 00:23:59.679 "data_offset": 2048, 00:23:59.679 "data_size": 63488 00:23:59.679 }, 00:23:59.679 { 00:23:59.679 "name": "BaseBdev2", 00:23:59.679 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:23:59.679 "is_configured": true, 00:23:59.679 "data_offset": 2048, 00:23:59.679 "data_size": 63488 00:23:59.679 } 00:23:59.679 ] 00:23:59.679 }' 00:23:59.679 00:18:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:59.679 00:18:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:00.612 00:18:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:00.612 
00:18:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:00.612 00:18:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:00.612 00:18:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:00.612 00:18:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:00.612 00:18:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:00.612 00:18:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:00.612 00:18:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:00.612 "name": "raid_bdev1", 00:24:00.612 "uuid": "1b790b8a-fc46-4ce8-a17b-6a0a75068570", 00:24:00.612 "strip_size_kb": 0, 00:24:00.612 "state": "online", 00:24:00.612 "raid_level": "raid1", 00:24:00.612 "superblock": true, 00:24:00.612 "num_base_bdevs": 2, 00:24:00.612 "num_base_bdevs_discovered": 1, 00:24:00.612 "num_base_bdevs_operational": 1, 00:24:00.612 "base_bdevs_list": [ 00:24:00.612 { 00:24:00.612 "name": null, 00:24:00.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:00.612 "is_configured": false, 00:24:00.612 "data_offset": 2048, 00:24:00.612 "data_size": 63488 00:24:00.612 }, 00:24:00.612 { 00:24:00.612 "name": "BaseBdev2", 00:24:00.612 "uuid": "2c4043e2-0327-5312-a2da-2abe06a1c5cd", 00:24:00.612 "is_configured": true, 00:24:00.612 "data_offset": 2048, 00:24:00.612 "data_size": 63488 00:24:00.612 } 00:24:00.612 ] 00:24:00.612 }' 00:24:00.612 00:18:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:00.612 00:18:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:00.612 00:18:47 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:00.870 00:18:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:00.870 00:18:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 3601214 00:24:00.870 00:18:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3601214 ']' 00:24:00.870 00:18:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 3601214 00:24:00.870 00:18:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:24:00.870 00:18:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:00.870 00:18:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3601214 00:24:00.870 00:18:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:00.870 00:18:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:00.870 00:18:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3601214' 00:24:00.870 killing process with pid 3601214 00:24:00.870 00:18:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 3601214 00:24:00.870 Received shutdown signal, test time was about 60.000000 seconds 00:24:00.870 00:24:00.870 Latency(us) 00:24:00.870 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:00.870 =================================================================================================================== 00:24:00.870 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:00.870 [2024-07-16 00:18:47.673741] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:00.870 [2024-07-16 00:18:47.673830] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:00.870 [2024-07-16 00:18:47.673871] 
bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:00.870 [2024-07-16 00:18:47.673882] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18e0260 name raid_bdev1, state offline 00:24:00.870 00:18:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 3601214 00:24:00.870 [2024-07-16 00:18:47.700696] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:01.129 00:18:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:24:01.129 00:24:01.129 real 0m37.918s 00:24:01.129 user 0m54.642s 00:24:01.129 sys 0m7.392s 00:24:01.129 00:18:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:01.129 00:18:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:01.129 ************************************ 00:24:01.129 END TEST raid_rebuild_test_sb 00:24:01.129 ************************************ 00:24:01.130 00:18:47 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:01.130 00:18:47 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:24:01.130 00:18:47 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:01.130 00:18:47 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:01.130 00:18:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:01.130 ************************************ 00:24:01.130 START TEST raid_rebuild_test_io 00:24:01.130 ************************************ 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:24:01.130 00:18:47 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=3606563 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 3606563 /var/tmp/spdk-raid.sock 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 3606563 ']' 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:01.130 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:01.130 00:18:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:01.389 [2024-07-16 00:18:48.104653] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:24:01.389 [2024-07-16 00:18:48.104788] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3606563 ] 00:24:01.389 I/O size of 3145728 is greater than zero copy threshold (65536). 
00:24:01.389 Zero copy mechanism will not be used. 00:24:01.389 [2024-07-16 00:18:48.303287] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:01.647 [2024-07-16 00:18:48.407485] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:01.647 [2024-07-16 00:18:48.465045] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:01.647 [2024-07-16 00:18:48.465081] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:01.647 00:18:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:01.647 00:18:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:24:01.647 00:18:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:01.647 00:18:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:01.905 BaseBdev1_malloc 00:24:01.905 00:18:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:02.471 [2024-07-16 00:18:49.319985] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:02.471 [2024-07-16 00:18:49.320034] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:02.471 [2024-07-16 00:18:49.320057] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa6ad40 00:24:02.471 [2024-07-16 00:18:49.320070] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:02.471 [2024-07-16 00:18:49.321818] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:02.471 [2024-07-16 00:18:49.321846] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev1 00:24:02.471 BaseBdev1 00:24:02.471 00:18:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:02.471 00:18:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:02.729 BaseBdev2_malloc 00:24:02.729 00:18:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:03.295 [2024-07-16 00:18:50.143844] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:03.295 [2024-07-16 00:18:50.143897] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:03.295 [2024-07-16 00:18:50.143935] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa6b860 00:24:03.295 [2024-07-16 00:18:50.143949] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:03.295 [2024-07-16 00:18:50.145521] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:03.295 [2024-07-16 00:18:50.145549] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:03.295 BaseBdev2 00:24:03.295 00:18:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:03.553 spare_malloc 00:24:03.553 00:18:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:03.810 spare_delay 00:24:03.810 00:18:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:04.067 [2024-07-16 00:18:50.994691] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:04.067 [2024-07-16 00:18:50.994736] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:04.067 [2024-07-16 00:18:50.994757] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc19ec0 00:24:04.067 [2024-07-16 00:18:50.994769] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:04.067 [2024-07-16 00:18:50.996378] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:04.067 [2024-07-16 00:18:50.996406] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:04.067 spare 00:24:04.067 00:18:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:04.326 [2024-07-16 00:18:51.239355] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:04.326 [2024-07-16 00:18:51.240744] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:04.326 [2024-07-16 00:18:51.240823] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc1b070 00:24:04.326 [2024-07-16 00:18:51.240834] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:04.326 [2024-07-16 00:18:51.241057] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc14490 00:24:04.326 [2024-07-16 00:18:51.241201] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc1b070 00:24:04.326 [2024-07-16 00:18:51.241211] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 
0xc1b070 00:24:04.326 [2024-07-16 00:18:51.241333] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:04.326 00:18:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:04.326 00:18:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:04.326 00:18:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:04.326 00:18:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:04.326 00:18:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:04.326 00:18:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:04.326 00:18:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:04.326 00:18:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:04.326 00:18:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:04.326 00:18:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:04.326 00:18:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.326 00:18:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:04.893 00:18:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:04.893 "name": "raid_bdev1", 00:24:04.893 "uuid": "671d1cd8-3155-4637-82de-ca73b9dede19", 00:24:04.893 "strip_size_kb": 0, 00:24:04.893 "state": "online", 00:24:04.893 "raid_level": "raid1", 00:24:04.893 "superblock": false, 00:24:04.893 "num_base_bdevs": 2, 00:24:04.893 "num_base_bdevs_discovered": 2, 00:24:04.893 "num_base_bdevs_operational": 
2, 00:24:04.893 "base_bdevs_list": [ 00:24:04.893 { 00:24:04.893 "name": "BaseBdev1", 00:24:04.893 "uuid": "a0c57a39-5c68-5c09-9247-5e93ec136b48", 00:24:04.893 "is_configured": true, 00:24:04.893 "data_offset": 0, 00:24:04.893 "data_size": 65536 00:24:04.893 }, 00:24:04.893 { 00:24:04.893 "name": "BaseBdev2", 00:24:04.893 "uuid": "3477bd02-5ea8-5a12-a6df-94b2820c07d5", 00:24:04.893 "is_configured": true, 00:24:04.893 "data_offset": 0, 00:24:04.893 "data_size": 65536 00:24:04.893 } 00:24:04.893 ] 00:24:04.893 }' 00:24:04.893 00:18:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:04.893 00:18:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:05.458 00:18:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:05.458 00:18:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:05.716 [2024-07-16 00:18:52.611234] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:05.716 00:18:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:24:05.716 00:18:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.716 00:18:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:05.974 00:18:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:24:05.974 00:18:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:24:05.974 00:18:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:05.974 00:18:52 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:06.232 [2024-07-16 00:18:53.002387] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc15bd0 00:24:06.232 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:06.232 Zero copy mechanism will not be used. 00:24:06.232 Running I/O for 60 seconds... 00:24:06.232 [2024-07-16 00:18:53.118621] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:06.232 [2024-07-16 00:18:53.126781] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xc15bd0 00:24:06.232 00:18:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:06.232 00:18:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:06.232 00:18:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:06.232 00:18:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:06.232 00:18:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:06.232 00:18:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:06.232 00:18:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:06.232 00:18:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:06.232 00:18:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:06.232 00:18:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:06.232 00:18:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:06.232 00:18:53 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.489 00:18:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:06.489 "name": "raid_bdev1", 00:24:06.489 "uuid": "671d1cd8-3155-4637-82de-ca73b9dede19", 00:24:06.489 "strip_size_kb": 0, 00:24:06.489 "state": "online", 00:24:06.489 "raid_level": "raid1", 00:24:06.489 "superblock": false, 00:24:06.489 "num_base_bdevs": 2, 00:24:06.489 "num_base_bdevs_discovered": 1, 00:24:06.489 "num_base_bdevs_operational": 1, 00:24:06.489 "base_bdevs_list": [ 00:24:06.489 { 00:24:06.489 "name": null, 00:24:06.489 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:06.489 "is_configured": false, 00:24:06.489 "data_offset": 0, 00:24:06.489 "data_size": 65536 00:24:06.489 }, 00:24:06.489 { 00:24:06.489 "name": "BaseBdev2", 00:24:06.489 "uuid": "3477bd02-5ea8-5a12-a6df-94b2820c07d5", 00:24:06.489 "is_configured": true, 00:24:06.489 "data_offset": 0, 00:24:06.489 "data_size": 65536 00:24:06.489 } 00:24:06.489 ] 00:24:06.489 }' 00:24:06.489 00:18:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:06.489 00:18:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:07.422 00:18:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:07.422 [2024-07-16 00:18:54.294971] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:07.422 [2024-07-16 00:18:54.354095] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb9d8b0 00:24:07.422 00:18:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:07.422 [2024-07-16 00:18:54.356410] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:07.681 
[2024-07-16 00:18:54.475255] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:07.681 [2024-07-16 00:18:54.475701] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:07.940 [2024-07-16 00:18:54.653934] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:08.508 00:18:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:08.508 00:18:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:08.508 00:18:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:08.508 00:18:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:08.508 00:18:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:08.508 00:18:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.508 00:18:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:08.767 00:18:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:08.767 "name": "raid_bdev1", 00:24:08.767 "uuid": "671d1cd8-3155-4637-82de-ca73b9dede19", 00:24:08.767 "strip_size_kb": 0, 00:24:08.767 "state": "online", 00:24:08.767 "raid_level": "raid1", 00:24:08.767 "superblock": false, 00:24:08.767 "num_base_bdevs": 2, 00:24:08.767 "num_base_bdevs_discovered": 2, 00:24:08.767 "num_base_bdevs_operational": 2, 00:24:08.767 "process": { 00:24:08.767 "type": "rebuild", 00:24:08.767 "target": "spare", 00:24:08.767 "progress": { 00:24:08.767 "blocks": 16384, 00:24:08.767 "percent": 25 00:24:08.767 } 
00:24:08.767 }, 00:24:08.767 "base_bdevs_list": [ 00:24:08.767 { 00:24:08.767 "name": "spare", 00:24:08.767 "uuid": "6722238c-7a88-5131-b699-3c2c125afe96", 00:24:08.767 "is_configured": true, 00:24:08.767 "data_offset": 0, 00:24:08.767 "data_size": 65536 00:24:08.767 }, 00:24:08.767 { 00:24:08.767 "name": "BaseBdev2", 00:24:08.767 "uuid": "3477bd02-5ea8-5a12-a6df-94b2820c07d5", 00:24:08.767 "is_configured": true, 00:24:08.767 "data_offset": 0, 00:24:08.767 "data_size": 65536 00:24:08.767 } 00:24:08.767 ] 00:24:08.767 }' 00:24:08.767 00:18:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:08.767 00:18:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:08.767 00:18:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:09.025 [2024-07-16 00:18:55.721102] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:09.025 00:18:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:09.025 00:18:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:09.025 [2024-07-16 00:18:55.900255] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:09.283 [2024-07-16 00:18:55.996885] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:09.283 [2024-07-16 00:18:56.128850] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:09.283 [2024-07-16 00:18:56.130453] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:09.283 [2024-07-16 00:18:56.130481] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 
00:24:09.283 [2024-07-16 00:18:56.130492] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:09.283 [2024-07-16 00:18:56.152369] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xc15bd0 00:24:09.283 00:18:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:09.283 00:18:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:09.283 00:18:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:09.283 00:18:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:09.283 00:18:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:09.283 00:18:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:09.283 00:18:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:09.283 00:18:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:09.283 00:18:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:09.283 00:18:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:09.283 00:18:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:09.283 00:18:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:09.542 00:18:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:09.543 "name": "raid_bdev1", 00:24:09.543 "uuid": "671d1cd8-3155-4637-82de-ca73b9dede19", 00:24:09.543 "strip_size_kb": 0, 00:24:09.543 "state": "online", 00:24:09.543 "raid_level": 
"raid1", 00:24:09.543 "superblock": false, 00:24:09.543 "num_base_bdevs": 2, 00:24:09.543 "num_base_bdevs_discovered": 1, 00:24:09.543 "num_base_bdevs_operational": 1, 00:24:09.543 "base_bdevs_list": [ 00:24:09.543 { 00:24:09.543 "name": null, 00:24:09.543 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:09.543 "is_configured": false, 00:24:09.543 "data_offset": 0, 00:24:09.543 "data_size": 65536 00:24:09.543 }, 00:24:09.543 { 00:24:09.543 "name": "BaseBdev2", 00:24:09.543 "uuid": "3477bd02-5ea8-5a12-a6df-94b2820c07d5", 00:24:09.543 "is_configured": true, 00:24:09.543 "data_offset": 0, 00:24:09.543 "data_size": 65536 00:24:09.543 } 00:24:09.543 ] 00:24:09.543 }' 00:24:09.543 00:18:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:09.543 00:18:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:10.110 00:18:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:10.110 00:18:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:10.110 00:18:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:10.110 00:18:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:10.110 00:18:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:10.369 00:18:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.369 00:18:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:10.629 00:18:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:10.629 "name": "raid_bdev1", 00:24:10.629 "uuid": "671d1cd8-3155-4637-82de-ca73b9dede19", 00:24:10.629 "strip_size_kb": 0, 00:24:10.629 
"state": "online", 00:24:10.629 "raid_level": "raid1", 00:24:10.629 "superblock": false, 00:24:10.629 "num_base_bdevs": 2, 00:24:10.629 "num_base_bdevs_discovered": 1, 00:24:10.629 "num_base_bdevs_operational": 1, 00:24:10.629 "base_bdevs_list": [ 00:24:10.629 { 00:24:10.629 "name": null, 00:24:10.629 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:10.629 "is_configured": false, 00:24:10.629 "data_offset": 0, 00:24:10.629 "data_size": 65536 00:24:10.629 }, 00:24:10.629 { 00:24:10.629 "name": "BaseBdev2", 00:24:10.629 "uuid": "3477bd02-5ea8-5a12-a6df-94b2820c07d5", 00:24:10.629 "is_configured": true, 00:24:10.629 "data_offset": 0, 00:24:10.629 "data_size": 65536 00:24:10.629 } 00:24:10.629 ] 00:24:10.629 }' 00:24:10.629 00:18:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:10.629 00:18:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:10.629 00:18:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:10.629 00:18:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:10.629 00:18:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:10.888 [2024-07-16 00:18:57.647377] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:10.888 00:18:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:10.888 [2024-07-16 00:18:57.723908] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc1b450 00:24:10.888 [2024-07-16 00:18:57.725399] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:11.147 [2024-07-16 00:18:57.856935] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 
offset_end: 6144 00:24:11.147 [2024-07-16 00:18:57.984353] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:11.147 [2024-07-16 00:18:57.984524] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:11.406 [2024-07-16 00:18:58.238061] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:11.406 [2024-07-16 00:18:58.238372] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:11.666 [2024-07-16 00:18:58.449617] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:11.925 00:18:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:11.925 00:18:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:11.925 00:18:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:11.925 00:18:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:11.925 00:18:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:11.925 00:18:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.925 00:18:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.925 [2024-07-16 00:18:58.829264] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:12.210 00:18:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:12.210 "name": 
"raid_bdev1", 00:24:12.210 "uuid": "671d1cd8-3155-4637-82de-ca73b9dede19", 00:24:12.210 "strip_size_kb": 0, 00:24:12.210 "state": "online", 00:24:12.210 "raid_level": "raid1", 00:24:12.210 "superblock": false, 00:24:12.210 "num_base_bdevs": 2, 00:24:12.210 "num_base_bdevs_discovered": 2, 00:24:12.210 "num_base_bdevs_operational": 2, 00:24:12.210 "process": { 00:24:12.210 "type": "rebuild", 00:24:12.210 "target": "spare", 00:24:12.210 "progress": { 00:24:12.210 "blocks": 14336, 00:24:12.210 "percent": 21 00:24:12.210 } 00:24:12.210 }, 00:24:12.210 "base_bdevs_list": [ 00:24:12.210 { 00:24:12.210 "name": "spare", 00:24:12.210 "uuid": "6722238c-7a88-5131-b699-3c2c125afe96", 00:24:12.210 "is_configured": true, 00:24:12.210 "data_offset": 0, 00:24:12.210 "data_size": 65536 00:24:12.210 }, 00:24:12.210 { 00:24:12.210 "name": "BaseBdev2", 00:24:12.210 "uuid": "3477bd02-5ea8-5a12-a6df-94b2820c07d5", 00:24:12.210 "is_configured": true, 00:24:12.210 "data_offset": 0, 00:24:12.210 "data_size": 65536 00:24:12.210 } 00:24:12.210 ] 00:24:12.210 }' 00:24:12.210 00:18:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:12.210 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:12.210 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:12.210 [2024-07-16 00:18:59.050407] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:12.210 [2024-07-16 00:18:59.050596] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:12.210 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:12.210 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:24:12.210 00:18:59 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:12.210 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:12.210 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:12.210 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=849 00:24:12.210 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:12.210 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:12.210 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:12.210 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:12.210 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:12.210 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:12.210 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.210 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:12.493 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:12.493 "name": "raid_bdev1", 00:24:12.493 "uuid": "671d1cd8-3155-4637-82de-ca73b9dede19", 00:24:12.493 "strip_size_kb": 0, 00:24:12.493 "state": "online", 00:24:12.493 "raid_level": "raid1", 00:24:12.493 "superblock": false, 00:24:12.493 "num_base_bdevs": 2, 00:24:12.493 "num_base_bdevs_discovered": 2, 00:24:12.493 "num_base_bdevs_operational": 2, 00:24:12.493 "process": { 00:24:12.493 "type": "rebuild", 00:24:12.493 "target": "spare", 00:24:12.493 "progress": { 00:24:12.493 "blocks": 18432, 00:24:12.493 "percent": 28 
00:24:12.493 } 00:24:12.493 }, 00:24:12.493 "base_bdevs_list": [ 00:24:12.493 { 00:24:12.493 "name": "spare", 00:24:12.493 "uuid": "6722238c-7a88-5131-b699-3c2c125afe96", 00:24:12.493 "is_configured": true, 00:24:12.493 "data_offset": 0, 00:24:12.493 "data_size": 65536 00:24:12.493 }, 00:24:12.493 { 00:24:12.493 "name": "BaseBdev2", 00:24:12.493 "uuid": "3477bd02-5ea8-5a12-a6df-94b2820c07d5", 00:24:12.493 "is_configured": true, 00:24:12.493 "data_offset": 0, 00:24:12.493 "data_size": 65536 00:24:12.493 } 00:24:12.493 ] 00:24:12.493 }' 00:24:12.493 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:12.493 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:12.493 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:12.493 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:12.493 00:18:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:12.752 [2024-07-16 00:18:59.645171] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:24:12.753 [2024-07-16 00:18:59.645569] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:24:13.012 [2024-07-16 00:18:59.874085] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:24:13.271 [2024-07-16 00:19:00.178488] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:24:13.271 [2024-07-16 00:19:00.178732] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:24:13.530 00:19:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( 
SECONDS < timeout )) 00:24:13.530 00:19:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:13.530 00:19:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:13.530 00:19:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:13.530 00:19:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:13.530 00:19:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:13.530 00:19:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.530 00:19:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:13.789 [2024-07-16 00:19:00.619480] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:24:13.789 00:19:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:13.789 "name": "raid_bdev1", 00:24:13.789 "uuid": "671d1cd8-3155-4637-82de-ca73b9dede19", 00:24:13.789 "strip_size_kb": 0, 00:24:13.789 "state": "online", 00:24:13.789 "raid_level": "raid1", 00:24:13.789 "superblock": false, 00:24:13.789 "num_base_bdevs": 2, 00:24:13.789 "num_base_bdevs_discovered": 2, 00:24:13.789 "num_base_bdevs_operational": 2, 00:24:13.789 "process": { 00:24:13.789 "type": "rebuild", 00:24:13.789 "target": "spare", 00:24:13.789 "progress": { 00:24:13.789 "blocks": 40960, 00:24:13.789 "percent": 62 00:24:13.789 } 00:24:13.789 }, 00:24:13.789 "base_bdevs_list": [ 00:24:13.789 { 00:24:13.789 "name": "spare", 00:24:13.789 "uuid": "6722238c-7a88-5131-b699-3c2c125afe96", 00:24:13.789 "is_configured": true, 00:24:13.789 "data_offset": 0, 00:24:13.789 "data_size": 65536 00:24:13.789 }, 
00:24:13.789 { 00:24:13.789 "name": "BaseBdev2", 00:24:13.789 "uuid": "3477bd02-5ea8-5a12-a6df-94b2820c07d5", 00:24:13.789 "is_configured": true, 00:24:13.789 "data_offset": 0, 00:24:13.789 "data_size": 65536 00:24:13.789 } 00:24:13.789 ] 00:24:13.789 }' 00:24:13.789 00:19:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:13.789 00:19:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:13.789 00:19:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:14.049 00:19:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:14.049 00:19:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:14.988 [2024-07-16 00:19:01.624204] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:24:14.988 00:19:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:14.988 00:19:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:14.988 00:19:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:14.988 00:19:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:14.988 00:19:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:14.988 00:19:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:14.988 00:19:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:14.988 00:19:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.248 [2024-07-16 
00:19:01.974389] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:15.248 00:19:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:15.248 "name": "raid_bdev1", 00:24:15.248 "uuid": "671d1cd8-3155-4637-82de-ca73b9dede19", 00:24:15.248 "strip_size_kb": 0, 00:24:15.248 "state": "online", 00:24:15.248 "raid_level": "raid1", 00:24:15.248 "superblock": false, 00:24:15.248 "num_base_bdevs": 2, 00:24:15.248 "num_base_bdevs_discovered": 2, 00:24:15.248 "num_base_bdevs_operational": 2, 00:24:15.248 "process": { 00:24:15.248 "type": "rebuild", 00:24:15.248 "target": "spare", 00:24:15.248 "progress": { 00:24:15.248 "blocks": 65536, 00:24:15.248 "percent": 100 00:24:15.248 } 00:24:15.248 }, 00:24:15.248 "base_bdevs_list": [ 00:24:15.248 { 00:24:15.248 "name": "spare", 00:24:15.248 "uuid": "6722238c-7a88-5131-b699-3c2c125afe96", 00:24:15.248 "is_configured": true, 00:24:15.248 "data_offset": 0, 00:24:15.248 "data_size": 65536 00:24:15.248 }, 00:24:15.248 { 00:24:15.248 "name": "BaseBdev2", 00:24:15.248 "uuid": "3477bd02-5ea8-5a12-a6df-94b2820c07d5", 00:24:15.248 "is_configured": true, 00:24:15.248 "data_offset": 0, 00:24:15.248 "data_size": 65536 00:24:15.248 } 00:24:15.248 ] 00:24:15.248 }' 00:24:15.248 00:19:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:15.248 00:19:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:15.248 00:19:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:15.248 [2024-07-16 00:19:02.082700] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:15.248 [2024-07-16 00:19:02.085036] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:15.248 00:19:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:15.248 
00:19:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:16.186 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:16.186 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:16.186 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:16.186 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:16.186 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:16.186 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:16.186 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.186 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:16.444 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:16.444 "name": "raid_bdev1", 00:24:16.444 "uuid": "671d1cd8-3155-4637-82de-ca73b9dede19", 00:24:16.444 "strip_size_kb": 0, 00:24:16.444 "state": "online", 00:24:16.444 "raid_level": "raid1", 00:24:16.444 "superblock": false, 00:24:16.444 "num_base_bdevs": 2, 00:24:16.444 "num_base_bdevs_discovered": 2, 00:24:16.444 "num_base_bdevs_operational": 2, 00:24:16.444 "base_bdevs_list": [ 00:24:16.444 { 00:24:16.444 "name": "spare", 00:24:16.444 "uuid": "6722238c-7a88-5131-b699-3c2c125afe96", 00:24:16.444 "is_configured": true, 00:24:16.444 "data_offset": 0, 00:24:16.444 "data_size": 65536 00:24:16.444 }, 00:24:16.444 { 00:24:16.444 "name": "BaseBdev2", 00:24:16.444 "uuid": "3477bd02-5ea8-5a12-a6df-94b2820c07d5", 00:24:16.444 "is_configured": true, 00:24:16.444 "data_offset": 0, 00:24:16.444 "data_size": 65536 
00:24:16.444 } 00:24:16.444 ] 00:24:16.444 }' 00:24:16.444 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:16.444 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:16.726 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:16.726 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:16.726 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:24:16.726 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:16.726 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:16.726 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:16.726 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:16.726 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:16.726 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.726 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:16.986 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:16.986 "name": "raid_bdev1", 00:24:16.986 "uuid": "671d1cd8-3155-4637-82de-ca73b9dede19", 00:24:16.986 "strip_size_kb": 0, 00:24:16.986 "state": "online", 00:24:16.986 "raid_level": "raid1", 00:24:16.986 "superblock": false, 00:24:16.986 "num_base_bdevs": 2, 00:24:16.986 "num_base_bdevs_discovered": 2, 00:24:16.986 "num_base_bdevs_operational": 2, 00:24:16.986 "base_bdevs_list": [ 00:24:16.986 { 00:24:16.986 "name": 
"spare", 00:24:16.986 "uuid": "6722238c-7a88-5131-b699-3c2c125afe96", 00:24:16.986 "is_configured": true, 00:24:16.986 "data_offset": 0, 00:24:16.986 "data_size": 65536 00:24:16.986 }, 00:24:16.986 { 00:24:16.986 "name": "BaseBdev2", 00:24:16.986 "uuid": "3477bd02-5ea8-5a12-a6df-94b2820c07d5", 00:24:16.986 "is_configured": true, 00:24:16.986 "data_offset": 0, 00:24:16.986 "data_size": 65536 00:24:16.986 } 00:24:16.986 ] 00:24:16.986 }' 00:24:16.986 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:16.986 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:16.986 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:16.986 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:16.986 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:16.986 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:16.986 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:16.986 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:16.986 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:16.986 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:16.986 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:16.986 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:16.986 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:16.986 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:16.986 
00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:16.986 00:19:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:17.245 00:19:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:17.245 "name": "raid_bdev1", 00:24:17.245 "uuid": "671d1cd8-3155-4637-82de-ca73b9dede19", 00:24:17.245 "strip_size_kb": 0, 00:24:17.245 "state": "online", 00:24:17.245 "raid_level": "raid1", 00:24:17.245 "superblock": false, 00:24:17.245 "num_base_bdevs": 2, 00:24:17.245 "num_base_bdevs_discovered": 2, 00:24:17.245 "num_base_bdevs_operational": 2, 00:24:17.245 "base_bdevs_list": [ 00:24:17.245 { 00:24:17.245 "name": "spare", 00:24:17.245 "uuid": "6722238c-7a88-5131-b699-3c2c125afe96", 00:24:17.245 "is_configured": true, 00:24:17.245 "data_offset": 0, 00:24:17.246 "data_size": 65536 00:24:17.246 }, 00:24:17.246 { 00:24:17.246 "name": "BaseBdev2", 00:24:17.246 "uuid": "3477bd02-5ea8-5a12-a6df-94b2820c07d5", 00:24:17.246 "is_configured": true, 00:24:17.246 "data_offset": 0, 00:24:17.246 "data_size": 65536 00:24:17.246 } 00:24:17.246 ] 00:24:17.246 }' 00:24:17.246 00:19:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:17.246 00:19:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:18.185 00:19:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:18.185 [2024-07-16 00:19:05.105091] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:18.185 [2024-07-16 00:19:05.105126] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:18.444 00:24:18.444 Latency(us) 00:24:18.444 Device Information : 
runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:18.444 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:24:18.444 raid_bdev1 : 12.13 100.78 302.35 0.00 0.00 14470.25 304.53 118534.68 00:24:18.444 =================================================================================================================== 00:24:18.444 Total : 100.78 302.35 0.00 0.00 14470.25 304.53 118534.68 00:24:18.444 [2024-07-16 00:19:05.173388] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:18.444 [2024-07-16 00:19:05.173418] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:18.444 [2024-07-16 00:19:05.173490] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:18.445 [2024-07-16 00:19:05.173503] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc1b070 name raid_bdev1, state offline 00:24:18.445 0 00:24:18.445 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:18.445 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:24:18.704 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:18.704 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:18.704 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:24:18.704 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:24:18.704 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:18.704 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:24:18.704 00:19:05 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@10 -- # local bdev_list 00:24:18.704 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:18.704 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:18.704 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:24:18.704 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:18.704 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:18.704 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:24:18.963 /dev/nbd0 00:24:18.963 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:24:18.964 1+0 records in 00:24:18.964 1+0 records out 00:24:18.964 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000307652 s, 13.3 MB/s 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:24:18.964 00:19:05 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:18.964 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:24:19.224 /dev/nbd1 00:24:19.224 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:19.224 00:19:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:19.224 00:19:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:19.224 00:19:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:24:19.224 00:19:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:19.224 00:19:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:19.224 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:19.224 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:24:19.224 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:19.224 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:19.224 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:19.224 1+0 records in 00:24:19.224 1+0 records out 00:24:19.224 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0002949 s, 13.9 MB/s 00:24:19.224 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:19.224 00:19:06 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:24:19.224 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:19.224 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:19.224 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:24:19.224 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:19.224 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:19.224 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:24:19.224 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:19.224 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:19.224 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:19.224 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:19.224 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:24:19.224 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:19.224 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:19.484 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:19.484 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:19.484 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:19.484 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- 
# (( i = 1 )) 00:24:19.484 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:19.484 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:19.484 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:24:19.484 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:19.484 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:19.484 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:19.484 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:19.484 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:19.484 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:24:19.484 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:19.484 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:19.743 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:19.743 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:19.743 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:19.743 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:19.743 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:19.743 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:19.743 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:24:19.743 00:19:06 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:19.743 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:24:19.743 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 3606563 00:24:19.743 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 3606563 ']' 00:24:19.743 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 3606563 00:24:19.743 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:24:19.743 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:19.743 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3606563 00:24:19.743 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:19.743 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:19.743 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3606563' 00:24:19.744 killing process with pid 3606563 00:24:19.744 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 3606563 00:24:19.744 Received shutdown signal, test time was about 13.604159 seconds 00:24:19.744 00:24:19.744 Latency(us) 00:24:19.744 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:19.744 =================================================================================================================== 00:24:19.744 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:19.744 [2024-07-16 00:19:06.642130] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:19.744 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 3606563 00:24:19.744 [2024-07-16 00:19:06.664475] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:20.003 00:19:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:24:20.003 00:24:20.003 real 0m18.900s 00:24:20.003 user 0m29.463s 00:24:20.003 sys 0m3.049s 00:24:20.003 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:20.003 00:19:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:20.003 ************************************ 00:24:20.003 END TEST raid_rebuild_test_io 00:24:20.003 ************************************ 00:24:20.003 00:19:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:20.003 00:19:06 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:24:20.003 00:19:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:20.003 00:19:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:20.003 00:19:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:20.262 ************************************ 00:24:20.263 START TEST raid_rebuild_test_sb_io 00:24:20.263 ************************************ 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:20.263 
00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # 
raid_pid=3609251 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 3609251 /var/tmp/spdk-raid.sock 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 3609251 ']' 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:20.263 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:20.263 00:19:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:20.263 [2024-07-16 00:19:07.051202] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:24:20.263 [2024-07-16 00:19:07.051277] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3609251 ] 00:24:20.263 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:20.263 Zero copy mechanism will not be used. 
00:24:20.263 [2024-07-16 00:19:07.180271] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:20.522 [2024-07-16 00:19:07.285202] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:20.522 [2024-07-16 00:19:07.348631] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:20.522 [2024-07-16 00:19:07.348672] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:21.089 00:19:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:21.089 00:19:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:24:21.089 00:19:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:21.089 00:19:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:21.348 BaseBdev1_malloc 00:24:21.348 00:19:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:21.607 [2024-07-16 00:19:08.333805] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:21.607 [2024-07-16 00:19:08.333857] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:21.607 [2024-07-16 00:19:08.333878] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2510d40 00:24:21.607 [2024-07-16 00:19:08.333891] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:21.607 [2024-07-16 00:19:08.335479] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:21.607 [2024-07-16 00:19:08.335509] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:21.607 
BaseBdev1 00:24:21.607 00:19:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:21.607 00:19:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:21.607 BaseBdev2_malloc 00:24:21.607 00:19:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:21.866 [2024-07-16 00:19:08.699782] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:21.866 [2024-07-16 00:19:08.699832] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:21.866 [2024-07-16 00:19:08.699856] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2511860 00:24:21.866 [2024-07-16 00:19:08.699868] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:21.866 [2024-07-16 00:19:08.701327] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:21.866 [2024-07-16 00:19:08.701356] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:21.866 BaseBdev2 00:24:21.866 00:19:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:22.125 spare_malloc 00:24:22.125 00:19:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:22.384 spare_delay 00:24:22.384 00:19:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:22.384 [2024-07-16 00:19:09.269993] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:22.384 [2024-07-16 00:19:09.270044] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:22.384 [2024-07-16 00:19:09.270064] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26bfec0 00:24:22.384 [2024-07-16 00:19:09.270077] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:22.384 [2024-07-16 00:19:09.271545] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:22.384 [2024-07-16 00:19:09.271573] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:22.384 spare 00:24:22.384 00:19:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:22.643 [2024-07-16 00:19:09.530708] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:22.643 [2024-07-16 00:19:09.531966] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:22.643 [2024-07-16 00:19:09.532137] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26c1070 00:24:22.643 [2024-07-16 00:19:09.532150] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:22.643 [2024-07-16 00:19:09.532343] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26ba490 00:24:22.643 [2024-07-16 00:19:09.532480] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26c1070 00:24:22.643 [2024-07-16 00:19:09.532491] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x26c1070 00:24:22.643 [2024-07-16 00:19:09.532586] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:22.643 00:19:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:22.643 00:19:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:22.643 00:19:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:22.643 00:19:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:22.643 00:19:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:22.643 00:19:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:22.643 00:19:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:22.643 00:19:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:22.643 00:19:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:22.643 00:19:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:22.643 00:19:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.643 00:19:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:22.901 00:19:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:22.901 "name": "raid_bdev1", 00:24:22.901 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:22.901 "strip_size_kb": 0, 00:24:22.901 "state": "online", 00:24:22.901 "raid_level": "raid1", 00:24:22.901 "superblock": true, 00:24:22.901 "num_base_bdevs": 2, 00:24:22.901 
"num_base_bdevs_discovered": 2, 00:24:22.901 "num_base_bdevs_operational": 2, 00:24:22.901 "base_bdevs_list": [ 00:24:22.901 { 00:24:22.901 "name": "BaseBdev1", 00:24:22.901 "uuid": "4490714a-e54f-5c7f-b930-666404cd5918", 00:24:22.901 "is_configured": true, 00:24:22.901 "data_offset": 2048, 00:24:22.901 "data_size": 63488 00:24:22.901 }, 00:24:22.901 { 00:24:22.901 "name": "BaseBdev2", 00:24:22.901 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:22.901 "is_configured": true, 00:24:22.901 "data_offset": 2048, 00:24:22.901 "data_size": 63488 00:24:22.901 } 00:24:22.901 ] 00:24:22.901 }' 00:24:22.901 00:19:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:22.901 00:19:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:23.468 00:19:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:23.468 00:19:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:23.726 [2024-07-16 00:19:10.505529] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:23.726 00:19:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:24:23.726 00:19:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.726 00:19:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:23.985 00:19:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:24:23.985 00:19:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:24:23.985 00:19:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:23.985 00:19:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:23.985 [2024-07-16 00:19:10.888375] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26c1c50 00:24:23.985 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:23.985 Zero copy mechanism will not be used. 00:24:23.985 Running I/O for 60 seconds... 00:24:24.244 [2024-07-16 00:19:11.015277] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:24.244 [2024-07-16 00:19:11.031739] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x26c1c50 00:24:24.244 00:19:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:24.244 00:19:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:24.244 00:19:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:24.244 00:19:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:24.244 00:19:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:24.244 00:19:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:24.244 00:19:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:24.244 00:19:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:24.244 00:19:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:24.244 00:19:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 
00:24:24.244 00:19:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:24.244 00:19:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:24.503 00:19:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:24.503 "name": "raid_bdev1", 00:24:24.503 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:24.503 "strip_size_kb": 0, 00:24:24.503 "state": "online", 00:24:24.503 "raid_level": "raid1", 00:24:24.503 "superblock": true, 00:24:24.503 "num_base_bdevs": 2, 00:24:24.503 "num_base_bdevs_discovered": 1, 00:24:24.503 "num_base_bdevs_operational": 1, 00:24:24.503 "base_bdevs_list": [ 00:24:24.503 { 00:24:24.503 "name": null, 00:24:24.503 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:24.503 "is_configured": false, 00:24:24.503 "data_offset": 2048, 00:24:24.503 "data_size": 63488 00:24:24.503 }, 00:24:24.503 { 00:24:24.503 "name": "BaseBdev2", 00:24:24.503 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:24.503 "is_configured": true, 00:24:24.503 "data_offset": 2048, 00:24:24.503 "data_size": 63488 00:24:24.503 } 00:24:24.503 ] 00:24:24.503 }' 00:24:24.503 00:19:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:24.503 00:19:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:25.072 00:19:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:25.337 [2024-07-16 00:19:12.140510] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:25.337 [2024-07-16 00:19:12.192326] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2644190 00:24:25.337 00:19:12 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:25.337 [2024-07-16 00:19:12.194871] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:25.596 [2024-07-16 00:19:12.314683] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:25.596 [2024-07-16 00:19:12.315148] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:25.855 [2024-07-16 00:19:12.568858] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:25.855 [2024-07-16 00:19:12.569173] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:26.113 [2024-07-16 00:19:12.807405] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:26.113 [2024-07-16 00:19:12.807692] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:26.113 [2024-07-16 00:19:12.918838] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:26.113 [2024-07-16 00:19:12.919073] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:26.371 [2024-07-16 00:19:13.191451] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:26.371 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:26.371 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:26.371 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # 
local process_type=rebuild 00:24:26.371 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:26.371 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:26.371 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:26.371 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:26.628 [2024-07-16 00:19:13.401174] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:26.628 [2024-07-16 00:19:13.401396] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:26.628 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:26.628 "name": "raid_bdev1", 00:24:26.628 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:26.628 "strip_size_kb": 0, 00:24:26.628 "state": "online", 00:24:26.628 "raid_level": "raid1", 00:24:26.628 "superblock": true, 00:24:26.628 "num_base_bdevs": 2, 00:24:26.628 "num_base_bdevs_discovered": 2, 00:24:26.628 "num_base_bdevs_operational": 2, 00:24:26.628 "process": { 00:24:26.628 "type": "rebuild", 00:24:26.628 "target": "spare", 00:24:26.628 "progress": { 00:24:26.628 "blocks": 16384, 00:24:26.628 "percent": 25 00:24:26.628 } 00:24:26.629 }, 00:24:26.629 "base_bdevs_list": [ 00:24:26.629 { 00:24:26.629 "name": "spare", 00:24:26.629 "uuid": "8a260376-78be-587c-bf27-dcd896ac0ede", 00:24:26.629 "is_configured": true, 00:24:26.629 "data_offset": 2048, 00:24:26.629 "data_size": 63488 00:24:26.629 }, 00:24:26.629 { 00:24:26.629 "name": "BaseBdev2", 00:24:26.629 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:26.629 "is_configured": true, 00:24:26.629 "data_offset": 2048, 
00:24:26.629 "data_size": 63488 00:24:26.629 } 00:24:26.629 ] 00:24:26.629 }' 00:24:26.629 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:26.629 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:26.629 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:26.629 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:26.629 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:26.887 [2024-07-16 00:19:13.731685] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:26.887 [2024-07-16 00:19:13.760191] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:27.146 [2024-07-16 00:19:13.869603] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:27.146 [2024-07-16 00:19:13.879414] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:27.146 [2024-07-16 00:19:13.879443] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:27.146 [2024-07-16 00:19:13.879453] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:27.146 [2024-07-16 00:19:13.894409] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x26c1c50 00:24:27.146 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:27.146 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:27.146 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:27.146 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:27.146 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:27.146 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:27.146 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:27.146 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:27.146 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:27.146 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:27.146 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:27.146 00:19:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:27.405 00:19:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:27.405 "name": "raid_bdev1", 00:24:27.405 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:27.405 "strip_size_kb": 0, 00:24:27.405 "state": "online", 00:24:27.405 "raid_level": "raid1", 00:24:27.405 "superblock": true, 00:24:27.405 "num_base_bdevs": 2, 00:24:27.405 "num_base_bdevs_discovered": 1, 00:24:27.405 "num_base_bdevs_operational": 1, 00:24:27.405 "base_bdevs_list": [ 00:24:27.405 { 00:24:27.405 "name": null, 00:24:27.405 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:27.405 "is_configured": false, 00:24:27.405 "data_offset": 2048, 00:24:27.405 "data_size": 63488 00:24:27.405 }, 00:24:27.405 { 00:24:27.405 "name": "BaseBdev2", 00:24:27.405 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:27.405 
"is_configured": true, 00:24:27.405 "data_offset": 2048, 00:24:27.405 "data_size": 63488 00:24:27.405 } 00:24:27.405 ] 00:24:27.405 }' 00:24:27.405 00:19:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:27.405 00:19:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:27.984 00:19:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:27.984 00:19:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:27.984 00:19:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:27.984 00:19:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:27.984 00:19:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:27.984 00:19:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:27.984 00:19:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.242 00:19:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:28.242 "name": "raid_bdev1", 00:24:28.242 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:28.242 "strip_size_kb": 0, 00:24:28.242 "state": "online", 00:24:28.242 "raid_level": "raid1", 00:24:28.242 "superblock": true, 00:24:28.242 "num_base_bdevs": 2, 00:24:28.242 "num_base_bdevs_discovered": 1, 00:24:28.242 "num_base_bdevs_operational": 1, 00:24:28.242 "base_bdevs_list": [ 00:24:28.242 { 00:24:28.242 "name": null, 00:24:28.242 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:28.242 "is_configured": false, 00:24:28.242 "data_offset": 2048, 00:24:28.242 "data_size": 63488 00:24:28.242 }, 00:24:28.242 { 00:24:28.242 "name": 
"BaseBdev2", 00:24:28.242 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:28.242 "is_configured": true, 00:24:28.242 "data_offset": 2048, 00:24:28.242 "data_size": 63488 00:24:28.242 } 00:24:28.242 ] 00:24:28.242 }' 00:24:28.242 00:19:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:28.242 00:19:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:28.242 00:19:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:28.500 00:19:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:28.501 00:19:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:28.501 [2024-07-16 00:19:15.440790] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:28.758 00:19:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:28.758 [2024-07-16 00:19:15.491639] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2647f40 00:24:28.758 [2024-07-16 00:19:15.493101] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:28.758 [2024-07-16 00:19:15.641734] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:29.015 [2024-07-16 00:19:15.769356] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:29.015 [2024-07-16 00:19:15.769557] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:29.273 [2024-07-16 00:19:16.204971] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 
offset_end: 12288 00:24:29.273 [2024-07-16 00:19:16.205164] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:29.841 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:29.841 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:29.841 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:29.841 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:29.841 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:29.841 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.841 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:29.841 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:29.841 "name": "raid_bdev1", 00:24:29.841 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:29.841 "strip_size_kb": 0, 00:24:29.841 "state": "online", 00:24:29.841 "raid_level": "raid1", 00:24:29.841 "superblock": true, 00:24:29.841 "num_base_bdevs": 2, 00:24:29.841 "num_base_bdevs_discovered": 2, 00:24:29.841 "num_base_bdevs_operational": 2, 00:24:29.841 "process": { 00:24:29.841 "type": "rebuild", 00:24:29.841 "target": "spare", 00:24:29.841 "progress": { 00:24:29.841 "blocks": 18432, 00:24:29.841 "percent": 29 00:24:29.841 } 00:24:29.841 }, 00:24:29.841 "base_bdevs_list": [ 00:24:29.841 { 00:24:29.841 "name": "spare", 00:24:29.841 "uuid": "8a260376-78be-587c-bf27-dcd896ac0ede", 00:24:29.841 "is_configured": true, 00:24:29.841 "data_offset": 2048, 00:24:29.841 "data_size": 63488 
00:24:29.841 }, 00:24:29.841 { 00:24:29.841 "name": "BaseBdev2", 00:24:29.841 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:29.841 "is_configured": true, 00:24:29.841 "data_offset": 2048, 00:24:29.841 "data_size": 63488 00:24:29.841 } 00:24:29.841 ] 00:24:29.841 }' 00:24:29.841 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:29.841 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:29.841 [2024-07-16 00:19:16.783794] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:29.841 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:30.101 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:30.101 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:24:30.101 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:24:30.101 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:24:30.101 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:30.101 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:30.101 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:30.101 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=866 00:24:30.101 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:30.101 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:30.101 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:30.101 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:30.101 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:30.101 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:30.101 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:30.101 00:19:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:30.101 [2024-07-16 00:19:17.011009] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:30.360 00:19:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:30.360 "name": "raid_bdev1", 00:24:30.360 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:30.360 "strip_size_kb": 0, 00:24:30.360 "state": "online", 00:24:30.360 "raid_level": "raid1", 00:24:30.360 "superblock": true, 00:24:30.360 "num_base_bdevs": 2, 00:24:30.360 "num_base_bdevs_discovered": 2, 00:24:30.360 "num_base_bdevs_operational": 2, 00:24:30.360 "process": { 00:24:30.360 "type": "rebuild", 00:24:30.360 "target": "spare", 00:24:30.360 "progress": { 00:24:30.360 "blocks": 22528, 00:24:30.360 "percent": 35 00:24:30.360 } 00:24:30.360 }, 00:24:30.360 "base_bdevs_list": [ 00:24:30.360 { 00:24:30.360 "name": "spare", 00:24:30.360 "uuid": "8a260376-78be-587c-bf27-dcd896ac0ede", 00:24:30.360 "is_configured": true, 00:24:30.360 "data_offset": 2048, 00:24:30.360 "data_size": 63488 00:24:30.360 }, 00:24:30.360 { 00:24:30.360 "name": "BaseBdev2", 00:24:30.360 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:30.360 "is_configured": true, 00:24:30.360 "data_offset": 2048, 00:24:30.360 
"data_size": 63488 00:24:30.360 } 00:24:30.360 ] 00:24:30.360 }' 00:24:30.360 00:19:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:30.360 00:19:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:30.360 00:19:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:30.360 00:19:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:30.360 00:19:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:30.360 [2024-07-16 00:19:17.248132] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:24:30.619 [2024-07-16 00:19:17.462396] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:24:31.188 [2024-07-16 00:19:17.844141] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:24:31.188 [2024-07-16 00:19:17.964189] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:24:31.447 00:19:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:31.447 00:19:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:31.448 00:19:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:31.448 00:19:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:31.448 00:19:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:31.448 00:19:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:31.448 00:19:18 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.448 00:19:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:31.747 00:19:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:31.747 "name": "raid_bdev1", 00:24:31.747 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:31.747 "strip_size_kb": 0, 00:24:31.747 "state": "online", 00:24:31.747 "raid_level": "raid1", 00:24:31.747 "superblock": true, 00:24:31.747 "num_base_bdevs": 2, 00:24:31.747 "num_base_bdevs_discovered": 2, 00:24:31.747 "num_base_bdevs_operational": 2, 00:24:31.747 "process": { 00:24:31.747 "type": "rebuild", 00:24:31.747 "target": "spare", 00:24:31.747 "progress": { 00:24:31.747 "blocks": 38912, 00:24:31.747 "percent": 61 00:24:31.747 } 00:24:31.747 }, 00:24:31.747 "base_bdevs_list": [ 00:24:31.747 { 00:24:31.747 "name": "spare", 00:24:31.747 "uuid": "8a260376-78be-587c-bf27-dcd896ac0ede", 00:24:31.747 "is_configured": true, 00:24:31.747 "data_offset": 2048, 00:24:31.747 "data_size": 63488 00:24:31.747 }, 00:24:31.747 { 00:24:31.747 "name": "BaseBdev2", 00:24:31.747 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:31.747 "is_configured": true, 00:24:31.747 "data_offset": 2048, 00:24:31.747 "data_size": 63488 00:24:31.747 } 00:24:31.747 ] 00:24:31.747 }' 00:24:31.747 00:19:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:31.747 00:19:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:31.747 00:19:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:31.747 00:19:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:31.747 00:19:18 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:32.006 [2024-07-16 00:19:18.797697] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:24:32.266 [2024-07-16 00:19:19.026462] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:24:32.525 [2024-07-16 00:19:19.236856] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:24:32.785 00:19:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:32.785 00:19:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:32.785 00:19:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:32.785 00:19:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:32.785 00:19:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:32.785 00:19:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:32.785 00:19:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:32.785 00:19:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.044 00:19:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:33.044 "name": "raid_bdev1", 00:24:33.044 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:33.044 "strip_size_kb": 0, 00:24:33.044 "state": "online", 00:24:33.044 "raid_level": "raid1", 00:24:33.044 "superblock": true, 00:24:33.044 "num_base_bdevs": 2, 00:24:33.044 "num_base_bdevs_discovered": 2, 
00:24:33.044 "num_base_bdevs_operational": 2, 00:24:33.044 "process": { 00:24:33.044 "type": "rebuild", 00:24:33.044 "target": "spare", 00:24:33.044 "progress": { 00:24:33.044 "blocks": 59392, 00:24:33.044 "percent": 93 00:24:33.044 } 00:24:33.044 }, 00:24:33.044 "base_bdevs_list": [ 00:24:33.044 { 00:24:33.044 "name": "spare", 00:24:33.044 "uuid": "8a260376-78be-587c-bf27-dcd896ac0ede", 00:24:33.044 "is_configured": true, 00:24:33.044 "data_offset": 2048, 00:24:33.044 "data_size": 63488 00:24:33.044 }, 00:24:33.044 { 00:24:33.044 "name": "BaseBdev2", 00:24:33.044 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:33.044 "is_configured": true, 00:24:33.044 "data_offset": 2048, 00:24:33.044 "data_size": 63488 00:24:33.044 } 00:24:33.044 ] 00:24:33.044 }' 00:24:33.044 00:19:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:33.044 00:19:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:33.044 00:19:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:33.044 [2024-07-16 00:19:19.843381] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:33.044 00:19:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:33.044 00:19:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:33.044 [2024-07-16 00:19:19.921129] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:33.044 [2024-07-16 00:19:19.922765] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:33.981 00:19:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:33.981 00:19:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:33.981 00:19:20 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:33.981 00:19:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:33.981 00:19:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:33.981 00:19:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:33.981 00:19:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.981 00:19:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:34.240 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:34.240 "name": "raid_bdev1", 00:24:34.240 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:34.240 "strip_size_kb": 0, 00:24:34.240 "state": "online", 00:24:34.240 "raid_level": "raid1", 00:24:34.240 "superblock": true, 00:24:34.240 "num_base_bdevs": 2, 00:24:34.240 "num_base_bdevs_discovered": 2, 00:24:34.240 "num_base_bdevs_operational": 2, 00:24:34.240 "base_bdevs_list": [ 00:24:34.240 { 00:24:34.240 "name": "spare", 00:24:34.240 "uuid": "8a260376-78be-587c-bf27-dcd896ac0ede", 00:24:34.240 "is_configured": true, 00:24:34.240 "data_offset": 2048, 00:24:34.240 "data_size": 63488 00:24:34.240 }, 00:24:34.240 { 00:24:34.240 "name": "BaseBdev2", 00:24:34.240 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:34.240 "is_configured": true, 00:24:34.240 "data_offset": 2048, 00:24:34.240 "data_size": 63488 00:24:34.240 } 00:24:34.240 ] 00:24:34.240 }' 00:24:34.240 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:34.240 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:34.240 00:19:21 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:34.499 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:34.499 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:24:34.499 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:34.499 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:34.499 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:34.499 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:34.499 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:34.499 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.499 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:34.499 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:34.499 "name": "raid_bdev1", 00:24:34.499 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:34.499 "strip_size_kb": 0, 00:24:34.499 "state": "online", 00:24:34.499 "raid_level": "raid1", 00:24:34.499 "superblock": true, 00:24:34.499 "num_base_bdevs": 2, 00:24:34.499 "num_base_bdevs_discovered": 2, 00:24:34.499 "num_base_bdevs_operational": 2, 00:24:34.499 "base_bdevs_list": [ 00:24:34.499 { 00:24:34.499 "name": "spare", 00:24:34.499 "uuid": "8a260376-78be-587c-bf27-dcd896ac0ede", 00:24:34.499 "is_configured": true, 00:24:34.499 "data_offset": 2048, 00:24:34.499 "data_size": 63488 00:24:34.499 }, 00:24:34.499 { 00:24:34.499 "name": "BaseBdev2", 00:24:34.499 "uuid": 
"8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:34.499 "is_configured": true, 00:24:34.499 "data_offset": 2048, 00:24:34.499 "data_size": 63488 00:24:34.499 } 00:24:34.499 ] 00:24:34.499 }' 00:24:34.499 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:34.758 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:34.758 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:34.758 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:34.758 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:34.758 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:34.758 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:34.758 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:34.758 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:34.758 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:34.758 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:34.758 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:34.758 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:34.758 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:34.758 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.758 00:19:21 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:24:35.017 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:24:35.017 "name": "raid_bdev1",
00:24:35.017 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5",
00:24:35.017 "strip_size_kb": 0,
00:24:35.017 "state": "online",
00:24:35.017 "raid_level": "raid1",
00:24:35.017 "superblock": true,
00:24:35.017 "num_base_bdevs": 2,
00:24:35.017 "num_base_bdevs_discovered": 2,
00:24:35.017 "num_base_bdevs_operational": 2,
00:24:35.017 "base_bdevs_list": [
00:24:35.017 {
00:24:35.017 "name": "spare",
00:24:35.017 "uuid": "8a260376-78be-587c-bf27-dcd896ac0ede",
00:24:35.017 "is_configured": true,
00:24:35.017 "data_offset": 2048,
00:24:35.017 "data_size": 63488
00:24:35.017 },
00:24:35.017 {
00:24:35.017 "name": "BaseBdev2",
00:24:35.017 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56",
00:24:35.017 "is_configured": true,
00:24:35.017 "data_offset": 2048,
00:24:35.017 "data_size": 63488
00:24:35.017 }
00:24:35.017 ]
00:24:35.017 }'
00:24:35.017 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:24:35.017 00:19:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x
00:24:35.584 00:19:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:24:35.843 [2024-07-16 00:19:22.614019] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:24:35.843 [2024-07-16 00:19:22.614058] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:24:35.843
00:24:35.843 Latency(us)
00:24:35.843 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:24:35.843 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728)
00:24:35.843 raid_bdev1 : 
11.75 105.49 316.47 0.00 0.00 13052.10 284.94 118534.68 00:24:35.843 =================================================================================================================== 00:24:35.843 Total : 105.49 316.47 0.00 0.00 13052.10 284.94 118534.68 00:24:35.843 [2024-07-16 00:19:22.678221] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:35.843 [2024-07-16 00:19:22.678251] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:35.843 [2024-07-16 00:19:22.678325] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:35.843 [2024-07-16 00:19:22.678338] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26c1070 name raid_bdev1, state offline 00:24:35.843 0 00:24:35.843 00:19:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:35.843 00:19:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:24:36.100 00:19:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:36.100 00:19:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:36.100 00:19:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:24:36.100 00:19:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:24:36.100 00:19:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:36.101 00:19:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:24:36.101 00:19:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:36.101 00:19:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 
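One genuine script bug is captured earlier in this log: `/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected`, emitted when the xtraced test `'[' = false ']'` ran with its left-hand variable expanded to nothing. The following is a minimal, generic bash repro of that failure mode together with the two usual fixes (quoting, or bash's `[[ ]]`); it is an illustrative sketch, not the SPDK test code itself, and the names `var`, `broken`, `quoted`, `dbracket` are invented for the demo:

```shell
#!/usr/bin/env bash
# Repro of the "[: =: unary operator expected" error seen at bdev_raid.sh
# line 665: with var empty and unquoted, '[' receives only '= false' and
# cannot parse it (it errors out with exit status 2).
var=""
if [ $var = false ] 2>/dev/null; then
    broken=yes
else
    broken=no   # the malformed test fails, so execution lands here
fi

# Fix 1: quote the expansion so '[' always sees all three operands.
[ "$var" = false ] && quoted=match || quoted=nomatch

# Fix 2: use [[ ]], which neither word-splits nor requires quoting.
[[ $var == false ]] && dbracket=match || dbracket=nomatch

echo "$broken $quoted $dbracket"
```

In the log the run simply continues past the error, consistent with the failed `[` being used as a branch condition rather than under `set -e`.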
00:24:36.101 00:19:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:36.101 00:19:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:24:36.101 00:19:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:36.101 00:19:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:36.101 00:19:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:24:36.359 /dev/nbd0 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:36.359 1+0 records in 00:24:36.359 1+0 records out 00:24:36.359 4096 bytes (4.1 kB, 
4.0 KiB) copied, 0.000316181 s, 13.0 MB/s 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i 
= 0 )) 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:36.359 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:24:36.616 /dev/nbd1 00:24:36.616 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:36.616 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:36.616 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:36.616 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:24:36.616 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:36.616 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:36.616 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:36.616 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:24:36.616 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:36.616 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:36.616 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:36.616 1+0 records in 00:24:36.616 1+0 records out 00:24:36.616 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284759 s, 14.4 MB/s 00:24:36.616 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:36.616 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@884 -- # size=4096 00:24:36.616 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:36.616 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:36.616 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:24:36.616 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:36.616 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:36.616 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:36.874 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:36.874 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:36.874 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:36.874 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:36.874 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:36.874 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:36.874 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:37.132 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:37.132 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:37.132 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:37.132 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:37.132 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:37.132 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:37.132 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:37.132 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:37.132 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:37.132 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:37.132 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:37.132 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:37.132 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:37.132 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:37.132 00:19:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:37.389 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:37.389 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:37.389 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:37.389 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:37.389 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:37.389 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:37.389 00:19:24 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:37.389 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:37.389 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:24:37.389 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:37.647 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:37.905 [2024-07-16 00:19:24.635670] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:37.905 [2024-07-16 00:19:24.635721] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:37.905 [2024-07-16 00:19:24.635742] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2510490 00:24:37.905 [2024-07-16 00:19:24.635755] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:37.905 [2024-07-16 00:19:24.637414] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:37.905 [2024-07-16 00:19:24.637444] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:37.905 [2024-07-16 00:19:24.637529] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:37.905 [2024-07-16 00:19:24.637555] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:37.905 [2024-07-16 00:19:24.637657] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:37.905 spare 00:24:37.905 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:37.905 00:19:24 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:37.905 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:37.905 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:37.905 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:37.905 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:37.905 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:37.905 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:37.905 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:37.905 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:37.905 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:37.905 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:37.905 [2024-07-16 00:19:24.737976] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x250fb60 00:24:37.905 [2024-07-16 00:19:24.738003] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:37.905 [2024-07-16 00:19:24.738192] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26bde80 00:24:37.905 [2024-07-16 00:19:24.738344] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x250fb60 00:24:37.905 [2024-07-16 00:19:24.738354] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x250fb60 00:24:37.905 [2024-07-16 00:19:24.738462] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:38.163 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:38.163 "name": "raid_bdev1", 00:24:38.163 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:38.163 "strip_size_kb": 0, 00:24:38.163 "state": "online", 00:24:38.163 "raid_level": "raid1", 00:24:38.163 "superblock": true, 00:24:38.163 "num_base_bdevs": 2, 00:24:38.163 "num_base_bdevs_discovered": 2, 00:24:38.163 "num_base_bdevs_operational": 2, 00:24:38.163 "base_bdevs_list": [ 00:24:38.163 { 00:24:38.163 "name": "spare", 00:24:38.163 "uuid": "8a260376-78be-587c-bf27-dcd896ac0ede", 00:24:38.163 "is_configured": true, 00:24:38.163 "data_offset": 2048, 00:24:38.163 "data_size": 63488 00:24:38.163 }, 00:24:38.163 { 00:24:38.163 "name": "BaseBdev2", 00:24:38.163 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:38.163 "is_configured": true, 00:24:38.163 "data_offset": 2048, 00:24:38.163 "data_size": 63488 00:24:38.163 } 00:24:38.163 ] 00:24:38.163 }' 00:24:38.163 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:38.163 00:19:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:38.730 00:19:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:38.730 00:19:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:38.730 00:19:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:38.730 00:19:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:38.730 00:19:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:38.730 00:19:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:24:38.730 00:19:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:38.987 00:19:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:38.987 "name": "raid_bdev1", 00:24:38.987 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:38.987 "strip_size_kb": 0, 00:24:38.987 "state": "online", 00:24:38.987 "raid_level": "raid1", 00:24:38.987 "superblock": true, 00:24:38.987 "num_base_bdevs": 2, 00:24:38.987 "num_base_bdevs_discovered": 2, 00:24:38.987 "num_base_bdevs_operational": 2, 00:24:38.987 "base_bdevs_list": [ 00:24:38.987 { 00:24:38.987 "name": "spare", 00:24:38.987 "uuid": "8a260376-78be-587c-bf27-dcd896ac0ede", 00:24:38.987 "is_configured": true, 00:24:38.987 "data_offset": 2048, 00:24:38.987 "data_size": 63488 00:24:38.987 }, 00:24:38.987 { 00:24:38.987 "name": "BaseBdev2", 00:24:38.988 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:38.988 "is_configured": true, 00:24:38.988 "data_offset": 2048, 00:24:38.988 "data_size": 63488 00:24:38.988 } 00:24:38.988 ] 00:24:38.988 }' 00:24:38.988 00:19:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:38.988 00:19:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:38.988 00:19:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:38.988 00:19:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:38.988 00:19:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:38.988 00:19:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:39.246 00:19:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ 
spare == \s\p\a\r\e ]] 00:24:39.246 00:19:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:39.504 [2024-07-16 00:19:26.344529] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:39.504 00:19:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:39.504 00:19:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:39.504 00:19:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:39.504 00:19:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:39.504 00:19:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:39.504 00:19:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:39.504 00:19:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:39.504 00:19:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:39.504 00:19:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:39.504 00:19:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:39.504 00:19:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.504 00:19:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.764 00:19:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:39.764 "name": "raid_bdev1", 00:24:39.764 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 
00:24:39.764 "strip_size_kb": 0, 00:24:39.764 "state": "online", 00:24:39.764 "raid_level": "raid1", 00:24:39.764 "superblock": true, 00:24:39.764 "num_base_bdevs": 2, 00:24:39.764 "num_base_bdevs_discovered": 1, 00:24:39.764 "num_base_bdevs_operational": 1, 00:24:39.764 "base_bdevs_list": [ 00:24:39.764 { 00:24:39.764 "name": null, 00:24:39.764 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:39.764 "is_configured": false, 00:24:39.764 "data_offset": 2048, 00:24:39.764 "data_size": 63488 00:24:39.764 }, 00:24:39.764 { 00:24:39.764 "name": "BaseBdev2", 00:24:39.764 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:39.764 "is_configured": true, 00:24:39.764 "data_offset": 2048, 00:24:39.764 "data_size": 63488 00:24:39.764 } 00:24:39.764 ] 00:24:39.764 }' 00:24:39.764 00:19:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:39.764 00:19:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:40.331 00:19:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:40.589 [2024-07-16 00:19:27.439594] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:40.589 [2024-07-16 00:19:27.439751] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:40.589 [2024-07-16 00:19:27.439767] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:24:40.589 [2024-07-16 00:19:27.439796] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:40.589 [2024-07-16 00:19:27.445066] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26c33d0 00:24:40.589 [2024-07-16 00:19:27.447385] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:40.589 00:19:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:41.524 00:19:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:41.524 00:19:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:41.524 00:19:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:41.524 00:19:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:41.524 00:19:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:41.524 00:19:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.782 00:19:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.782 00:19:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:41.782 "name": "raid_bdev1", 00:24:41.782 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:41.782 "strip_size_kb": 0, 00:24:41.782 "state": "online", 00:24:41.782 "raid_level": "raid1", 00:24:41.782 "superblock": true, 00:24:41.782 "num_base_bdevs": 2, 00:24:41.782 "num_base_bdevs_discovered": 2, 00:24:41.782 "num_base_bdevs_operational": 2, 00:24:41.782 "process": { 00:24:41.782 "type": "rebuild", 00:24:41.782 "target": "spare", 00:24:41.782 "progress": { 00:24:41.782 "blocks": 24576, 
00:24:41.782 "percent": 38 00:24:41.782 } 00:24:41.782 }, 00:24:41.782 "base_bdevs_list": [ 00:24:41.782 { 00:24:41.782 "name": "spare", 00:24:41.782 "uuid": "8a260376-78be-587c-bf27-dcd896ac0ede", 00:24:41.782 "is_configured": true, 00:24:41.782 "data_offset": 2048, 00:24:41.782 "data_size": 63488 00:24:41.782 }, 00:24:41.782 { 00:24:41.782 "name": "BaseBdev2", 00:24:41.782 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:41.782 "is_configured": true, 00:24:41.782 "data_offset": 2048, 00:24:41.782 "data_size": 63488 00:24:41.782 } 00:24:41.782 ] 00:24:41.782 }' 00:24:41.782 00:19:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:42.040 00:19:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:42.040 00:19:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:42.040 00:19:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:42.040 00:19:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:42.297 [2024-07-16 00:19:29.043977] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:42.297 [2024-07-16 00:19:29.059944] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:42.297 [2024-07-16 00:19:29.059988] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:42.297 [2024-07-16 00:19:29.060003] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:42.297 [2024-07-16 00:19:29.060011] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:42.297 00:19:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 1 00:24:42.297 00:19:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:42.297 00:19:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:42.297 00:19:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:42.297 00:19:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:42.297 00:19:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:42.297 00:19:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:42.297 00:19:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:42.297 00:19:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:42.297 00:19:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:42.297 00:19:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:42.297 00:19:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:42.555 00:19:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:42.555 "name": "raid_bdev1", 00:24:42.555 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:42.555 "strip_size_kb": 0, 00:24:42.555 "state": "online", 00:24:42.555 "raid_level": "raid1", 00:24:42.555 "superblock": true, 00:24:42.555 "num_base_bdevs": 2, 00:24:42.555 "num_base_bdevs_discovered": 1, 00:24:42.555 "num_base_bdevs_operational": 1, 00:24:42.555 "base_bdevs_list": [ 00:24:42.555 { 00:24:42.555 "name": null, 00:24:42.555 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:42.555 "is_configured": false, 00:24:42.555 
"data_offset": 2048, 00:24:42.555 "data_size": 63488 00:24:42.555 }, 00:24:42.555 { 00:24:42.555 "name": "BaseBdev2", 00:24:42.555 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:42.555 "is_configured": true, 00:24:42.555 "data_offset": 2048, 00:24:42.555 "data_size": 63488 00:24:42.555 } 00:24:42.555 ] 00:24:42.555 }' 00:24:42.555 00:19:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:42.555 00:19:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:43.122 00:19:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:43.405 [2024-07-16 00:19:30.168397] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:43.406 [2024-07-16 00:19:30.168455] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:43.406 [2024-07-16 00:19:30.168482] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26baf80 00:24:43.406 [2024-07-16 00:19:30.168495] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:43.406 [2024-07-16 00:19:30.168879] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:43.406 [2024-07-16 00:19:30.168896] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:43.406 [2024-07-16 00:19:30.168976] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:43.406 [2024-07-16 00:19:30.168994] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:43.406 [2024-07-16 00:19:30.169005] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:24:43.406 [2024-07-16 00:19:30.169024] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:43.406 [2024-07-16 00:19:30.174342] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26bb210 00:24:43.406 spare 00:24:43.406 [2024-07-16 00:19:30.175798] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:43.406 00:19:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:44.384 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:44.384 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:44.384 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:44.384 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:44.384 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:44.384 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.384 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:44.643 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:44.643 "name": "raid_bdev1", 00:24:44.643 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:44.643 "strip_size_kb": 0, 00:24:44.643 "state": "online", 00:24:44.643 "raid_level": "raid1", 00:24:44.643 "superblock": true, 00:24:44.643 "num_base_bdevs": 2, 00:24:44.643 "num_base_bdevs_discovered": 2, 00:24:44.643 "num_base_bdevs_operational": 2, 00:24:44.643 "process": { 00:24:44.643 "type": "rebuild", 00:24:44.643 "target": "spare", 00:24:44.643 "progress": { 00:24:44.643 
"blocks": 24576, 00:24:44.643 "percent": 38 00:24:44.643 } 00:24:44.643 }, 00:24:44.643 "base_bdevs_list": [ 00:24:44.643 { 00:24:44.643 "name": "spare", 00:24:44.643 "uuid": "8a260376-78be-587c-bf27-dcd896ac0ede", 00:24:44.643 "is_configured": true, 00:24:44.643 "data_offset": 2048, 00:24:44.643 "data_size": 63488 00:24:44.643 }, 00:24:44.643 { 00:24:44.643 "name": "BaseBdev2", 00:24:44.643 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:44.643 "is_configured": true, 00:24:44.643 "data_offset": 2048, 00:24:44.643 "data_size": 63488 00:24:44.643 } 00:24:44.643 ] 00:24:44.643 }' 00:24:44.643 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:44.643 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:44.643 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:44.643 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:44.643 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:44.901 [2024-07-16 00:19:31.771596] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:44.901 [2024-07-16 00:19:31.788475] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:44.901 [2024-07-16 00:19:31.788519] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:44.901 [2024-07-16 00:19:31.788534] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:44.901 [2024-07-16 00:19:31.788542] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:44.901 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:24:44.901 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:44.901 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:44.901 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:44.901 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:44.901 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:44.901 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:44.902 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:44.902 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:44.902 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:44.902 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.902 00:19:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.160 00:19:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:45.160 "name": "raid_bdev1", 00:24:45.160 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:45.160 "strip_size_kb": 0, 00:24:45.160 "state": "online", 00:24:45.160 "raid_level": "raid1", 00:24:45.160 "superblock": true, 00:24:45.160 "num_base_bdevs": 2, 00:24:45.160 "num_base_bdevs_discovered": 1, 00:24:45.160 "num_base_bdevs_operational": 1, 00:24:45.160 "base_bdevs_list": [ 00:24:45.160 { 00:24:45.160 "name": null, 00:24:45.160 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:45.160 "is_configured": false, 00:24:45.160 
"data_offset": 2048, 00:24:45.160 "data_size": 63488 00:24:45.160 }, 00:24:45.160 { 00:24:45.160 "name": "BaseBdev2", 00:24:45.160 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:45.160 "is_configured": true, 00:24:45.160 "data_offset": 2048, 00:24:45.160 "data_size": 63488 00:24:45.160 } 00:24:45.160 ] 00:24:45.160 }' 00:24:45.160 00:19:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:45.160 00:19:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:45.726 00:19:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:45.726 00:19:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:45.726 00:19:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:45.726 00:19:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:45.726 00:19:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:45.726 00:19:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:45.726 00:19:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.984 00:19:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:45.984 "name": "raid_bdev1", 00:24:45.984 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:45.984 "strip_size_kb": 0, 00:24:45.984 "state": "online", 00:24:45.984 "raid_level": "raid1", 00:24:45.984 "superblock": true, 00:24:45.984 "num_base_bdevs": 2, 00:24:45.984 "num_base_bdevs_discovered": 1, 00:24:45.984 "num_base_bdevs_operational": 1, 00:24:45.984 "base_bdevs_list": [ 00:24:45.984 { 00:24:45.984 "name": null, 00:24:45.984 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:24:45.984 "is_configured": false, 00:24:45.984 "data_offset": 2048, 00:24:45.984 "data_size": 63488 00:24:45.984 }, 00:24:45.984 { 00:24:45.984 "name": "BaseBdev2", 00:24:45.984 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:45.984 "is_configured": true, 00:24:45.984 "data_offset": 2048, 00:24:45.984 "data_size": 63488 00:24:45.984 } 00:24:45.984 ] 00:24:45.984 }' 00:24:45.984 00:19:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:46.242 00:19:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:46.242 00:19:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:46.242 00:19:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:46.242 00:19:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:46.500 00:19:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:46.757 [2024-07-16 00:19:33.461688] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:46.757 [2024-07-16 00:19:33.461737] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:46.757 [2024-07-16 00:19:33.461758] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26c0270 00:24:46.757 [2024-07-16 00:19:33.461772] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:46.757 [2024-07-16 00:19:33.462132] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:46.757 [2024-07-16 00:19:33.462150] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:46.757 [2024-07-16 00:19:33.462215] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:46.757 [2024-07-16 00:19:33.462227] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:46.757 [2024-07-16 00:19:33.462238] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:46.757 BaseBdev1 00:24:46.757 00:19:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:24:47.692 00:19:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:47.692 00:19:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:47.692 00:19:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:47.692 00:19:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:47.692 00:19:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:47.692 00:19:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:47.692 00:19:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:47.692 00:19:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:47.692 00:19:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:47.692 00:19:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:47.692 00:19:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.692 00:19:34 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:47.951 00:19:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:47.951 "name": "raid_bdev1", 00:24:47.951 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:47.951 "strip_size_kb": 0, 00:24:47.951 "state": "online", 00:24:47.951 "raid_level": "raid1", 00:24:47.951 "superblock": true, 00:24:47.951 "num_base_bdevs": 2, 00:24:47.951 "num_base_bdevs_discovered": 1, 00:24:47.951 "num_base_bdevs_operational": 1, 00:24:47.951 "base_bdevs_list": [ 00:24:47.951 { 00:24:47.951 "name": null, 00:24:47.951 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:47.951 "is_configured": false, 00:24:47.951 "data_offset": 2048, 00:24:47.951 "data_size": 63488 00:24:47.951 }, 00:24:47.951 { 00:24:47.951 "name": "BaseBdev2", 00:24:47.951 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:47.951 "is_configured": true, 00:24:47.951 "data_offset": 2048, 00:24:47.951 "data_size": 63488 00:24:47.951 } 00:24:47.951 ] 00:24:47.951 }' 00:24:47.951 00:19:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:47.951 00:19:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:48.518 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:48.518 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:48.518 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:48.518 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:48.518 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:48.518 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:48.518 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:48.777 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:48.777 "name": "raid_bdev1", 00:24:48.777 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:48.777 "strip_size_kb": 0, 00:24:48.777 "state": "online", 00:24:48.777 "raid_level": "raid1", 00:24:48.777 "superblock": true, 00:24:48.777 "num_base_bdevs": 2, 00:24:48.777 "num_base_bdevs_discovered": 1, 00:24:48.777 "num_base_bdevs_operational": 1, 00:24:48.777 "base_bdevs_list": [ 00:24:48.777 { 00:24:48.777 "name": null, 00:24:48.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:48.777 "is_configured": false, 00:24:48.777 "data_offset": 2048, 00:24:48.777 "data_size": 63488 00:24:48.777 }, 00:24:48.777 { 00:24:48.777 "name": "BaseBdev2", 00:24:48.777 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:48.777 "is_configured": true, 00:24:48.777 "data_offset": 2048, 00:24:48.777 "data_size": 63488 00:24:48.777 } 00:24:48.777 ] 00:24:48.777 }' 00:24:48.777 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:48.777 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:48.777 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:48.777 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:48.777 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:48.777 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local 
es=0 00:24:48.777 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:48.777 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:48.777 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:48.777 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:48.777 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:48.777 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:48.777 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:48.777 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:48.777 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:48.777 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:49.036 [2024-07-16 00:19:35.872423] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:49.036 [2024-07-16 00:19:35.872549] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:49.036 
[2024-07-16 00:19:35.872564] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:49.036 request: 00:24:49.036 { 00:24:49.036 "base_bdev": "BaseBdev1", 00:24:49.036 "raid_bdev": "raid_bdev1", 00:24:49.036 "method": "bdev_raid_add_base_bdev", 00:24:49.036 "req_id": 1 00:24:49.036 } 00:24:49.036 Got JSON-RPC error response 00:24:49.036 response: 00:24:49.036 { 00:24:49.036 "code": -22, 00:24:49.036 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:49.036 } 00:24:49.036 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:24:49.036 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:49.036 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:49.036 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:49.036 00:19:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:24:49.972 00:19:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:49.972 00:19:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:49.972 00:19:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:49.972 00:19:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:49.972 00:19:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:49.972 00:19:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:49.972 00:19:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:49.972 00:19:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:49.972 00:19:36 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:49.972 00:19:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:49.972 00:19:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:49.972 00:19:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:50.230 00:19:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:50.230 "name": "raid_bdev1", 00:24:50.230 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:50.230 "strip_size_kb": 0, 00:24:50.230 "state": "online", 00:24:50.230 "raid_level": "raid1", 00:24:50.230 "superblock": true, 00:24:50.230 "num_base_bdevs": 2, 00:24:50.230 "num_base_bdevs_discovered": 1, 00:24:50.230 "num_base_bdevs_operational": 1, 00:24:50.230 "base_bdevs_list": [ 00:24:50.230 { 00:24:50.230 "name": null, 00:24:50.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:50.230 "is_configured": false, 00:24:50.230 "data_offset": 2048, 00:24:50.230 "data_size": 63488 00:24:50.230 }, 00:24:50.230 { 00:24:50.230 "name": "BaseBdev2", 00:24:50.230 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:50.230 "is_configured": true, 00:24:50.230 "data_offset": 2048, 00:24:50.230 "data_size": 63488 00:24:50.230 } 00:24:50.230 ] 00:24:50.230 }' 00:24:50.230 00:19:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:50.230 00:19:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:51.166 00:19:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:51.166 00:19:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:51.166 00:19:37 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:51.166 00:19:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:51.166 00:19:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:51.166 00:19:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.166 00:19:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:51.166 00:19:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:51.166 "name": "raid_bdev1", 00:24:51.166 "uuid": "92195420-7f23-4eb4-b66d-e9fc0a288ea5", 00:24:51.166 "strip_size_kb": 0, 00:24:51.166 "state": "online", 00:24:51.166 "raid_level": "raid1", 00:24:51.166 "superblock": true, 00:24:51.166 "num_base_bdevs": 2, 00:24:51.166 "num_base_bdevs_discovered": 1, 00:24:51.166 "num_base_bdevs_operational": 1, 00:24:51.166 "base_bdevs_list": [ 00:24:51.166 { 00:24:51.166 "name": null, 00:24:51.166 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:51.166 "is_configured": false, 00:24:51.166 "data_offset": 2048, 00:24:51.166 "data_size": 63488 00:24:51.166 }, 00:24:51.166 { 00:24:51.166 "name": "BaseBdev2", 00:24:51.166 "uuid": "8abf32e3-1190-5fef-bd7a-c9c4b3159b56", 00:24:51.166 "is_configured": true, 00:24:51.166 "data_offset": 2048, 00:24:51.166 "data_size": 63488 00:24:51.166 } 00:24:51.166 ] 00:24:51.166 }' 00:24:51.166 00:19:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:51.166 00:19:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:51.166 00:19:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:51.427 00:19:38 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:51.427 00:19:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 3609251 00:24:51.427 00:19:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 3609251 ']' 00:24:51.427 00:19:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 3609251 00:24:51.427 00:19:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:24:51.427 00:19:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:51.427 00:19:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3609251 00:24:51.427 00:19:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:51.427 00:19:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:51.427 00:19:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3609251' 00:24:51.427 killing process with pid 3609251 00:24:51.427 00:19:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 3609251 00:24:51.427 Received shutdown signal, test time was about 27.205917 seconds 00:24:51.427 00:24:51.427 Latency(us) 00:24:51.427 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:51.427 =================================================================================================================== 00:24:51.427 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:51.427 [2024-07-16 00:19:38.162948] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:51.427 [2024-07-16 00:19:38.163039] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:51.427 [2024-07-16 00:19:38.163085] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev 
base bdevs is 0, going to free all in destruct 00:24:51.427 [2024-07-16 00:19:38.163097] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x250fb60 name raid_bdev1, state offline 00:24:51.427 00:19:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 3609251 00:24:51.427 [2024-07-16 00:19:38.184384] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:24:51.686 00:24:51.686 real 0m31.418s 00:24:51.686 user 0m49.132s 00:24:51.686 sys 0m4.549s 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:51.686 ************************************ 00:24:51.686 END TEST raid_rebuild_test_sb_io 00:24:51.686 ************************************ 00:24:51.686 00:19:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:51.686 00:19:38 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:24:51.686 00:19:38 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:24:51.686 00:19:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:51.686 00:19:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:51.686 00:19:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:51.686 ************************************ 00:24:51.686 START TEST raid_rebuild_test 00:24:51.686 ************************************ 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false false true 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:24:51.686 00:19:38 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 
00:24:51.686 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:51.687 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:51.687 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:51.687 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:51.687 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:51.687 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:51.687 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:24:51.687 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=3613734 00:24:51.687 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 3613734 /var/tmp/spdk-raid.sock 00:24:51.687 00:19:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:51.687 00:19:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 3613734 ']' 00:24:51.687 00:19:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:51.687 00:19:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:51.687 00:19:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:51.687 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:24:51.687 00:19:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:51.687 00:19:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:51.687 [2024-07-16 00:19:38.555291] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:24:51.687 [2024-07-16 00:19:38.555358] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3613734 ] 00:24:51.687 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:51.687 Zero copy mechanism will not be used. 00:24:51.945 [2024-07-16 00:19:38.686137] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:51.946 [2024-07-16 00:19:38.792179] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:51.946 [2024-07-16 00:19:38.858761] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:51.946 [2024-07-16 00:19:38.858800] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:52.885 00:19:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:52.885 00:19:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:24:52.885 00:19:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:52.885 00:19:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:52.885 BaseBdev1_malloc 00:24:52.885 00:19:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:53.143 [2024-07-16 
00:19:39.957992] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:53.143 [2024-07-16 00:19:39.958035] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:53.143 [2024-07-16 00:19:39.958057] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb11d40 00:24:53.143 [2024-07-16 00:19:39.958070] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:53.143 [2024-07-16 00:19:39.959682] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:53.143 [2024-07-16 00:19:39.959711] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:53.143 BaseBdev1 00:24:53.143 00:19:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:53.143 00:19:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:53.402 BaseBdev2_malloc 00:24:53.402 00:19:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:53.661 [2024-07-16 00:19:40.464240] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:53.661 [2024-07-16 00:19:40.464284] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:53.661 [2024-07-16 00:19:40.464307] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb12860 00:24:53.661 [2024-07-16 00:19:40.464320] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:53.661 [2024-07-16 00:19:40.465850] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:53.661 [2024-07-16 00:19:40.465876] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:53.661 BaseBdev2 00:24:53.661 00:19:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:53.661 00:19:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:53.919 BaseBdev3_malloc 00:24:53.919 00:19:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:24:54.177 [2024-07-16 00:19:40.958510] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:24:54.177 [2024-07-16 00:19:40.958557] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:54.177 [2024-07-16 00:19:40.958578] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcbf8f0 00:24:54.177 [2024-07-16 00:19:40.958591] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:54.177 [2024-07-16 00:19:40.960167] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:54.177 [2024-07-16 00:19:40.960211] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:54.177 BaseBdev3 00:24:54.177 00:19:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:54.177 00:19:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:54.435 BaseBdev4_malloc 00:24:54.435 00:19:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc 
-p BaseBdev4 00:24:54.693 [2024-07-16 00:19:41.441600] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:24:54.693 [2024-07-16 00:19:41.441645] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:54.693 [2024-07-16 00:19:41.441666] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcbead0 00:24:54.693 [2024-07-16 00:19:41.441678] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:54.693 [2024-07-16 00:19:41.443234] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:54.693 [2024-07-16 00:19:41.443261] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:24:54.693 BaseBdev4 00:24:54.693 00:19:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:54.951 spare_malloc 00:24:54.951 00:19:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:55.209 spare_delay 00:24:55.209 00:19:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:55.466 [2024-07-16 00:19:42.173382] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:55.466 [2024-07-16 00:19:42.173427] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:55.466 [2024-07-16 00:19:42.173447] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcc35b0 00:24:55.466 [2024-07-16 00:19:42.173460] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:55.466 
[2024-07-16 00:19:42.175015] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:55.466 [2024-07-16 00:19:42.175043] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:55.466 spare 00:24:55.466 00:19:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:24:55.723 [2024-07-16 00:19:42.418051] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:55.723 [2024-07-16 00:19:42.419379] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:55.723 [2024-07-16 00:19:42.419434] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:55.723 [2024-07-16 00:19:42.419479] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:55.723 [2024-07-16 00:19:42.419559] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc428a0 00:24:55.724 [2024-07-16 00:19:42.419569] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:55.724 [2024-07-16 00:19:42.419781] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcbce10 00:24:55.724 [2024-07-16 00:19:42.419937] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc428a0 00:24:55.724 [2024-07-16 00:19:42.419948] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc428a0 00:24:55.724 [2024-07-16 00:19:42.420062] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:55.724 00:19:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:55.724 00:19:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:24:55.724 00:19:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:55.724 00:19:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:55.724 00:19:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:55.724 00:19:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:55.724 00:19:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:55.724 00:19:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:55.724 00:19:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:55.724 00:19:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:55.724 00:19:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:55.724 00:19:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:55.980 00:19:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:55.980 "name": "raid_bdev1", 00:24:55.980 "uuid": "63b48633-0928-4854-b167-cae506627f25", 00:24:55.980 "strip_size_kb": 0, 00:24:55.980 "state": "online", 00:24:55.980 "raid_level": "raid1", 00:24:55.980 "superblock": false, 00:24:55.980 "num_base_bdevs": 4, 00:24:55.980 "num_base_bdevs_discovered": 4, 00:24:55.980 "num_base_bdevs_operational": 4, 00:24:55.980 "base_bdevs_list": [ 00:24:55.980 { 00:24:55.980 "name": "BaseBdev1", 00:24:55.980 "uuid": "f6b9ed5c-c4ad-5f88-a16d-541105eb1163", 00:24:55.980 "is_configured": true, 00:24:55.980 "data_offset": 0, 00:24:55.980 "data_size": 65536 00:24:55.980 }, 00:24:55.980 { 00:24:55.980 "name": "BaseBdev2", 00:24:55.980 "uuid": "fe4d7852-fb77-595a-b09f-661231722922", 00:24:55.980 "is_configured": 
true, 00:24:55.980 "data_offset": 0, 00:24:55.980 "data_size": 65536 00:24:55.980 }, 00:24:55.980 { 00:24:55.980 "name": "BaseBdev3", 00:24:55.980 "uuid": "fcb5b800-d0cb-500a-8ddc-75d3e85cbbbf", 00:24:55.980 "is_configured": true, 00:24:55.980 "data_offset": 0, 00:24:55.980 "data_size": 65536 00:24:55.980 }, 00:24:55.980 { 00:24:55.980 "name": "BaseBdev4", 00:24:55.980 "uuid": "f7f2935d-79f2-5dd8-b808-0c5b42cb8366", 00:24:55.980 "is_configured": true, 00:24:55.980 "data_offset": 0, 00:24:55.980 "data_size": 65536 00:24:55.980 } 00:24:55.980 ] 00:24:55.980 }' 00:24:55.980 00:19:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:55.980 00:19:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:56.545 00:19:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:56.545 00:19:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:56.804 [2024-07-16 00:19:43.513213] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:56.804 00:19:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:24:56.804 00:19:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:56.804 00:19:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:57.062 00:19:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:24:57.062 00:19:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:57.062 00:19:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:57.062 00:19:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 
00:24:57.062 00:19:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:57.062 00:19:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:57.062 00:19:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:57.062 00:19:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:57.062 00:19:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:57.062 00:19:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:57.062 00:19:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:24:57.062 00:19:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:57.062 00:19:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:57.062 00:19:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:57.063 [2024-07-16 00:19:44.006278] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcbce10 00:24:57.321 /dev/nbd0 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 
/proc/partitions 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:57.321 1+0 records in 00:24:57.321 1+0 records out 00:24:57.321 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000249939 s, 16.4 MB/s 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:57.321 00:19:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:25:05.489 65536+0 records in 00:25:05.489 65536+0 records out 00:25:05.489 33554432 bytes (34 MB, 32 MiB) copied, 8.15469 s, 4.1 MB/s 00:25:05.489 00:19:52 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:05.489 00:19:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:05.489 00:19:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:05.489 00:19:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:05.489 00:19:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:25:05.489 00:19:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:05.489 00:19:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:05.489 [2024-07-16 00:19:52.425374] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:05.489 00:19:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:05.489 00:19:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:05.489 00:19:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:05.489 00:19:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:05.489 00:19:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:05.489 00:19:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:05.489 00:19:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:05.489 00:19:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:05.747 00:19:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:05.747 [2024-07-16 00:19:52.589857] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:05.747 
00:19:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:05.747 00:19:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:05.747 00:19:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:05.747 00:19:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:05.747 00:19:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:05.747 00:19:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:05.747 00:19:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:05.747 00:19:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:05.747 00:19:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:05.747 00:19:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:05.747 00:19:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:05.747 00:19:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:06.005 00:19:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:06.005 "name": "raid_bdev1", 00:25:06.005 "uuid": "63b48633-0928-4854-b167-cae506627f25", 00:25:06.005 "strip_size_kb": 0, 00:25:06.005 "state": "online", 00:25:06.005 "raid_level": "raid1", 00:25:06.005 "superblock": false, 00:25:06.005 "num_base_bdevs": 4, 00:25:06.005 "num_base_bdevs_discovered": 3, 00:25:06.005 "num_base_bdevs_operational": 3, 00:25:06.005 "base_bdevs_list": [ 00:25:06.005 { 00:25:06.005 "name": null, 00:25:06.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:06.005 "is_configured": 
false, 00:25:06.005 "data_offset": 0, 00:25:06.005 "data_size": 65536 00:25:06.005 }, 00:25:06.005 { 00:25:06.005 "name": "BaseBdev2", 00:25:06.005 "uuid": "fe4d7852-fb77-595a-b09f-661231722922", 00:25:06.005 "is_configured": true, 00:25:06.005 "data_offset": 0, 00:25:06.005 "data_size": 65536 00:25:06.005 }, 00:25:06.005 { 00:25:06.005 "name": "BaseBdev3", 00:25:06.005 "uuid": "fcb5b800-d0cb-500a-8ddc-75d3e85cbbbf", 00:25:06.005 "is_configured": true, 00:25:06.005 "data_offset": 0, 00:25:06.005 "data_size": 65536 00:25:06.005 }, 00:25:06.005 { 00:25:06.005 "name": "BaseBdev4", 00:25:06.005 "uuid": "f7f2935d-79f2-5dd8-b808-0c5b42cb8366", 00:25:06.005 "is_configured": true, 00:25:06.005 "data_offset": 0, 00:25:06.005 "data_size": 65536 00:25:06.005 } 00:25:06.005 ] 00:25:06.005 }' 00:25:06.005 00:19:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:06.005 00:19:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:06.570 00:19:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:06.828 [2024-07-16 00:19:53.708829] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:06.828 [2024-07-16 00:19:53.712920] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc486b0 00:25:06.828 [2024-07-16 00:19:53.715469] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:06.828 00:19:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:08.207 00:19:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:08.207 00:19:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:08.207 00:19:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:25:08.207 00:19:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:08.207 00:19:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:08.207 00:19:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:08.207 00:19:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:08.207 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:08.207 "name": "raid_bdev1", 00:25:08.207 "uuid": "63b48633-0928-4854-b167-cae506627f25", 00:25:08.207 "strip_size_kb": 0, 00:25:08.207 "state": "online", 00:25:08.207 "raid_level": "raid1", 00:25:08.207 "superblock": false, 00:25:08.207 "num_base_bdevs": 4, 00:25:08.207 "num_base_bdevs_discovered": 4, 00:25:08.207 "num_base_bdevs_operational": 4, 00:25:08.207 "process": { 00:25:08.207 "type": "rebuild", 00:25:08.207 "target": "spare", 00:25:08.207 "progress": { 00:25:08.207 "blocks": 24576, 00:25:08.207 "percent": 37 00:25:08.207 } 00:25:08.207 }, 00:25:08.207 "base_bdevs_list": [ 00:25:08.207 { 00:25:08.207 "name": "spare", 00:25:08.207 "uuid": "f894f823-fd07-51ac-bc93-88cd065ae5c1", 00:25:08.207 "is_configured": true, 00:25:08.207 "data_offset": 0, 00:25:08.207 "data_size": 65536 00:25:08.207 }, 00:25:08.207 { 00:25:08.207 "name": "BaseBdev2", 00:25:08.207 "uuid": "fe4d7852-fb77-595a-b09f-661231722922", 00:25:08.207 "is_configured": true, 00:25:08.207 "data_offset": 0, 00:25:08.207 "data_size": 65536 00:25:08.207 }, 00:25:08.207 { 00:25:08.207 "name": "BaseBdev3", 00:25:08.207 "uuid": "fcb5b800-d0cb-500a-8ddc-75d3e85cbbbf", 00:25:08.207 "is_configured": true, 00:25:08.207 "data_offset": 0, 00:25:08.207 "data_size": 65536 00:25:08.207 }, 00:25:08.207 { 00:25:08.207 "name": "BaseBdev4", 00:25:08.207 "uuid": 
"f7f2935d-79f2-5dd8-b808-0c5b42cb8366", 00:25:08.207 "is_configured": true, 00:25:08.207 "data_offset": 0, 00:25:08.207 "data_size": 65536 00:25:08.207 } 00:25:08.207 ] 00:25:08.207 }' 00:25:08.207 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:08.207 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:08.207 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:08.207 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:08.207 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:08.467 [2024-07-16 00:19:55.321826] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:08.467 [2024-07-16 00:19:55.328203] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:08.467 [2024-07-16 00:19:55.328245] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:08.467 [2024-07-16 00:19:55.328262] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:08.467 [2024-07-16 00:19:55.328271] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:08.467 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:08.467 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:08.467 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:08.467 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:08.467 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:25:08.467 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:08.467 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:08.467 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:08.467 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:08.467 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:08.467 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:08.467 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:09.036 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:09.036 "name": "raid_bdev1", 00:25:09.036 "uuid": "63b48633-0928-4854-b167-cae506627f25", 00:25:09.036 "strip_size_kb": 0, 00:25:09.036 "state": "online", 00:25:09.036 "raid_level": "raid1", 00:25:09.036 "superblock": false, 00:25:09.036 "num_base_bdevs": 4, 00:25:09.036 "num_base_bdevs_discovered": 3, 00:25:09.036 "num_base_bdevs_operational": 3, 00:25:09.036 "base_bdevs_list": [ 00:25:09.036 { 00:25:09.036 "name": null, 00:25:09.036 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:09.036 "is_configured": false, 00:25:09.036 "data_offset": 0, 00:25:09.036 "data_size": 65536 00:25:09.036 }, 00:25:09.036 { 00:25:09.036 "name": "BaseBdev2", 00:25:09.036 "uuid": "fe4d7852-fb77-595a-b09f-661231722922", 00:25:09.036 "is_configured": true, 00:25:09.036 "data_offset": 0, 00:25:09.036 "data_size": 65536 00:25:09.036 }, 00:25:09.036 { 00:25:09.036 "name": "BaseBdev3", 00:25:09.036 "uuid": "fcb5b800-d0cb-500a-8ddc-75d3e85cbbbf", 00:25:09.036 "is_configured": true, 00:25:09.036 "data_offset": 0, 00:25:09.036 "data_size": 65536 
00:25:09.036 }, 00:25:09.036 { 00:25:09.036 "name": "BaseBdev4", 00:25:09.036 "uuid": "f7f2935d-79f2-5dd8-b808-0c5b42cb8366", 00:25:09.036 "is_configured": true, 00:25:09.036 "data_offset": 0, 00:25:09.036 "data_size": 65536 00:25:09.036 } 00:25:09.036 ] 00:25:09.036 }' 00:25:09.036 00:19:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:09.036 00:19:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:09.604 00:19:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:09.604 00:19:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:09.604 00:19:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:09.604 00:19:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:09.604 00:19:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:09.604 00:19:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:09.604 00:19:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.871 00:19:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:09.871 "name": "raid_bdev1", 00:25:09.871 "uuid": "63b48633-0928-4854-b167-cae506627f25", 00:25:09.871 "strip_size_kb": 0, 00:25:09.871 "state": "online", 00:25:09.871 "raid_level": "raid1", 00:25:09.871 "superblock": false, 00:25:09.871 "num_base_bdevs": 4, 00:25:09.871 "num_base_bdevs_discovered": 3, 00:25:09.871 "num_base_bdevs_operational": 3, 00:25:09.871 "base_bdevs_list": [ 00:25:09.871 { 00:25:09.871 "name": null, 00:25:09.871 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:09.871 "is_configured": false, 00:25:09.871 "data_offset": 0, 00:25:09.871 
"data_size": 65536 00:25:09.871 }, 00:25:09.871 { 00:25:09.871 "name": "BaseBdev2", 00:25:09.871 "uuid": "fe4d7852-fb77-595a-b09f-661231722922", 00:25:09.871 "is_configured": true, 00:25:09.871 "data_offset": 0, 00:25:09.871 "data_size": 65536 00:25:09.871 }, 00:25:09.871 { 00:25:09.871 "name": "BaseBdev3", 00:25:09.871 "uuid": "fcb5b800-d0cb-500a-8ddc-75d3e85cbbbf", 00:25:09.871 "is_configured": true, 00:25:09.871 "data_offset": 0, 00:25:09.871 "data_size": 65536 00:25:09.871 }, 00:25:09.871 { 00:25:09.871 "name": "BaseBdev4", 00:25:09.871 "uuid": "f7f2935d-79f2-5dd8-b808-0c5b42cb8366", 00:25:09.871 "is_configured": true, 00:25:09.871 "data_offset": 0, 00:25:09.871 "data_size": 65536 00:25:09.871 } 00:25:09.871 ] 00:25:09.871 }' 00:25:09.871 00:19:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:09.871 00:19:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:09.871 00:19:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:09.871 00:19:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:09.871 00:19:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:10.130 [2024-07-16 00:19:57.025311] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:10.130 [2024-07-16 00:19:57.029474] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc486b0 00:25:10.130 [2024-07-16 00:19:57.030987] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:10.130 00:19:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:11.507 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:11.507 
00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:11.507 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:11.507 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:11.507 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:11.507 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:11.507 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:11.507 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:11.507 "name": "raid_bdev1", 00:25:11.507 "uuid": "63b48633-0928-4854-b167-cae506627f25", 00:25:11.507 "strip_size_kb": 0, 00:25:11.507 "state": "online", 00:25:11.507 "raid_level": "raid1", 00:25:11.507 "superblock": false, 00:25:11.507 "num_base_bdevs": 4, 00:25:11.507 "num_base_bdevs_discovered": 4, 00:25:11.507 "num_base_bdevs_operational": 4, 00:25:11.507 "process": { 00:25:11.507 "type": "rebuild", 00:25:11.507 "target": "spare", 00:25:11.507 "progress": { 00:25:11.507 "blocks": 24576, 00:25:11.507 "percent": 37 00:25:11.507 } 00:25:11.507 }, 00:25:11.507 "base_bdevs_list": [ 00:25:11.507 { 00:25:11.507 "name": "spare", 00:25:11.507 "uuid": "f894f823-fd07-51ac-bc93-88cd065ae5c1", 00:25:11.507 "is_configured": true, 00:25:11.507 "data_offset": 0, 00:25:11.507 "data_size": 65536 00:25:11.507 }, 00:25:11.507 { 00:25:11.507 "name": "BaseBdev2", 00:25:11.507 "uuid": "fe4d7852-fb77-595a-b09f-661231722922", 00:25:11.507 "is_configured": true, 00:25:11.507 "data_offset": 0, 00:25:11.507 "data_size": 65536 00:25:11.507 }, 00:25:11.507 { 00:25:11.507 "name": "BaseBdev3", 00:25:11.507 "uuid": "fcb5b800-d0cb-500a-8ddc-75d3e85cbbbf", 00:25:11.507 
"is_configured": true, 00:25:11.507 "data_offset": 0, 00:25:11.507 "data_size": 65536 00:25:11.507 }, 00:25:11.507 { 00:25:11.507 "name": "BaseBdev4", 00:25:11.507 "uuid": "f7f2935d-79f2-5dd8-b808-0c5b42cb8366", 00:25:11.507 "is_configured": true, 00:25:11.507 "data_offset": 0, 00:25:11.507 "data_size": 65536 00:25:11.507 } 00:25:11.507 ] 00:25:11.507 }' 00:25:11.507 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:11.507 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:11.507 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:11.507 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:11.507 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:25:11.507 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:11.507 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:11.507 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:11.507 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:11.765 [2024-07-16 00:19:58.611065] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:11.765 [2024-07-16 00:19:58.643567] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xc486b0 00:25:11.765 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:11.765 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:25:11.765 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:25:11.765 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:11.765 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:11.765 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:11.765 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:11.765 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:11.765 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.023 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:12.023 "name": "raid_bdev1", 00:25:12.023 "uuid": "63b48633-0928-4854-b167-cae506627f25", 00:25:12.023 "strip_size_kb": 0, 00:25:12.023 "state": "online", 00:25:12.023 "raid_level": "raid1", 00:25:12.023 "superblock": false, 00:25:12.023 "num_base_bdevs": 4, 00:25:12.023 "num_base_bdevs_discovered": 3, 00:25:12.023 "num_base_bdevs_operational": 3, 00:25:12.023 "process": { 00:25:12.023 "type": "rebuild", 00:25:12.023 "target": "spare", 00:25:12.023 "progress": { 00:25:12.023 "blocks": 36864, 00:25:12.023 "percent": 56 00:25:12.023 } 00:25:12.023 }, 00:25:12.023 "base_bdevs_list": [ 00:25:12.023 { 00:25:12.023 "name": "spare", 00:25:12.023 "uuid": "f894f823-fd07-51ac-bc93-88cd065ae5c1", 00:25:12.023 "is_configured": true, 00:25:12.023 "data_offset": 0, 00:25:12.023 "data_size": 65536 00:25:12.023 }, 00:25:12.023 { 00:25:12.023 "name": null, 00:25:12.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:12.023 "is_configured": false, 00:25:12.023 "data_offset": 0, 00:25:12.023 "data_size": 65536 00:25:12.023 }, 00:25:12.023 { 00:25:12.023 "name": "BaseBdev3", 00:25:12.023 "uuid": "fcb5b800-d0cb-500a-8ddc-75d3e85cbbbf", 00:25:12.023 
"is_configured": true, 00:25:12.023 "data_offset": 0, 00:25:12.023 "data_size": 65536 00:25:12.023 }, 00:25:12.023 { 00:25:12.023 "name": "BaseBdev4", 00:25:12.023 "uuid": "f7f2935d-79f2-5dd8-b808-0c5b42cb8366", 00:25:12.023 "is_configured": true, 00:25:12.023 "data_offset": 0, 00:25:12.023 "data_size": 65536 00:25:12.023 } 00:25:12.023 ] 00:25:12.023 }' 00:25:12.023 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:12.023 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:12.023 00:19:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:12.280 00:19:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:12.280 00:19:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=909 00:25:12.280 00:19:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:12.280 00:19:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:12.280 00:19:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:12.280 00:19:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:12.280 00:19:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:12.280 00:19:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:12.280 00:19:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.280 00:19:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.538 00:19:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:12.538 "name": 
"raid_bdev1", 00:25:12.538 "uuid": "63b48633-0928-4854-b167-cae506627f25", 00:25:12.538 "strip_size_kb": 0, 00:25:12.538 "state": "online", 00:25:12.538 "raid_level": "raid1", 00:25:12.538 "superblock": false, 00:25:12.538 "num_base_bdevs": 4, 00:25:12.538 "num_base_bdevs_discovered": 3, 00:25:12.538 "num_base_bdevs_operational": 3, 00:25:12.538 "process": { 00:25:12.538 "type": "rebuild", 00:25:12.538 "target": "spare", 00:25:12.538 "progress": { 00:25:12.538 "blocks": 43008, 00:25:12.538 "percent": 65 00:25:12.538 } 00:25:12.538 }, 00:25:12.538 "base_bdevs_list": [ 00:25:12.538 { 00:25:12.538 "name": "spare", 00:25:12.538 "uuid": "f894f823-fd07-51ac-bc93-88cd065ae5c1", 00:25:12.538 "is_configured": true, 00:25:12.538 "data_offset": 0, 00:25:12.538 "data_size": 65536 00:25:12.538 }, 00:25:12.538 { 00:25:12.538 "name": null, 00:25:12.538 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:12.538 "is_configured": false, 00:25:12.538 "data_offset": 0, 00:25:12.538 "data_size": 65536 00:25:12.538 }, 00:25:12.538 { 00:25:12.538 "name": "BaseBdev3", 00:25:12.538 "uuid": "fcb5b800-d0cb-500a-8ddc-75d3e85cbbbf", 00:25:12.538 "is_configured": true, 00:25:12.538 "data_offset": 0, 00:25:12.538 "data_size": 65536 00:25:12.538 }, 00:25:12.538 { 00:25:12.538 "name": "BaseBdev4", 00:25:12.538 "uuid": "f7f2935d-79f2-5dd8-b808-0c5b42cb8366", 00:25:12.538 "is_configured": true, 00:25:12.538 "data_offset": 0, 00:25:12.538 "data_size": 65536 00:25:12.538 } 00:25:12.538 ] 00:25:12.538 }' 00:25:12.538 00:19:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:12.539 00:19:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:12.539 00:19:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:12.539 00:19:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:12.539 00:19:59 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@710 -- # sleep 1 00:25:13.471 [2024-07-16 00:20:00.256324] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:13.471 [2024-07-16 00:20:00.256391] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:13.471 [2024-07-16 00:20:00.256430] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:13.471 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:13.471 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:13.471 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:13.471 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:13.471 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:13.471 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:13.471 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.471 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.730 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:13.730 "name": "raid_bdev1", 00:25:13.730 "uuid": "63b48633-0928-4854-b167-cae506627f25", 00:25:13.730 "strip_size_kb": 0, 00:25:13.730 "state": "online", 00:25:13.730 "raid_level": "raid1", 00:25:13.730 "superblock": false, 00:25:13.730 "num_base_bdevs": 4, 00:25:13.730 "num_base_bdevs_discovered": 3, 00:25:13.730 "num_base_bdevs_operational": 3, 00:25:13.730 "base_bdevs_list": [ 00:25:13.730 { 00:25:13.730 "name": "spare", 00:25:13.730 "uuid": "f894f823-fd07-51ac-bc93-88cd065ae5c1", 00:25:13.730 
"is_configured": true, 00:25:13.730 "data_offset": 0, 00:25:13.730 "data_size": 65536 00:25:13.730 }, 00:25:13.730 { 00:25:13.730 "name": null, 00:25:13.730 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:13.730 "is_configured": false, 00:25:13.730 "data_offset": 0, 00:25:13.730 "data_size": 65536 00:25:13.730 }, 00:25:13.730 { 00:25:13.730 "name": "BaseBdev3", 00:25:13.730 "uuid": "fcb5b800-d0cb-500a-8ddc-75d3e85cbbbf", 00:25:13.730 "is_configured": true, 00:25:13.730 "data_offset": 0, 00:25:13.730 "data_size": 65536 00:25:13.730 }, 00:25:13.730 { 00:25:13.730 "name": "BaseBdev4", 00:25:13.730 "uuid": "f7f2935d-79f2-5dd8-b808-0c5b42cb8366", 00:25:13.730 "is_configured": true, 00:25:13.730 "data_offset": 0, 00:25:13.730 "data_size": 65536 00:25:13.730 } 00:25:13.730 ] 00:25:13.730 }' 00:25:13.730 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:13.730 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:13.730 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:13.988 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:13.988 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:25:13.988 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:13.988 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:13.988 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:13.988 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:13.988 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:13.988 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.988 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:14.247 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:14.247 "name": "raid_bdev1", 00:25:14.247 "uuid": "63b48633-0928-4854-b167-cae506627f25", 00:25:14.247 "strip_size_kb": 0, 00:25:14.247 "state": "online", 00:25:14.247 "raid_level": "raid1", 00:25:14.247 "superblock": false, 00:25:14.247 "num_base_bdevs": 4, 00:25:14.247 "num_base_bdevs_discovered": 3, 00:25:14.247 "num_base_bdevs_operational": 3, 00:25:14.247 "base_bdevs_list": [ 00:25:14.247 { 00:25:14.247 "name": "spare", 00:25:14.247 "uuid": "f894f823-fd07-51ac-bc93-88cd065ae5c1", 00:25:14.247 "is_configured": true, 00:25:14.247 "data_offset": 0, 00:25:14.247 "data_size": 65536 00:25:14.247 }, 00:25:14.247 { 00:25:14.247 "name": null, 00:25:14.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:14.247 "is_configured": false, 00:25:14.247 "data_offset": 0, 00:25:14.247 "data_size": 65536 00:25:14.247 }, 00:25:14.247 { 00:25:14.247 "name": "BaseBdev3", 00:25:14.247 "uuid": "fcb5b800-d0cb-500a-8ddc-75d3e85cbbbf", 00:25:14.247 "is_configured": true, 00:25:14.247 "data_offset": 0, 00:25:14.247 "data_size": 65536 00:25:14.247 }, 00:25:14.247 { 00:25:14.247 "name": "BaseBdev4", 00:25:14.247 "uuid": "f7f2935d-79f2-5dd8-b808-0c5b42cb8366", 00:25:14.247 "is_configured": true, 00:25:14.247 "data_offset": 0, 00:25:14.247 "data_size": 65536 00:25:14.247 } 00:25:14.247 ] 00:25:14.247 }' 00:25:14.247 00:20:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:14.247 00:20:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:14.247 00:20:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:14.247 00:20:01 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:14.247 00:20:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:14.247 00:20:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:14.247 00:20:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:14.247 00:20:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:14.247 00:20:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:14.247 00:20:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:14.247 00:20:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:14.247 00:20:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:14.247 00:20:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:14.247 00:20:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:14.247 00:20:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:14.247 00:20:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:14.506 00:20:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:14.506 "name": "raid_bdev1", 00:25:14.506 "uuid": "63b48633-0928-4854-b167-cae506627f25", 00:25:14.506 "strip_size_kb": 0, 00:25:14.506 "state": "online", 00:25:14.506 "raid_level": "raid1", 00:25:14.506 "superblock": false, 00:25:14.506 "num_base_bdevs": 4, 00:25:14.506 "num_base_bdevs_discovered": 3, 00:25:14.506 "num_base_bdevs_operational": 3, 00:25:14.506 "base_bdevs_list": [ 00:25:14.506 { 00:25:14.506 "name": 
"spare", 00:25:14.506 "uuid": "f894f823-fd07-51ac-bc93-88cd065ae5c1", 00:25:14.506 "is_configured": true, 00:25:14.506 "data_offset": 0, 00:25:14.506 "data_size": 65536 00:25:14.506 }, 00:25:14.506 { 00:25:14.506 "name": null, 00:25:14.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:14.506 "is_configured": false, 00:25:14.506 "data_offset": 0, 00:25:14.506 "data_size": 65536 00:25:14.506 }, 00:25:14.506 { 00:25:14.506 "name": "BaseBdev3", 00:25:14.506 "uuid": "fcb5b800-d0cb-500a-8ddc-75d3e85cbbbf", 00:25:14.506 "is_configured": true, 00:25:14.506 "data_offset": 0, 00:25:14.506 "data_size": 65536 00:25:14.506 }, 00:25:14.506 { 00:25:14.506 "name": "BaseBdev4", 00:25:14.506 "uuid": "f7f2935d-79f2-5dd8-b808-0c5b42cb8366", 00:25:14.506 "is_configured": true, 00:25:14.506 "data_offset": 0, 00:25:14.506 "data_size": 65536 00:25:14.506 } 00:25:14.506 ] 00:25:14.506 }' 00:25:14.506 00:20:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:14.506 00:20:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:15.072 00:20:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:15.331 [2024-07-16 00:20:02.106002] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:15.331 [2024-07-16 00:20:02.106027] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:15.331 [2024-07-16 00:20:02.106085] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:15.331 [2024-07-16 00:20:02.106152] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:15.331 [2024-07-16 00:20:02.106164] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc428a0 name raid_bdev1, state offline 00:25:15.331 00:20:02 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.331 00:20:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:25:15.589 00:20:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:15.589 00:20:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:15.589 00:20:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:25:15.589 00:20:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:15.589 00:20:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:15.589 00:20:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:15.589 00:20:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:15.589 00:20:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:15.589 00:20:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:15.589 00:20:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:25:15.589 00:20:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:15.589 00:20:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:15.589 00:20:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:15.847 /dev/nbd0 00:25:15.847 00:20:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:15.847 00:20:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:15.847 00:20:02 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:15.847 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:25:15.847 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:15.847 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:15.847 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:15.847 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:25:15.847 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:15.847 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:15.847 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:15.847 1+0 records in 00:25:15.847 1+0 records out 00:25:15.847 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265284 s, 15.4 MB/s 00:25:15.847 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.847 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:25:15.847 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.847 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:15.847 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:25:15.847 00:20:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:15.847 00:20:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:15.847 00:20:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:16.105 /dev/nbd1 00:25:16.105 00:20:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:16.105 00:20:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:16.105 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:16.105 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:25:16.105 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:16.105 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:16.105 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:16.105 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:25:16.105 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:16.105 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:16.105 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:16.105 1+0 records in 00:25:16.105 1+0 records out 00:25:16.105 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000513235 s, 8.0 MB/s 00:25:16.105 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:16.105 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:25:16.105 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:16.105 00:20:02 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:16.105 00:20:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:25:16.105 00:20:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:16.105 00:20:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:16.105 00:20:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:16.105 00:20:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:16.105 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:16.105 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:16.105 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:16.105 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:25:16.105 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:16.105 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:16.363 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:16.363 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:16.363 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:16.363 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:16.363 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:16.363 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:16.363 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 
00:25:16.363 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:16.363 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:16.363 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 3613734 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 3613734 ']' 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 3613734 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3613734 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3613734' 00:25:16.927 killing process with pid 3613734 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 3613734 00:25:16.927 Received shutdown signal, test time was about 60.000000 seconds 00:25:16.927 00:25:16.927 Latency(us) 00:25:16.927 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:16.927 =================================================================================================================== 00:25:16.927 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:16.927 [2024-07-16 00:20:03.645205] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:16.927 00:20:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 3613734 00:25:16.927 [2024-07-16 00:20:03.695894] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:17.187 00:20:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:25:17.187 00:25:17.187 real 0m25.435s 00:25:17.187 user 0m33.785s 00:25:17.187 sys 0m5.783s 00:25:17.187 00:20:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:17.187 00:20:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:17.187 ************************************ 00:25:17.187 END TEST raid_rebuild_test 00:25:17.187 ************************************ 00:25:17.187 00:20:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:17.187 00:20:03 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:25:17.187 00:20:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:17.187 00:20:03 bdev_raid -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:25:17.187 00:20:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:17.187 ************************************ 00:25:17.187 START TEST raid_rebuild_test_sb 00:25:17.187 ************************************ 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=3617132 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 3617132 /var/tmp/spdk-raid.sock 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L 
bdev_raid 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3617132 ']' 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:17.187 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:17.187 00:20:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:17.187 [2024-07-16 00:20:04.083042] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:25:17.187 [2024-07-16 00:20:04.083115] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3617132 ] 00:25:17.187 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:17.187 Zero copy mechanism will not be used. 
00:25:17.446 [2024-07-16 00:20:04.213522] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:17.446 [2024-07-16 00:20:04.315315] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:17.446 [2024-07-16 00:20:04.379596] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:17.446 [2024-07-16 00:20:04.379639] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:18.379 00:20:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:18.379 00:20:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:25:18.379 00:20:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:18.379 00:20:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:18.379 BaseBdev1_malloc 00:25:18.379 00:20:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:18.637 [2024-07-16 00:20:05.489740] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:18.637 [2024-07-16 00:20:05.489792] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:18.637 [2024-07-16 00:20:05.489816] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1841d40 00:25:18.637 [2024-07-16 00:20:05.489833] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:18.637 [2024-07-16 00:20:05.491427] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:18.637 [2024-07-16 00:20:05.491457] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:18.637 BaseBdev1 
00:25:18.637 00:20:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:18.637 00:20:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:18.919 BaseBdev2_malloc 00:25:18.919 00:20:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:19.177 [2024-07-16 00:20:05.923805] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:19.177 [2024-07-16 00:20:05.923851] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:19.177 [2024-07-16 00:20:05.923874] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1842860 00:25:19.177 [2024-07-16 00:20:05.923887] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:19.177 [2024-07-16 00:20:05.925325] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:19.177 [2024-07-16 00:20:05.925354] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:19.177 BaseBdev2 00:25:19.177 00:20:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:19.177 00:20:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:19.435 BaseBdev3_malloc 00:25:19.435 00:20:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:19.692 [2024-07-16 00:20:06.417729] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:19.692 [2024-07-16 00:20:06.417774] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:19.692 [2024-07-16 00:20:06.417795] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19ef8f0 00:25:19.692 [2024-07-16 00:20:06.417808] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:19.692 [2024-07-16 00:20:06.419179] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:19.692 [2024-07-16 00:20:06.419206] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:19.692 BaseBdev3 00:25:19.692 00:20:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:19.692 00:20:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:19.949 BaseBdev4_malloc 00:25:19.949 00:20:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:20.206 [2024-07-16 00:20:06.923674] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:20.206 [2024-07-16 00:20:06.923720] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:20.206 [2024-07-16 00:20:06.923742] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19eead0 00:25:20.206 [2024-07-16 00:20:06.923755] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:20.206 [2024-07-16 00:20:06.925198] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:20.206 [2024-07-16 00:20:06.925226] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:25:20.206 BaseBdev4 00:25:20.206 00:20:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:20.206 spare_malloc 00:25:20.463 00:20:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:20.463 spare_delay 00:25:20.720 00:20:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:20.720 [2024-07-16 00:20:07.642239] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:20.720 [2024-07-16 00:20:07.642291] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:20.720 [2024-07-16 00:20:07.642312] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19f35b0 00:25:20.720 [2024-07-16 00:20:07.642324] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:20.720 [2024-07-16 00:20:07.643815] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:20.720 [2024-07-16 00:20:07.643844] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:20.720 spare 00:25:20.978 00:20:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:20.978 [2024-07-16 00:20:07.898961] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:20.978 [2024-07-16 00:20:07.900178] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:20.978 [2024-07-16 00:20:07.900232] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:20.978 [2024-07-16 00:20:07.900277] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:20.978 [2024-07-16 00:20:07.900471] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19728a0 00:25:20.978 [2024-07-16 00:20:07.900482] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:20.978 [2024-07-16 00:20:07.900679] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19ece10 00:25:20.978 [2024-07-16 00:20:07.900827] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19728a0 00:25:20.978 [2024-07-16 00:20:07.900837] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19728a0 00:25:20.978 [2024-07-16 00:20:07.900934] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:21.235 00:20:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:21.235 00:20:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:21.235 00:20:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:21.235 00:20:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:21.235 00:20:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:21.235 00:20:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:21.235 00:20:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:21.235 00:20:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:21.235 
00:20:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:21.235 00:20:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:21.235 00:20:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:21.235 00:20:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:21.235 00:20:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:21.235 "name": "raid_bdev1", 00:25:21.235 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:21.236 "strip_size_kb": 0, 00:25:21.236 "state": "online", 00:25:21.236 "raid_level": "raid1", 00:25:21.236 "superblock": true, 00:25:21.236 "num_base_bdevs": 4, 00:25:21.236 "num_base_bdevs_discovered": 4, 00:25:21.236 "num_base_bdevs_operational": 4, 00:25:21.236 "base_bdevs_list": [ 00:25:21.236 { 00:25:21.236 "name": "BaseBdev1", 00:25:21.236 "uuid": "969fa1b3-fce9-5106-b65c-bccb4f0d3587", 00:25:21.236 "is_configured": true, 00:25:21.236 "data_offset": 2048, 00:25:21.236 "data_size": 63488 00:25:21.236 }, 00:25:21.236 { 00:25:21.236 "name": "BaseBdev2", 00:25:21.236 "uuid": "b58b617f-98ca-54a0-802d-c3c7a06c589f", 00:25:21.236 "is_configured": true, 00:25:21.236 "data_offset": 2048, 00:25:21.236 "data_size": 63488 00:25:21.236 }, 00:25:21.236 { 00:25:21.236 "name": "BaseBdev3", 00:25:21.236 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:21.236 "is_configured": true, 00:25:21.236 "data_offset": 2048, 00:25:21.236 "data_size": 63488 00:25:21.236 }, 00:25:21.236 { 00:25:21.236 "name": "BaseBdev4", 00:25:21.236 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:21.236 "is_configured": true, 00:25:21.236 "data_offset": 2048, 00:25:21.236 "data_size": 63488 00:25:21.236 } 00:25:21.236 ] 00:25:21.236 }' 00:25:21.236 00:20:08 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:21.236 00:20:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:22.165 00:20:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:22.165 00:20:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:22.165 [2024-07-16 00:20:09.002141] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:22.165 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:25:22.165 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:22.165 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:22.424 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:25:22.424 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:25:22.424 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:25:22.424 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:25:22.424 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:25:22.424 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:22.424 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:25:22.424 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:22.424 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd0') 00:25:22.424 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:22.424 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:25:22.424 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:22.424 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:22.424 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:25:22.683 [2024-07-16 00:20:09.523259] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19ece10 00:25:22.683 /dev/nbd0 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:25:22.683 1+0 records in 00:25:22.683 1+0 records out 00:25:22.683 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000236786 s, 17.3 MB/s 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:25:22.683 00:20:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:25:30.845 63488+0 records in 00:25:30.845 63488+0 records out 00:25:30.845 32505856 bytes (33 MB, 31 MiB) copied, 7.16703 s, 4.5 MB/s 00:25:30.845 00:20:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:30.845 00:20:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:30.845 00:20:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:30.845 00:20:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:30.845 00:20:16 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@51 -- # local i 00:25:30.845 00:20:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:30.845 00:20:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:30.845 [2024-07-16 00:20:17.282494] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:30.845 [2024-07-16 00:20:17.531212] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:30.845 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:31.104 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:31.104 "name": "raid_bdev1", 00:25:31.104 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:31.105 "strip_size_kb": 0, 00:25:31.105 "state": "online", 00:25:31.105 "raid_level": "raid1", 00:25:31.105 "superblock": true, 00:25:31.105 "num_base_bdevs": 4, 00:25:31.105 "num_base_bdevs_discovered": 3, 00:25:31.105 "num_base_bdevs_operational": 3, 00:25:31.105 "base_bdevs_list": [ 00:25:31.105 { 00:25:31.105 "name": null, 00:25:31.105 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:31.105 "is_configured": false, 00:25:31.105 "data_offset": 2048, 00:25:31.105 "data_size": 63488 00:25:31.105 }, 00:25:31.105 { 00:25:31.105 "name": "BaseBdev2", 00:25:31.105 "uuid": "b58b617f-98ca-54a0-802d-c3c7a06c589f", 00:25:31.105 "is_configured": true, 00:25:31.105 "data_offset": 2048, 00:25:31.105 "data_size": 63488 00:25:31.105 }, 00:25:31.105 { 00:25:31.105 "name": "BaseBdev3", 
00:25:31.105 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:31.105 "is_configured": true, 00:25:31.105 "data_offset": 2048, 00:25:31.105 "data_size": 63488 00:25:31.105 }, 00:25:31.105 { 00:25:31.105 "name": "BaseBdev4", 00:25:31.105 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:31.105 "is_configured": true, 00:25:31.105 "data_offset": 2048, 00:25:31.105 "data_size": 63488 00:25:31.105 } 00:25:31.105 ] 00:25:31.105 }' 00:25:31.105 00:20:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:31.105 00:20:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:31.672 00:20:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:31.672 [2024-07-16 00:20:18.569973] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:31.672 [2024-07-16 00:20:18.574040] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19ece10 00:25:31.672 [2024-07-16 00:20:18.576402] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:31.672 00:20:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:33.051 00:20:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:33.051 00:20:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:33.051 00:20:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:33.051 00:20:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:33.051 00:20:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:33.051 00:20:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.051 00:20:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:33.051 00:20:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:33.051 "name": "raid_bdev1", 00:25:33.051 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:33.051 "strip_size_kb": 0, 00:25:33.051 "state": "online", 00:25:33.051 "raid_level": "raid1", 00:25:33.051 "superblock": true, 00:25:33.051 "num_base_bdevs": 4, 00:25:33.051 "num_base_bdevs_discovered": 4, 00:25:33.051 "num_base_bdevs_operational": 4, 00:25:33.051 "process": { 00:25:33.051 "type": "rebuild", 00:25:33.051 "target": "spare", 00:25:33.051 "progress": { 00:25:33.051 "blocks": 24576, 00:25:33.051 "percent": 38 00:25:33.051 } 00:25:33.051 }, 00:25:33.051 "base_bdevs_list": [ 00:25:33.051 { 00:25:33.051 "name": "spare", 00:25:33.051 "uuid": "bb6411c8-620d-5a97-a631-cce78c9ca561", 00:25:33.051 "is_configured": true, 00:25:33.051 "data_offset": 2048, 00:25:33.051 "data_size": 63488 00:25:33.051 }, 00:25:33.051 { 00:25:33.051 "name": "BaseBdev2", 00:25:33.051 "uuid": "b58b617f-98ca-54a0-802d-c3c7a06c589f", 00:25:33.051 "is_configured": true, 00:25:33.051 "data_offset": 2048, 00:25:33.051 "data_size": 63488 00:25:33.051 }, 00:25:33.051 { 00:25:33.051 "name": "BaseBdev3", 00:25:33.051 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:33.051 "is_configured": true, 00:25:33.051 "data_offset": 2048, 00:25:33.051 "data_size": 63488 00:25:33.051 }, 00:25:33.051 { 00:25:33.051 "name": "BaseBdev4", 00:25:33.051 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:33.051 "is_configured": true, 00:25:33.051 "data_offset": 2048, 00:25:33.051 "data_size": 63488 00:25:33.051 } 00:25:33.051 ] 00:25:33.051 }' 00:25:33.051 00:20:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:25:33.051 00:20:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:33.051 00:20:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:33.051 00:20:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:33.051 00:20:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:33.310 [2024-07-16 00:20:20.163554] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:33.310 [2024-07-16 00:20:20.189048] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:33.310 [2024-07-16 00:20:20.189093] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:33.310 [2024-07-16 00:20:20.189110] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:33.310 [2024-07-16 00:20:20.189118] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:33.310 00:20:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:33.310 00:20:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:33.310 00:20:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:33.310 00:20:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:33.310 00:20:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:33.310 00:20:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:33.310 00:20:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:33.310 00:20:20 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:33.310 00:20:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:33.310 00:20:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:33.310 00:20:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.310 00:20:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:33.570 00:20:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:33.570 "name": "raid_bdev1", 00:25:33.570 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:33.570 "strip_size_kb": 0, 00:25:33.570 "state": "online", 00:25:33.570 "raid_level": "raid1", 00:25:33.570 "superblock": true, 00:25:33.570 "num_base_bdevs": 4, 00:25:33.570 "num_base_bdevs_discovered": 3, 00:25:33.570 "num_base_bdevs_operational": 3, 00:25:33.570 "base_bdevs_list": [ 00:25:33.570 { 00:25:33.570 "name": null, 00:25:33.570 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:33.570 "is_configured": false, 00:25:33.570 "data_offset": 2048, 00:25:33.570 "data_size": 63488 00:25:33.570 }, 00:25:33.570 { 00:25:33.570 "name": "BaseBdev2", 00:25:33.570 "uuid": "b58b617f-98ca-54a0-802d-c3c7a06c589f", 00:25:33.570 "is_configured": true, 00:25:33.570 "data_offset": 2048, 00:25:33.570 "data_size": 63488 00:25:33.570 }, 00:25:33.570 { 00:25:33.570 "name": "BaseBdev3", 00:25:33.570 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:33.570 "is_configured": true, 00:25:33.570 "data_offset": 2048, 00:25:33.570 "data_size": 63488 00:25:33.570 }, 00:25:33.570 { 00:25:33.570 "name": "BaseBdev4", 00:25:33.570 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:33.570 "is_configured": true, 00:25:33.570 "data_offset": 2048, 00:25:33.570 "data_size": 63488 
00:25:33.570 } 00:25:33.570 ] 00:25:33.570 }' 00:25:33.570 00:20:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:33.570 00:20:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:34.137 00:20:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:34.137 00:20:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:34.137 00:20:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:34.137 00:20:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:34.137 00:20:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:34.137 00:20:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:34.137 00:20:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:34.397 00:20:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:34.397 "name": "raid_bdev1", 00:25:34.397 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:34.397 "strip_size_kb": 0, 00:25:34.397 "state": "online", 00:25:34.397 "raid_level": "raid1", 00:25:34.397 "superblock": true, 00:25:34.397 "num_base_bdevs": 4, 00:25:34.397 "num_base_bdevs_discovered": 3, 00:25:34.397 "num_base_bdevs_operational": 3, 00:25:34.397 "base_bdevs_list": [ 00:25:34.397 { 00:25:34.397 "name": null, 00:25:34.397 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:34.397 "is_configured": false, 00:25:34.397 "data_offset": 2048, 00:25:34.397 "data_size": 63488 00:25:34.397 }, 00:25:34.397 { 00:25:34.397 "name": "BaseBdev2", 00:25:34.397 "uuid": "b58b617f-98ca-54a0-802d-c3c7a06c589f", 00:25:34.397 "is_configured": true, 00:25:34.397 
"data_offset": 2048, 00:25:34.397 "data_size": 63488 00:25:34.397 }, 00:25:34.397 { 00:25:34.397 "name": "BaseBdev3", 00:25:34.397 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:34.397 "is_configured": true, 00:25:34.397 "data_offset": 2048, 00:25:34.397 "data_size": 63488 00:25:34.397 }, 00:25:34.397 { 00:25:34.397 "name": "BaseBdev4", 00:25:34.397 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:34.397 "is_configured": true, 00:25:34.397 "data_offset": 2048, 00:25:34.397 "data_size": 63488 00:25:34.397 } 00:25:34.397 ] 00:25:34.397 }' 00:25:34.397 00:20:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:34.397 00:20:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:34.397 00:20:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:34.656 00:20:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:34.656 00:20:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:34.915 [2024-07-16 00:20:21.608996] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:34.915 [2024-07-16 00:20:21.613640] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1972e90 00:25:34.915 [2024-07-16 00:20:21.615192] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:34.915 00:20:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:35.852 00:20:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:35.852 00:20:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:35.852 00:20:22 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:35.852 00:20:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:35.852 00:20:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:35.852 00:20:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.852 00:20:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:36.112 00:20:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:36.112 "name": "raid_bdev1", 00:25:36.112 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:36.112 "strip_size_kb": 0, 00:25:36.112 "state": "online", 00:25:36.112 "raid_level": "raid1", 00:25:36.112 "superblock": true, 00:25:36.112 "num_base_bdevs": 4, 00:25:36.112 "num_base_bdevs_discovered": 4, 00:25:36.112 "num_base_bdevs_operational": 4, 00:25:36.112 "process": { 00:25:36.112 "type": "rebuild", 00:25:36.112 "target": "spare", 00:25:36.112 "progress": { 00:25:36.112 "blocks": 24576, 00:25:36.112 "percent": 38 00:25:36.112 } 00:25:36.112 }, 00:25:36.112 "base_bdevs_list": [ 00:25:36.112 { 00:25:36.112 "name": "spare", 00:25:36.112 "uuid": "bb6411c8-620d-5a97-a631-cce78c9ca561", 00:25:36.112 "is_configured": true, 00:25:36.112 "data_offset": 2048, 00:25:36.112 "data_size": 63488 00:25:36.112 }, 00:25:36.112 { 00:25:36.112 "name": "BaseBdev2", 00:25:36.112 "uuid": "b58b617f-98ca-54a0-802d-c3c7a06c589f", 00:25:36.112 "is_configured": true, 00:25:36.112 "data_offset": 2048, 00:25:36.112 "data_size": 63488 00:25:36.112 }, 00:25:36.112 { 00:25:36.112 "name": "BaseBdev3", 00:25:36.112 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:36.112 "is_configured": true, 00:25:36.112 "data_offset": 2048, 00:25:36.112 "data_size": 63488 00:25:36.112 }, 00:25:36.112 { 00:25:36.112 "name": 
"BaseBdev4", 00:25:36.112 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:36.112 "is_configured": true, 00:25:36.112 "data_offset": 2048, 00:25:36.112 "data_size": 63488 00:25:36.112 } 00:25:36.112 ] 00:25:36.112 }' 00:25:36.112 00:20:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:36.112 00:20:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:36.112 00:20:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:36.112 00:20:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:36.112 00:20:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:36.112 00:20:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:36.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:36.112 00:20:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:36.112 00:20:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:36.112 00:20:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:36.112 00:20:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:36.371 [2024-07-16 00:20:23.187065] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:36.630 [2024-07-16 00:20:23.327890] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1972e90 00:25:36.630 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:36.630 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 
00:25:36.630 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:36.630 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:36.630 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:36.630 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:36.630 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:36.630 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.630 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:36.889 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:36.889 "name": "raid_bdev1", 00:25:36.889 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:36.889 "strip_size_kb": 0, 00:25:36.889 "state": "online", 00:25:36.889 "raid_level": "raid1", 00:25:36.889 "superblock": true, 00:25:36.889 "num_base_bdevs": 4, 00:25:36.889 "num_base_bdevs_discovered": 3, 00:25:36.889 "num_base_bdevs_operational": 3, 00:25:36.889 "process": { 00:25:36.889 "type": "rebuild", 00:25:36.889 "target": "spare", 00:25:36.889 "progress": { 00:25:36.889 "blocks": 36864, 00:25:36.889 "percent": 58 00:25:36.889 } 00:25:36.889 }, 00:25:36.889 "base_bdevs_list": [ 00:25:36.889 { 00:25:36.889 "name": "spare", 00:25:36.889 "uuid": "bb6411c8-620d-5a97-a631-cce78c9ca561", 00:25:36.889 "is_configured": true, 00:25:36.889 "data_offset": 2048, 00:25:36.889 "data_size": 63488 00:25:36.889 }, 00:25:36.889 { 00:25:36.889 "name": null, 00:25:36.889 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.889 "is_configured": false, 00:25:36.889 "data_offset": 2048, 00:25:36.889 
"data_size": 63488 00:25:36.889 }, 00:25:36.889 { 00:25:36.889 "name": "BaseBdev3", 00:25:36.889 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:36.889 "is_configured": true, 00:25:36.889 "data_offset": 2048, 00:25:36.889 "data_size": 63488 00:25:36.889 }, 00:25:36.889 { 00:25:36.889 "name": "BaseBdev4", 00:25:36.889 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:36.889 "is_configured": true, 00:25:36.889 "data_offset": 2048, 00:25:36.889 "data_size": 63488 00:25:36.889 } 00:25:36.889 ] 00:25:36.889 }' 00:25:36.889 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:36.889 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:36.889 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:36.889 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:36.889 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=933 00:25:36.889 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:36.889 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:36.889 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:36.889 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:36.889 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:36.889 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:36.889 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.889 00:20:23 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:37.148 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:37.148 "name": "raid_bdev1", 00:25:37.148 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:37.148 "strip_size_kb": 0, 00:25:37.148 "state": "online", 00:25:37.148 "raid_level": "raid1", 00:25:37.148 "superblock": true, 00:25:37.148 "num_base_bdevs": 4, 00:25:37.148 "num_base_bdevs_discovered": 3, 00:25:37.148 "num_base_bdevs_operational": 3, 00:25:37.148 "process": { 00:25:37.148 "type": "rebuild", 00:25:37.148 "target": "spare", 00:25:37.148 "progress": { 00:25:37.148 "blocks": 43008, 00:25:37.148 "percent": 67 00:25:37.148 } 00:25:37.148 }, 00:25:37.148 "base_bdevs_list": [ 00:25:37.148 { 00:25:37.148 "name": "spare", 00:25:37.148 "uuid": "bb6411c8-620d-5a97-a631-cce78c9ca561", 00:25:37.148 "is_configured": true, 00:25:37.148 "data_offset": 2048, 00:25:37.148 "data_size": 63488 00:25:37.148 }, 00:25:37.148 { 00:25:37.148 "name": null, 00:25:37.148 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:37.148 "is_configured": false, 00:25:37.148 "data_offset": 2048, 00:25:37.148 "data_size": 63488 00:25:37.148 }, 00:25:37.148 { 00:25:37.148 "name": "BaseBdev3", 00:25:37.148 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:37.148 "is_configured": true, 00:25:37.148 "data_offset": 2048, 00:25:37.148 "data_size": 63488 00:25:37.148 }, 00:25:37.148 { 00:25:37.148 "name": "BaseBdev4", 00:25:37.148 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:37.148 "is_configured": true, 00:25:37.148 "data_offset": 2048, 00:25:37.148 "data_size": 63488 00:25:37.148 } 00:25:37.148 ] 00:25:37.148 }' 00:25:37.148 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:37.148 00:20:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:37.148 00:20:23 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:37.149 00:20:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:37.149 00:20:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:38.085 [2024-07-16 00:20:24.840167] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:38.085 [2024-07-16 00:20:24.840239] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:38.085 [2024-07-16 00:20:24.840339] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:38.085 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:38.344 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:38.344 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:38.344 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:38.344 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:38.344 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:38.344 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.345 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:38.345 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:38.345 "name": "raid_bdev1", 00:25:38.345 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:38.345 "strip_size_kb": 0, 00:25:38.345 "state": "online", 00:25:38.345 "raid_level": "raid1", 00:25:38.345 "superblock": true, 00:25:38.345 "num_base_bdevs": 
4, 00:25:38.345 "num_base_bdevs_discovered": 3, 00:25:38.345 "num_base_bdevs_operational": 3, 00:25:38.345 "base_bdevs_list": [ 00:25:38.345 { 00:25:38.345 "name": "spare", 00:25:38.345 "uuid": "bb6411c8-620d-5a97-a631-cce78c9ca561", 00:25:38.345 "is_configured": true, 00:25:38.345 "data_offset": 2048, 00:25:38.345 "data_size": 63488 00:25:38.345 }, 00:25:38.345 { 00:25:38.345 "name": null, 00:25:38.345 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:38.345 "is_configured": false, 00:25:38.345 "data_offset": 2048, 00:25:38.345 "data_size": 63488 00:25:38.345 }, 00:25:38.345 { 00:25:38.345 "name": "BaseBdev3", 00:25:38.345 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:38.345 "is_configured": true, 00:25:38.345 "data_offset": 2048, 00:25:38.345 "data_size": 63488 00:25:38.345 }, 00:25:38.345 { 00:25:38.345 "name": "BaseBdev4", 00:25:38.345 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:38.345 "is_configured": true, 00:25:38.345 "data_offset": 2048, 00:25:38.345 "data_size": 63488 00:25:38.345 } 00:25:38.345 ] 00:25:38.345 }' 00:25:38.345 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:38.604 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:38.604 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:38.604 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:38.604 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:25:38.604 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:38.604 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:38.604 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:38.604 00:20:25 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:38.604 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:38.604 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.604 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:38.863 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:38.863 "name": "raid_bdev1", 00:25:38.863 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:38.863 "strip_size_kb": 0, 00:25:38.863 "state": "online", 00:25:38.863 "raid_level": "raid1", 00:25:38.863 "superblock": true, 00:25:38.863 "num_base_bdevs": 4, 00:25:38.863 "num_base_bdevs_discovered": 3, 00:25:38.863 "num_base_bdevs_operational": 3, 00:25:38.863 "base_bdevs_list": [ 00:25:38.863 { 00:25:38.863 "name": "spare", 00:25:38.863 "uuid": "bb6411c8-620d-5a97-a631-cce78c9ca561", 00:25:38.863 "is_configured": true, 00:25:38.863 "data_offset": 2048, 00:25:38.863 "data_size": 63488 00:25:38.863 }, 00:25:38.863 { 00:25:38.863 "name": null, 00:25:38.863 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:38.863 "is_configured": false, 00:25:38.863 "data_offset": 2048, 00:25:38.863 "data_size": 63488 00:25:38.863 }, 00:25:38.863 { 00:25:38.863 "name": "BaseBdev3", 00:25:38.863 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:38.863 "is_configured": true, 00:25:38.863 "data_offset": 2048, 00:25:38.863 "data_size": 63488 00:25:38.863 }, 00:25:38.863 { 00:25:38.863 "name": "BaseBdev4", 00:25:38.863 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:38.863 "is_configured": true, 00:25:38.863 "data_offset": 2048, 00:25:38.863 "data_size": 63488 00:25:38.863 } 00:25:38.863 ] 00:25:38.863 }' 00:25:38.863 00:20:25 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:38.863 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:38.863 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:38.863 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:38.863 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:38.863 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:38.864 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:38.864 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:38.864 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:38.864 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:38.864 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:38.864 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:38.864 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:38.864 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:38.864 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.864 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:39.123 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:39.123 "name": "raid_bdev1", 00:25:39.123 "uuid": 
"37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:39.123 "strip_size_kb": 0, 00:25:39.123 "state": "online", 00:25:39.123 "raid_level": "raid1", 00:25:39.123 "superblock": true, 00:25:39.123 "num_base_bdevs": 4, 00:25:39.123 "num_base_bdevs_discovered": 3, 00:25:39.123 "num_base_bdevs_operational": 3, 00:25:39.123 "base_bdevs_list": [ 00:25:39.123 { 00:25:39.123 "name": "spare", 00:25:39.123 "uuid": "bb6411c8-620d-5a97-a631-cce78c9ca561", 00:25:39.123 "is_configured": true, 00:25:39.123 "data_offset": 2048, 00:25:39.123 "data_size": 63488 00:25:39.123 }, 00:25:39.123 { 00:25:39.123 "name": null, 00:25:39.123 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:39.123 "is_configured": false, 00:25:39.123 "data_offset": 2048, 00:25:39.123 "data_size": 63488 00:25:39.123 }, 00:25:39.123 { 00:25:39.123 "name": "BaseBdev3", 00:25:39.123 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:39.123 "is_configured": true, 00:25:39.123 "data_offset": 2048, 00:25:39.123 "data_size": 63488 00:25:39.123 }, 00:25:39.123 { 00:25:39.123 "name": "BaseBdev4", 00:25:39.123 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:39.123 "is_configured": true, 00:25:39.123 "data_offset": 2048, 00:25:39.123 "data_size": 63488 00:25:39.123 } 00:25:39.123 ] 00:25:39.123 }' 00:25:39.123 00:20:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:39.123 00:20:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:39.690 00:20:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:39.950 [2024-07-16 00:20:26.810023] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:39.950 [2024-07-16 00:20:26.810052] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:39.950 [2024-07-16 00:20:26.810110] bdev_raid.c: 474:_raid_bdev_destruct: 
*DEBUG*: raid_bdev_destruct 00:25:39.950 [2024-07-16 00:20:26.810179] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:39.950 [2024-07-16 00:20:26.810192] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19728a0 name raid_bdev1, state offline 00:25:39.950 00:20:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:39.950 00:20:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:25:40.209 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:40.209 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:40.209 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:25:40.209 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:40.209 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:40.209 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:40.209 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:40.209 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:40.209 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:40.209 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:25:40.209 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:40.209 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:40.209 00:20:27 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:40.468 /dev/nbd0 00:25:40.468 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:40.468 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:40.468 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:40.468 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:40.468 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:40.468 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:40.468 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:40.469 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:40.469 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:40.469 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:40.469 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:40.469 1+0 records in 00:25:40.469 1+0 records out 00:25:40.469 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000276422 s, 14.8 MB/s 00:25:40.469 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:40.469 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:40.469 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:25:40.469 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:40.469 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:40.469 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:40.469 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:40.469 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:40.728 /dev/nbd1 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:40.728 1+0 records in 00:25:40.728 1+0 records out 00:25:40.728 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000355124 
s, 11.5 MB/s 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:40.728 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:40.987 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:40.987 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:40.987 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:40.987 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:25:40.987 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:40.987 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:41.245 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:41.245 00:20:27 bdev_raid.raid_rebuild_test_sb 
-- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:41.245 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:41.245 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:41.245 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:41.245 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:41.245 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:41.245 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:41.245 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:41.245 00:20:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:41.503 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:41.503 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:41.503 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:41.503 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:41.503 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:41.503 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:41.503 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:41.503 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:41.503 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:41.503 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:41.761 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:42.019 [2024-07-16 00:20:28.723360] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:42.019 [2024-07-16 00:20:28.723406] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:42.019 [2024-07-16 00:20:28.723430] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19ecb40 00:25:42.019 [2024-07-16 00:20:28.723443] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:42.019 [2024-07-16 00:20:28.725099] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:42.019 [2024-07-16 00:20:28.725128] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:42.019 [2024-07-16 00:20:28.725207] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:42.019 [2024-07-16 00:20:28.725233] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:42.019 [2024-07-16 00:20:28.725338] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:42.019 [2024-07-16 00:20:28.725410] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:42.019 spare 00:25:42.019 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:42.019 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:42.019 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:42.019 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:42.019 00:20:28 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:42.019 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:42.019 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:42.019 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:42.019 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:42.019 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:42.019 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:42.019 00:20:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:42.019 [2024-07-16 00:20:28.825728] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1976ba0 00:25:42.019 [2024-07-16 00:20:28.825743] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:42.019 [2024-07-16 00:20:28.825966] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1973560 00:25:42.019 [2024-07-16 00:20:28.826126] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1976ba0 00:25:42.019 [2024-07-16 00:20:28.826136] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1976ba0 00:25:42.019 [2024-07-16 00:20:28.826246] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:42.277 00:20:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:42.277 "name": "raid_bdev1", 00:25:42.277 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:42.277 "strip_size_kb": 0, 00:25:42.277 "state": "online", 00:25:42.277 "raid_level": "raid1", 
00:25:42.277 "superblock": true, 00:25:42.277 "num_base_bdevs": 4, 00:25:42.277 "num_base_bdevs_discovered": 3, 00:25:42.277 "num_base_bdevs_operational": 3, 00:25:42.277 "base_bdevs_list": [ 00:25:42.277 { 00:25:42.277 "name": "spare", 00:25:42.277 "uuid": "bb6411c8-620d-5a97-a631-cce78c9ca561", 00:25:42.277 "is_configured": true, 00:25:42.277 "data_offset": 2048, 00:25:42.277 "data_size": 63488 00:25:42.277 }, 00:25:42.277 { 00:25:42.277 "name": null, 00:25:42.277 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:42.277 "is_configured": false, 00:25:42.277 "data_offset": 2048, 00:25:42.277 "data_size": 63488 00:25:42.277 }, 00:25:42.277 { 00:25:42.277 "name": "BaseBdev3", 00:25:42.277 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:42.277 "is_configured": true, 00:25:42.277 "data_offset": 2048, 00:25:42.277 "data_size": 63488 00:25:42.277 }, 00:25:42.277 { 00:25:42.277 "name": "BaseBdev4", 00:25:42.277 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:42.277 "is_configured": true, 00:25:42.277 "data_offset": 2048, 00:25:42.277 "data_size": 63488 00:25:42.277 } 00:25:42.277 ] 00:25:42.277 }' 00:25:42.277 00:20:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:42.277 00:20:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:42.841 00:20:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:42.841 00:20:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:42.841 00:20:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:42.841 00:20:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:42.841 00:20:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:42.841 00:20:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:42.841 00:20:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:43.098 00:20:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:43.098 "name": "raid_bdev1", 00:25:43.098 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:43.098 "strip_size_kb": 0, 00:25:43.098 "state": "online", 00:25:43.098 "raid_level": "raid1", 00:25:43.098 "superblock": true, 00:25:43.098 "num_base_bdevs": 4, 00:25:43.098 "num_base_bdevs_discovered": 3, 00:25:43.098 "num_base_bdevs_operational": 3, 00:25:43.098 "base_bdevs_list": [ 00:25:43.098 { 00:25:43.098 "name": "spare", 00:25:43.098 "uuid": "bb6411c8-620d-5a97-a631-cce78c9ca561", 00:25:43.098 "is_configured": true, 00:25:43.098 "data_offset": 2048, 00:25:43.098 "data_size": 63488 00:25:43.098 }, 00:25:43.098 { 00:25:43.098 "name": null, 00:25:43.098 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:43.098 "is_configured": false, 00:25:43.098 "data_offset": 2048, 00:25:43.098 "data_size": 63488 00:25:43.098 }, 00:25:43.098 { 00:25:43.098 "name": "BaseBdev3", 00:25:43.098 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:43.098 "is_configured": true, 00:25:43.098 "data_offset": 2048, 00:25:43.098 "data_size": 63488 00:25:43.098 }, 00:25:43.098 { 00:25:43.098 "name": "BaseBdev4", 00:25:43.098 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:43.098 "is_configured": true, 00:25:43.098 "data_offset": 2048, 00:25:43.098 "data_size": 63488 00:25:43.098 } 00:25:43.098 ] 00:25:43.098 }' 00:25:43.098 00:20:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:43.098 00:20:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:43.098 00:20:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:25:43.098 00:20:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:43.098 00:20:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.098 00:20:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:43.356 00:20:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:43.356 00:20:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:43.615 [2024-07-16 00:20:30.460332] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:43.615 00:20:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:43.615 00:20:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:43.615 00:20:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:43.615 00:20:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:43.615 00:20:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:43.615 00:20:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:43.615 00:20:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:43.615 00:20:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:43.615 00:20:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:43.615 00:20:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:43.615 00:20:30 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.615 00:20:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:43.873 00:20:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:43.873 "name": "raid_bdev1", 00:25:43.873 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:43.873 "strip_size_kb": 0, 00:25:43.873 "state": "online", 00:25:43.873 "raid_level": "raid1", 00:25:43.873 "superblock": true, 00:25:43.873 "num_base_bdevs": 4, 00:25:43.873 "num_base_bdevs_discovered": 2, 00:25:43.873 "num_base_bdevs_operational": 2, 00:25:43.873 "base_bdevs_list": [ 00:25:43.873 { 00:25:43.873 "name": null, 00:25:43.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:43.873 "is_configured": false, 00:25:43.873 "data_offset": 2048, 00:25:43.873 "data_size": 63488 00:25:43.873 }, 00:25:43.873 { 00:25:43.873 "name": null, 00:25:43.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:43.873 "is_configured": false, 00:25:43.873 "data_offset": 2048, 00:25:43.873 "data_size": 63488 00:25:43.873 }, 00:25:43.873 { 00:25:43.873 "name": "BaseBdev3", 00:25:43.873 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:43.873 "is_configured": true, 00:25:43.873 "data_offset": 2048, 00:25:43.873 "data_size": 63488 00:25:43.873 }, 00:25:43.873 { 00:25:43.873 "name": "BaseBdev4", 00:25:43.873 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:43.873 "is_configured": true, 00:25:43.873 "data_offset": 2048, 00:25:43.873 "data_size": 63488 00:25:43.873 } 00:25:43.873 ] 00:25:43.873 }' 00:25:43.873 00:20:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:43.873 00:20:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:44.439 00:20:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:44.697 [2024-07-16 00:20:31.543207] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:44.697 [2024-07-16 00:20:31.543361] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:25:44.697 [2024-07-16 00:20:31.543378] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:44.697 [2024-07-16 00:20:31.543406] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:44.697 [2024-07-16 00:20:31.547398] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1976740 00:25:44.697 [2024-07-16 00:20:31.549762] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:44.697 00:20:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:45.654 00:20:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:45.654 00:20:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:45.654 00:20:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:45.654 00:20:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:45.654 00:20:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:45.937 00:20:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:45.937 00:20:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:45.937 00:20:32 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:45.937 "name": "raid_bdev1", 00:25:45.937 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:45.937 "strip_size_kb": 0, 00:25:45.937 "state": "online", 00:25:45.937 "raid_level": "raid1", 00:25:45.937 "superblock": true, 00:25:45.937 "num_base_bdevs": 4, 00:25:45.937 "num_base_bdevs_discovered": 3, 00:25:45.937 "num_base_bdevs_operational": 3, 00:25:45.937 "process": { 00:25:45.937 "type": "rebuild", 00:25:45.937 "target": "spare", 00:25:45.937 "progress": { 00:25:45.937 "blocks": 24576, 00:25:45.937 "percent": 38 00:25:45.937 } 00:25:45.937 }, 00:25:45.937 "base_bdevs_list": [ 00:25:45.937 { 00:25:45.937 "name": "spare", 00:25:45.937 "uuid": "bb6411c8-620d-5a97-a631-cce78c9ca561", 00:25:45.937 "is_configured": true, 00:25:45.937 "data_offset": 2048, 00:25:45.937 "data_size": 63488 00:25:45.937 }, 00:25:45.937 { 00:25:45.937 "name": null, 00:25:45.937 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:45.937 "is_configured": false, 00:25:45.937 "data_offset": 2048, 00:25:45.937 "data_size": 63488 00:25:45.937 }, 00:25:45.937 { 00:25:45.937 "name": "BaseBdev3", 00:25:45.937 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:45.937 "is_configured": true, 00:25:45.937 "data_offset": 2048, 00:25:45.937 "data_size": 63488 00:25:45.937 }, 00:25:45.937 { 00:25:45.937 "name": "BaseBdev4", 00:25:45.937 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:45.937 "is_configured": true, 00:25:45.937 "data_offset": 2048, 00:25:45.937 "data_size": 63488 00:25:45.937 } 00:25:45.937 ] 00:25:45.937 }' 00:25:45.937 00:20:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:45.937 00:20:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:46.196 00:20:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:46.196 00:20:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- 
# [[ spare == \s\p\a\r\e ]] 00:25:46.196 00:20:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:46.459 [2024-07-16 00:20:33.161186] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:46.459 [2024-07-16 00:20:33.162589] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:46.459 [2024-07-16 00:20:33.162633] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:46.459 [2024-07-16 00:20:33.162649] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:46.459 [2024-07-16 00:20:33.162658] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:46.459 00:20:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:46.459 00:20:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:46.459 00:20:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:46.459 00:20:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:46.459 00:20:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:46.459 00:20:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:46.459 00:20:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:46.459 00:20:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:46.459 00:20:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:46.459 00:20:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:46.459 00:20:33 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:46.459 00:20:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:46.721 00:20:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:46.721 "name": "raid_bdev1", 00:25:46.721 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:46.721 "strip_size_kb": 0, 00:25:46.721 "state": "online", 00:25:46.721 "raid_level": "raid1", 00:25:46.721 "superblock": true, 00:25:46.721 "num_base_bdevs": 4, 00:25:46.721 "num_base_bdevs_discovered": 2, 00:25:46.721 "num_base_bdevs_operational": 2, 00:25:46.721 "base_bdevs_list": [ 00:25:46.721 { 00:25:46.721 "name": null, 00:25:46.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:46.721 "is_configured": false, 00:25:46.721 "data_offset": 2048, 00:25:46.721 "data_size": 63488 00:25:46.721 }, 00:25:46.721 { 00:25:46.721 "name": null, 00:25:46.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:46.721 "is_configured": false, 00:25:46.721 "data_offset": 2048, 00:25:46.721 "data_size": 63488 00:25:46.721 }, 00:25:46.721 { 00:25:46.721 "name": "BaseBdev3", 00:25:46.721 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:46.721 "is_configured": true, 00:25:46.721 "data_offset": 2048, 00:25:46.721 "data_size": 63488 00:25:46.721 }, 00:25:46.721 { 00:25:46.721 "name": "BaseBdev4", 00:25:46.721 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:46.721 "is_configured": true, 00:25:46.721 "data_offset": 2048, 00:25:46.721 "data_size": 63488 00:25:46.721 } 00:25:46.721 ] 00:25:46.721 }' 00:25:46.721 00:20:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:46.721 00:20:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:47.288 00:20:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:47.546 [2024-07-16 00:20:34.278380] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:47.546 [2024-07-16 00:20:34.278435] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:47.546 [2024-07-16 00:20:34.278460] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1977010 00:25:47.546 [2024-07-16 00:20:34.278473] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:47.546 [2024-07-16 00:20:34.278856] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:47.546 [2024-07-16 00:20:34.278875] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:47.546 [2024-07-16 00:20:34.278965] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:47.546 [2024-07-16 00:20:34.278977] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:25:47.546 [2024-07-16 00:20:34.278988] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:25:47.546 [2024-07-16 00:20:34.279006] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:47.547 [2024-07-16 00:20:34.282994] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19f2420 00:25:47.547 spare 00:25:47.547 [2024-07-16 00:20:34.284471] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:47.547 00:20:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:48.482 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:48.482 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:48.482 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:48.482 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:48.482 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:48.482 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:48.482 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:48.740 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:48.740 "name": "raid_bdev1", 00:25:48.740 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:48.740 "strip_size_kb": 0, 00:25:48.740 "state": "online", 00:25:48.740 "raid_level": "raid1", 00:25:48.740 "superblock": true, 00:25:48.740 "num_base_bdevs": 4, 00:25:48.740 "num_base_bdevs_discovered": 3, 00:25:48.740 "num_base_bdevs_operational": 3, 00:25:48.740 "process": { 00:25:48.740 "type": "rebuild", 00:25:48.740 "target": "spare", 00:25:48.740 "progress": { 00:25:48.740 "blocks": 24576, 00:25:48.740 
"percent": 38 00:25:48.740 } 00:25:48.740 }, 00:25:48.740 "base_bdevs_list": [ 00:25:48.740 { 00:25:48.740 "name": "spare", 00:25:48.740 "uuid": "bb6411c8-620d-5a97-a631-cce78c9ca561", 00:25:48.740 "is_configured": true, 00:25:48.740 "data_offset": 2048, 00:25:48.740 "data_size": 63488 00:25:48.740 }, 00:25:48.740 { 00:25:48.740 "name": null, 00:25:48.740 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:48.740 "is_configured": false, 00:25:48.740 "data_offset": 2048, 00:25:48.740 "data_size": 63488 00:25:48.740 }, 00:25:48.740 { 00:25:48.740 "name": "BaseBdev3", 00:25:48.740 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:48.740 "is_configured": true, 00:25:48.740 "data_offset": 2048, 00:25:48.740 "data_size": 63488 00:25:48.740 }, 00:25:48.740 { 00:25:48.740 "name": "BaseBdev4", 00:25:48.740 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:48.740 "is_configured": true, 00:25:48.740 "data_offset": 2048, 00:25:48.740 "data_size": 63488 00:25:48.740 } 00:25:48.740 ] 00:25:48.740 }' 00:25:48.740 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:48.740 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:48.740 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:48.740 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:48.740 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:48.997 [2024-07-16 00:20:35.860697] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:48.997 [2024-07-16 00:20:35.897189] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:48.997 [2024-07-16 00:20:35.897234] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:48.997 [2024-07-16 00:20:35.897250] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:48.997 [2024-07-16 00:20:35.897258] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:48.997 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:48.997 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:48.997 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:48.997 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:48.997 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:48.997 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:48.997 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:48.997 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:48.997 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:48.997 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:48.997 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:48.997 00:20:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:49.256 00:20:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:49.256 "name": "raid_bdev1", 00:25:49.256 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:49.256 "strip_size_kb": 0, 00:25:49.256 "state": 
"online", 00:25:49.256 "raid_level": "raid1", 00:25:49.256 "superblock": true, 00:25:49.256 "num_base_bdevs": 4, 00:25:49.256 "num_base_bdevs_discovered": 2, 00:25:49.256 "num_base_bdevs_operational": 2, 00:25:49.256 "base_bdevs_list": [ 00:25:49.256 { 00:25:49.256 "name": null, 00:25:49.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:49.256 "is_configured": false, 00:25:49.256 "data_offset": 2048, 00:25:49.256 "data_size": 63488 00:25:49.256 }, 00:25:49.256 { 00:25:49.256 "name": null, 00:25:49.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:49.256 "is_configured": false, 00:25:49.256 "data_offset": 2048, 00:25:49.256 "data_size": 63488 00:25:49.256 }, 00:25:49.256 { 00:25:49.256 "name": "BaseBdev3", 00:25:49.256 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:49.256 "is_configured": true, 00:25:49.256 "data_offset": 2048, 00:25:49.256 "data_size": 63488 00:25:49.256 }, 00:25:49.256 { 00:25:49.256 "name": "BaseBdev4", 00:25:49.256 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:49.256 "is_configured": true, 00:25:49.256 "data_offset": 2048, 00:25:49.256 "data_size": 63488 00:25:49.256 } 00:25:49.256 ] 00:25:49.256 }' 00:25:49.256 00:20:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:49.256 00:20:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:50.191 00:20:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:50.191 00:20:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:50.191 00:20:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:50.191 00:20:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:50.191 00:20:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:50.191 00:20:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:50.191 00:20:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:50.191 00:20:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:50.191 "name": "raid_bdev1", 00:25:50.191 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:50.191 "strip_size_kb": 0, 00:25:50.191 "state": "online", 00:25:50.191 "raid_level": "raid1", 00:25:50.191 "superblock": true, 00:25:50.191 "num_base_bdevs": 4, 00:25:50.191 "num_base_bdevs_discovered": 2, 00:25:50.191 "num_base_bdevs_operational": 2, 00:25:50.191 "base_bdevs_list": [ 00:25:50.191 { 00:25:50.191 "name": null, 00:25:50.191 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:50.191 "is_configured": false, 00:25:50.191 "data_offset": 2048, 00:25:50.191 "data_size": 63488 00:25:50.191 }, 00:25:50.191 { 00:25:50.191 "name": null, 00:25:50.191 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:50.191 "is_configured": false, 00:25:50.191 "data_offset": 2048, 00:25:50.192 "data_size": 63488 00:25:50.192 }, 00:25:50.192 { 00:25:50.192 "name": "BaseBdev3", 00:25:50.192 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:50.192 "is_configured": true, 00:25:50.192 "data_offset": 2048, 00:25:50.192 "data_size": 63488 00:25:50.192 }, 00:25:50.192 { 00:25:50.192 "name": "BaseBdev4", 00:25:50.192 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:50.192 "is_configured": true, 00:25:50.192 "data_offset": 2048, 00:25:50.192 "data_size": 63488 00:25:50.192 } 00:25:50.192 ] 00:25:50.192 }' 00:25:50.192 00:20:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:50.192 00:20:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:50.192 00:20:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:25:50.192 00:20:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:50.192 00:20:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:50.450 00:20:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:50.709 [2024-07-16 00:20:37.545562] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:50.709 [2024-07-16 00:20:37.545621] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:50.709 [2024-07-16 00:20:37.545646] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19f2e30 00:25:50.709 [2024-07-16 00:20:37.545659] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:50.709 [2024-07-16 00:20:37.546047] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:50.709 [2024-07-16 00:20:37.546066] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:50.709 [2024-07-16 00:20:37.546134] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:50.709 [2024-07-16 00:20:37.546147] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:25:50.709 [2024-07-16 00:20:37.546158] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:50.709 BaseBdev1 00:25:50.709 00:20:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:51.643 00:20:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:51.643 
00:20:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:51.643 00:20:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:51.643 00:20:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:51.643 00:20:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:51.643 00:20:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:51.643 00:20:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:51.643 00:20:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:51.643 00:20:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:51.643 00:20:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:51.643 00:20:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:51.644 00:20:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:51.902 00:20:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:51.902 "name": "raid_bdev1", 00:25:51.902 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:51.902 "strip_size_kb": 0, 00:25:51.902 "state": "online", 00:25:51.902 "raid_level": "raid1", 00:25:51.902 "superblock": true, 00:25:51.902 "num_base_bdevs": 4, 00:25:51.902 "num_base_bdevs_discovered": 2, 00:25:51.902 "num_base_bdevs_operational": 2, 00:25:51.902 "base_bdevs_list": [ 00:25:51.902 { 00:25:51.902 "name": null, 00:25:51.902 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:51.902 "is_configured": false, 00:25:51.902 "data_offset": 2048, 00:25:51.902 "data_size": 63488 00:25:51.902 }, 
00:25:51.902 { 00:25:51.902 "name": null, 00:25:51.902 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:51.902 "is_configured": false, 00:25:51.902 "data_offset": 2048, 00:25:51.902 "data_size": 63488 00:25:51.902 }, 00:25:51.902 { 00:25:51.902 "name": "BaseBdev3", 00:25:51.902 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:51.902 "is_configured": true, 00:25:51.902 "data_offset": 2048, 00:25:51.902 "data_size": 63488 00:25:51.902 }, 00:25:51.902 { 00:25:51.902 "name": "BaseBdev4", 00:25:51.902 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:51.902 "is_configured": true, 00:25:51.902 "data_offset": 2048, 00:25:51.902 "data_size": 63488 00:25:51.902 } 00:25:51.902 ] 00:25:51.902 }' 00:25:51.902 00:20:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:51.902 00:20:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:52.469 00:20:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:52.469 00:20:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:52.469 00:20:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:52.469 00:20:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:52.469 00:20:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:52.469 00:20:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:52.469 00:20:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:52.727 00:20:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:52.727 "name": "raid_bdev1", 00:25:52.727 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:52.727 
"strip_size_kb": 0, 00:25:52.727 "state": "online", 00:25:52.727 "raid_level": "raid1", 00:25:52.727 "superblock": true, 00:25:52.727 "num_base_bdevs": 4, 00:25:52.727 "num_base_bdevs_discovered": 2, 00:25:52.727 "num_base_bdevs_operational": 2, 00:25:52.727 "base_bdevs_list": [ 00:25:52.727 { 00:25:52.727 "name": null, 00:25:52.727 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:52.727 "is_configured": false, 00:25:52.727 "data_offset": 2048, 00:25:52.727 "data_size": 63488 00:25:52.727 }, 00:25:52.727 { 00:25:52.727 "name": null, 00:25:52.727 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:52.727 "is_configured": false, 00:25:52.727 "data_offset": 2048, 00:25:52.727 "data_size": 63488 00:25:52.727 }, 00:25:52.727 { 00:25:52.727 "name": "BaseBdev3", 00:25:52.727 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:52.727 "is_configured": true, 00:25:52.727 "data_offset": 2048, 00:25:52.727 "data_size": 63488 00:25:52.727 }, 00:25:52.727 { 00:25:52.727 "name": "BaseBdev4", 00:25:52.727 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:52.727 "is_configured": true, 00:25:52.727 "data_offset": 2048, 00:25:52.727 "data_size": 63488 00:25:52.727 } 00:25:52.727 ] 00:25:52.727 }' 00:25:52.727 00:20:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:52.986 00:20:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:52.986 00:20:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:52.986 00:20:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:52.986 00:20:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:52.986 00:20:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:25:52.986 00:20:39 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:52.986 00:20:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:52.986 00:20:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:52.986 00:20:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:52.986 00:20:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:52.986 00:20:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:52.986 00:20:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:52.986 00:20:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:52.986 00:20:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:52.986 00:20:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:53.244 [2024-07-16 00:20:40.036179] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:53.244 [2024-07-16 00:20:40.036313] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:25:53.244 [2024-07-16 00:20:40.036330] 
bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:53.244 request: 00:25:53.244 { 00:25:53.244 "base_bdev": "BaseBdev1", 00:25:53.244 "raid_bdev": "raid_bdev1", 00:25:53.244 "method": "bdev_raid_add_base_bdev", 00:25:53.244 "req_id": 1 00:25:53.244 } 00:25:53.244 Got JSON-RPC error response 00:25:53.244 response: 00:25:53.244 { 00:25:53.244 "code": -22, 00:25:53.244 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:53.244 } 00:25:53.244 00:20:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:25:53.244 00:20:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:53.244 00:20:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:53.244 00:20:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:53.244 00:20:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:54.180 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:54.180 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:54.180 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:54.180 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:54.180 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:54.180 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:54.180 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:54.180 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:54.180 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:25:54.180 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:54.180 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:54.180 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:54.439 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:54.439 "name": "raid_bdev1", 00:25:54.439 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:54.439 "strip_size_kb": 0, 00:25:54.439 "state": "online", 00:25:54.439 "raid_level": "raid1", 00:25:54.439 "superblock": true, 00:25:54.439 "num_base_bdevs": 4, 00:25:54.439 "num_base_bdevs_discovered": 2, 00:25:54.439 "num_base_bdevs_operational": 2, 00:25:54.439 "base_bdevs_list": [ 00:25:54.439 { 00:25:54.439 "name": null, 00:25:54.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:54.439 "is_configured": false, 00:25:54.439 "data_offset": 2048, 00:25:54.439 "data_size": 63488 00:25:54.439 }, 00:25:54.439 { 00:25:54.439 "name": null, 00:25:54.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:54.439 "is_configured": false, 00:25:54.439 "data_offset": 2048, 00:25:54.439 "data_size": 63488 00:25:54.439 }, 00:25:54.439 { 00:25:54.439 "name": "BaseBdev3", 00:25:54.439 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 00:25:54.439 "is_configured": true, 00:25:54.439 "data_offset": 2048, 00:25:54.439 "data_size": 63488 00:25:54.439 }, 00:25:54.439 { 00:25:54.439 "name": "BaseBdev4", 00:25:54.439 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:54.439 "is_configured": true, 00:25:54.439 "data_offset": 2048, 00:25:54.439 "data_size": 63488 00:25:54.439 } 00:25:54.439 ] 00:25:54.439 }' 00:25:54.439 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:54.439 00:20:41 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:55.377 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:55.377 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:55.377 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:55.377 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:55.377 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:55.377 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:55.377 00:20:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.377 00:20:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:55.377 "name": "raid_bdev1", 00:25:55.377 "uuid": "37901f1c-fee3-4f94-a065-f3c954fdbe4d", 00:25:55.377 "strip_size_kb": 0, 00:25:55.377 "state": "online", 00:25:55.377 "raid_level": "raid1", 00:25:55.377 "superblock": true, 00:25:55.377 "num_base_bdevs": 4, 00:25:55.377 "num_base_bdevs_discovered": 2, 00:25:55.377 "num_base_bdevs_operational": 2, 00:25:55.377 "base_bdevs_list": [ 00:25:55.377 { 00:25:55.377 "name": null, 00:25:55.377 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.377 "is_configured": false, 00:25:55.377 "data_offset": 2048, 00:25:55.377 "data_size": 63488 00:25:55.377 }, 00:25:55.377 { 00:25:55.377 "name": null, 00:25:55.377 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.377 "is_configured": false, 00:25:55.377 "data_offset": 2048, 00:25:55.377 "data_size": 63488 00:25:55.377 }, 00:25:55.377 { 00:25:55.377 "name": "BaseBdev3", 00:25:55.377 "uuid": "c8a51b3e-9f55-5b01-a81c-69b5b9af8606", 
00:25:55.377 "is_configured": true, 00:25:55.377 "data_offset": 2048, 00:25:55.377 "data_size": 63488 00:25:55.377 }, 00:25:55.377 { 00:25:55.377 "name": "BaseBdev4", 00:25:55.377 "uuid": "d224facf-8441-5712-acf8-25498c0ae472", 00:25:55.377 "is_configured": true, 00:25:55.377 "data_offset": 2048, 00:25:55.377 "data_size": 63488 00:25:55.377 } 00:25:55.377 ] 00:25:55.377 }' 00:25:55.377 00:20:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:55.377 00:20:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:55.377 00:20:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:55.635 00:20:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:55.635 00:20:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 3617132 00:25:55.635 00:20:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3617132 ']' 00:25:55.635 00:20:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 3617132 00:25:55.635 00:20:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:25:55.635 00:20:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:55.635 00:20:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3617132 00:25:55.635 00:20:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:55.635 00:20:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:55.635 00:20:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3617132' 00:25:55.635 killing process with pid 3617132 00:25:55.635 00:20:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 3617132 00:25:55.635 
Received shutdown signal, test time was about 60.000000 seconds 00:25:55.635 00:25:55.635 Latency(us) 00:25:55.636 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:55.636 =================================================================================================================== 00:25:55.636 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:55.636 [2024-07-16 00:20:42.379771] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:55.636 [2024-07-16 00:20:42.379874] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:55.636 [2024-07-16 00:20:42.379938] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 3617132 00:25:55.636 [2024-07-16 00:20:42.379957] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1976ba0 name raid_bdev1, state offline 00:25:55.636 [2024-07-16 00:20:42.429507] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:25:55.894 00:25:55.894 real 0m38.648s 00:25:55.894 user 0m56.146s 00:25:55.894 sys 0m7.079s 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:55.894 ************************************ 00:25:55.894 END TEST raid_rebuild_test_sb 00:25:55.894 ************************************ 00:25:55.894 00:20:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:55.894 00:20:42 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:25:55.894 00:20:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:55.894 00:20:42 bdev_raid -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:25:55.894 00:20:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:55.894 ************************************ 00:25:55.894 START TEST raid_rebuild_test_io 00:25:55.894 ************************************ 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:55.894 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:55.895 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:55.895 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:55.895 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:55.895 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:55.895 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:55.895 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:25:55.895 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=3622548 00:25:55.895 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 3622548 /var/tmp/spdk-raid.sock 00:25:55.895 00:20:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:55.895 00:20:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 
3622548 ']' 00:25:55.895 00:20:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:55.895 00:20:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:55.895 00:20:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:55.895 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:55.895 00:20:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:55.895 00:20:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:55.895 [2024-07-16 00:20:42.818658] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:25:55.895 [2024-07-16 00:20:42.818729] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3622548 ] 00:25:55.895 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:55.895 Zero copy mechanism will not be used. 
00:25:56.153 [2024-07-16 00:20:42.941682] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:56.153 [2024-07-16 00:20:43.053266] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:56.412 [2024-07-16 00:20:43.116773] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:56.412 [2024-07-16 00:20:43.116797] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:56.977 00:20:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:56.977 00:20:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:25:56.977 00:20:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:56.977 00:20:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:57.235 BaseBdev1_malloc 00:25:57.235 00:20:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:57.493 [2024-07-16 00:20:44.303143] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:57.493 [2024-07-16 00:20:44.303189] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:57.493 [2024-07-16 00:20:44.303220] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21f9d40 00:25:57.493 [2024-07-16 00:20:44.303233] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:57.493 [2024-07-16 00:20:44.305021] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:57.493 [2024-07-16 00:20:44.305049] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:57.493 BaseBdev1 
00:25:57.493 00:20:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:57.493 00:20:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:57.752 BaseBdev2_malloc 00:25:57.752 00:20:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:58.010 [2024-07-16 00:20:44.789317] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:58.010 [2024-07-16 00:20:44.789361] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:58.010 [2024-07-16 00:20:44.789387] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21fa860 00:25:58.010 [2024-07-16 00:20:44.789401] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:58.010 [2024-07-16 00:20:44.790950] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:58.010 [2024-07-16 00:20:44.790977] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:58.010 BaseBdev2 00:25:58.010 00:20:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:58.010 00:20:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:58.268 BaseBdev3_malloc 00:25:58.268 00:20:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:58.527 [2024-07-16 00:20:45.275186] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:58.527 [2024-07-16 00:20:45.275232] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:58.527 [2024-07-16 00:20:45.275254] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23a78f0 00:25:58.527 [2024-07-16 00:20:45.275266] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:58.527 [2024-07-16 00:20:45.276817] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:58.527 [2024-07-16 00:20:45.276845] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:58.527 BaseBdev3 00:25:58.527 00:20:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:58.527 00:20:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:58.785 BaseBdev4_malloc 00:25:58.785 00:20:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:59.044 [2024-07-16 00:20:45.766366] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:59.044 [2024-07-16 00:20:45.766412] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:59.044 [2024-07-16 00:20:45.766436] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23a6ad0 00:25:59.044 [2024-07-16 00:20:45.766450] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:59.044 [2024-07-16 00:20:45.768010] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:59.044 [2024-07-16 00:20:45.768039] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:25:59.044 BaseBdev4 00:25:59.044 00:20:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:59.611 spare_malloc 00:25:59.611 00:20:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:59.611 spare_delay 00:25:59.611 00:20:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:59.869 [2024-07-16 00:20:46.766814] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:59.869 [2024-07-16 00:20:46.766858] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:59.869 [2024-07-16 00:20:46.766880] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23ab5b0 00:25:59.869 [2024-07-16 00:20:46.766893] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:59.869 [2024-07-16 00:20:46.768441] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:59.869 [2024-07-16 00:20:46.768470] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:59.869 spare 00:25:59.869 00:20:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:26:00.437 [2024-07-16 00:20:47.268153] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:00.437 [2024-07-16 00:20:47.269500] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev2 is claimed 00:26:00.437 [2024-07-16 00:20:47.269557] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:00.437 [2024-07-16 00:20:47.269603] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:00.437 [2024-07-16 00:20:47.269686] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x232a8a0 00:26:00.437 [2024-07-16 00:20:47.269696] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:26:00.437 [2024-07-16 00:20:47.269914] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23a4e10 00:26:00.437 [2024-07-16 00:20:47.270078] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x232a8a0 00:26:00.437 [2024-07-16 00:20:47.270088] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x232a8a0 00:26:00.437 [2024-07-16 00:20:47.270210] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:00.437 00:20:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:26:00.437 00:20:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:00.437 00:20:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:00.437 00:20:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:00.437 00:20:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:00.437 00:20:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:00.437 00:20:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:00.437 00:20:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:00.437 00:20:47 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:00.437 00:20:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:00.437 00:20:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:00.437 00:20:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:01.004 00:20:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:01.004 "name": "raid_bdev1", 00:26:01.004 "uuid": "1614a660-664d-4f01-bf51-46428c6caaf8", 00:26:01.004 "strip_size_kb": 0, 00:26:01.004 "state": "online", 00:26:01.004 "raid_level": "raid1", 00:26:01.004 "superblock": false, 00:26:01.004 "num_base_bdevs": 4, 00:26:01.004 "num_base_bdevs_discovered": 4, 00:26:01.004 "num_base_bdevs_operational": 4, 00:26:01.004 "base_bdevs_list": [ 00:26:01.004 { 00:26:01.004 "name": "BaseBdev1", 00:26:01.004 "uuid": "52e01b9e-25b9-5047-98e4-3d2808517ba5", 00:26:01.004 "is_configured": true, 00:26:01.004 "data_offset": 0, 00:26:01.004 "data_size": 65536 00:26:01.004 }, 00:26:01.004 { 00:26:01.004 "name": "BaseBdev2", 00:26:01.004 "uuid": "89af74bb-6de0-59f0-81c0-4b44343a2157", 00:26:01.004 "is_configured": true, 00:26:01.004 "data_offset": 0, 00:26:01.004 "data_size": 65536 00:26:01.004 }, 00:26:01.004 { 00:26:01.004 "name": "BaseBdev3", 00:26:01.004 "uuid": "6aa53a52-f8b6-51b6-aabe-e9571cf33ab8", 00:26:01.004 "is_configured": true, 00:26:01.004 "data_offset": 0, 00:26:01.004 "data_size": 65536 00:26:01.004 }, 00:26:01.004 { 00:26:01.004 "name": "BaseBdev4", 00:26:01.004 "uuid": "f081b60c-50a5-5c67-a3dc-431e32bcb10b", 00:26:01.004 "is_configured": true, 00:26:01.004 "data_offset": 0, 00:26:01.004 "data_size": 65536 00:26:01.004 } 00:26:01.004 ] 00:26:01.004 }' 00:26:01.004 00:20:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:26:01.004 00:20:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:01.939 00:20:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:01.940 00:20:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:02.197 [2024-07-16 00:20:49.053176] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:02.197 00:20:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:26:02.197 00:20:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:02.197 00:20:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:02.764 00:20:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:26:02.764 00:20:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:26:02.764 00:20:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:02.764 00:20:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:26:02.764 [2024-07-16 00:20:49.700667] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2330970 00:26:02.764 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:02.764 Zero copy mechanism will not be used. 00:26:02.764 Running I/O for 60 seconds... 
00:26:03.022 [2024-07-16 00:20:49.818789] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:03.022 [2024-07-16 00:20:49.818974] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2330970 00:26:03.022 00:20:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:03.022 00:20:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:03.022 00:20:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:03.022 00:20:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:03.022 00:20:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:03.022 00:20:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:03.022 00:20:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:03.022 00:20:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:03.022 00:20:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:03.022 00:20:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:03.022 00:20:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:03.022 00:20:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:03.322 00:20:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:03.322 "name": "raid_bdev1", 00:26:03.322 "uuid": "1614a660-664d-4f01-bf51-46428c6caaf8", 00:26:03.322 "strip_size_kb": 0, 00:26:03.322 "state": "online", 00:26:03.322 "raid_level": "raid1", 00:26:03.322 "superblock": false, 
00:26:03.322 "num_base_bdevs": 4, 00:26:03.322 "num_base_bdevs_discovered": 3, 00:26:03.322 "num_base_bdevs_operational": 3, 00:26:03.322 "base_bdevs_list": [ 00:26:03.322 { 00:26:03.322 "name": null, 00:26:03.322 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:03.322 "is_configured": false, 00:26:03.322 "data_offset": 0, 00:26:03.322 "data_size": 65536 00:26:03.322 }, 00:26:03.322 { 00:26:03.322 "name": "BaseBdev2", 00:26:03.322 "uuid": "89af74bb-6de0-59f0-81c0-4b44343a2157", 00:26:03.322 "is_configured": true, 00:26:03.322 "data_offset": 0, 00:26:03.322 "data_size": 65536 00:26:03.322 }, 00:26:03.322 { 00:26:03.322 "name": "BaseBdev3", 00:26:03.322 "uuid": "6aa53a52-f8b6-51b6-aabe-e9571cf33ab8", 00:26:03.322 "is_configured": true, 00:26:03.322 "data_offset": 0, 00:26:03.322 "data_size": 65536 00:26:03.322 }, 00:26:03.322 { 00:26:03.322 "name": "BaseBdev4", 00:26:03.322 "uuid": "f081b60c-50a5-5c67-a3dc-431e32bcb10b", 00:26:03.322 "is_configured": true, 00:26:03.322 "data_offset": 0, 00:26:03.322 "data_size": 65536 00:26:03.322 } 00:26:03.322 ] 00:26:03.322 }' 00:26:03.322 00:20:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:03.322 00:20:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:03.897 00:20:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:04.155 [2024-07-16 00:20:51.005776] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:04.155 00:20:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:04.155 [2024-07-16 00:20:51.089279] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f00fa0 00:26:04.155 [2024-07-16 00:20:51.091696] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:04.413 [2024-07-16 
00:20:51.204766] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:04.413 [2024-07-16 00:20:51.205170] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:04.672 [2024-07-16 00:20:51.447087] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:04.672 [2024-07-16 00:20:51.447386] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:04.930 [2024-07-16 00:20:51.714183] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:05.188 [2024-07-16 00:20:51.947897] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:05.189 [2024-07-16 00:20:51.948141] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:05.189 00:20:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:05.189 00:20:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:05.189 00:20:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:05.189 00:20:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:05.189 00:20:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:05.189 00:20:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:05.189 00:20:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:05.448 
[2024-07-16 00:20:52.204012] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:05.448 00:20:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:05.448 "name": "raid_bdev1", 00:26:05.448 "uuid": "1614a660-664d-4f01-bf51-46428c6caaf8", 00:26:05.448 "strip_size_kb": 0, 00:26:05.448 "state": "online", 00:26:05.448 "raid_level": "raid1", 00:26:05.448 "superblock": false, 00:26:05.448 "num_base_bdevs": 4, 00:26:05.448 "num_base_bdevs_discovered": 4, 00:26:05.448 "num_base_bdevs_operational": 4, 00:26:05.448 "process": { 00:26:05.448 "type": "rebuild", 00:26:05.448 "target": "spare", 00:26:05.448 "progress": { 00:26:05.448 "blocks": 14336, 00:26:05.448 "percent": 21 00:26:05.448 } 00:26:05.448 }, 00:26:05.448 "base_bdevs_list": [ 00:26:05.448 { 00:26:05.448 "name": "spare", 00:26:05.448 "uuid": "eb8195f5-c6bb-5eb1-8c42-1f0a12186199", 00:26:05.448 "is_configured": true, 00:26:05.448 "data_offset": 0, 00:26:05.448 "data_size": 65536 00:26:05.448 }, 00:26:05.448 { 00:26:05.448 "name": "BaseBdev2", 00:26:05.448 "uuid": "89af74bb-6de0-59f0-81c0-4b44343a2157", 00:26:05.448 "is_configured": true, 00:26:05.448 "data_offset": 0, 00:26:05.448 "data_size": 65536 00:26:05.448 }, 00:26:05.448 { 00:26:05.448 "name": "BaseBdev3", 00:26:05.448 "uuid": "6aa53a52-f8b6-51b6-aabe-e9571cf33ab8", 00:26:05.448 "is_configured": true, 00:26:05.448 "data_offset": 0, 00:26:05.448 "data_size": 65536 00:26:05.448 }, 00:26:05.448 { 00:26:05.448 "name": "BaseBdev4", 00:26:05.448 "uuid": "f081b60c-50a5-5c67-a3dc-431e32bcb10b", 00:26:05.448 "is_configured": true, 00:26:05.448 "data_offset": 0, 00:26:05.448 "data_size": 65536 00:26:05.448 } 00:26:05.448 ] 00:26:05.448 }' 00:26:05.448 00:20:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:05.448 [2024-07-16 00:20:52.354551] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: 
process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:05.448 00:20:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:05.448 00:20:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:05.707 00:20:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:05.707 00:20:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:05.707 [2024-07-16 00:20:52.617961] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:05.965 [2024-07-16 00:20:52.842266] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:05.965 [2024-07-16 00:20:52.906594] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:06.224 [2024-07-16 00:20:53.057915] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:06.224 [2024-07-16 00:20:53.078905] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:06.224 [2024-07-16 00:20:53.078948] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:06.224 [2024-07-16 00:20:53.078960] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:06.224 [2024-07-16 00:20:53.111911] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2330970 00:26:06.224 00:20:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:06.224 00:20:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:06.224 00:20:53 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:06.224 00:20:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:06.224 00:20:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:06.224 00:20:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:06.224 00:20:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:06.224 00:20:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:06.224 00:20:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:06.224 00:20:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:06.224 00:20:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:06.224 00:20:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:06.791 00:20:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:06.791 "name": "raid_bdev1", 00:26:06.791 "uuid": "1614a660-664d-4f01-bf51-46428c6caaf8", 00:26:06.791 "strip_size_kb": 0, 00:26:06.791 "state": "online", 00:26:06.791 "raid_level": "raid1", 00:26:06.791 "superblock": false, 00:26:06.791 "num_base_bdevs": 4, 00:26:06.791 "num_base_bdevs_discovered": 3, 00:26:06.791 "num_base_bdevs_operational": 3, 00:26:06.791 "base_bdevs_list": [ 00:26:06.791 { 00:26:06.791 "name": null, 00:26:06.791 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:06.791 "is_configured": false, 00:26:06.791 "data_offset": 0, 00:26:06.791 "data_size": 65536 00:26:06.791 }, 00:26:06.791 { 00:26:06.791 "name": "BaseBdev2", 00:26:06.791 "uuid": "89af74bb-6de0-59f0-81c0-4b44343a2157", 00:26:06.791 "is_configured": true, 00:26:06.791 
"data_offset": 0, 00:26:06.791 "data_size": 65536 00:26:06.791 }, 00:26:06.791 { 00:26:06.791 "name": "BaseBdev3", 00:26:06.791 "uuid": "6aa53a52-f8b6-51b6-aabe-e9571cf33ab8", 00:26:06.791 "is_configured": true, 00:26:06.791 "data_offset": 0, 00:26:06.791 "data_size": 65536 00:26:06.791 }, 00:26:06.791 { 00:26:06.791 "name": "BaseBdev4", 00:26:06.791 "uuid": "f081b60c-50a5-5c67-a3dc-431e32bcb10b", 00:26:06.791 "is_configured": true, 00:26:06.791 "data_offset": 0, 00:26:06.791 "data_size": 65536 00:26:06.791 } 00:26:06.791 ] 00:26:06.791 }' 00:26:06.791 00:20:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:06.791 00:20:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:07.359 00:20:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:07.359 00:20:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:07.359 00:20:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:07.359 00:20:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:07.359 00:20:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:07.359 00:20:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.359 00:20:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.618 00:20:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:07.618 "name": "raid_bdev1", 00:26:07.618 "uuid": "1614a660-664d-4f01-bf51-46428c6caaf8", 00:26:07.618 "strip_size_kb": 0, 00:26:07.618 "state": "online", 00:26:07.618 "raid_level": "raid1", 00:26:07.618 "superblock": false, 00:26:07.618 "num_base_bdevs": 4, 00:26:07.618 
"num_base_bdevs_discovered": 3, 00:26:07.618 "num_base_bdevs_operational": 3, 00:26:07.618 "base_bdevs_list": [ 00:26:07.618 { 00:26:07.618 "name": null, 00:26:07.618 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:07.618 "is_configured": false, 00:26:07.618 "data_offset": 0, 00:26:07.618 "data_size": 65536 00:26:07.618 }, 00:26:07.618 { 00:26:07.618 "name": "BaseBdev2", 00:26:07.618 "uuid": "89af74bb-6de0-59f0-81c0-4b44343a2157", 00:26:07.618 "is_configured": true, 00:26:07.618 "data_offset": 0, 00:26:07.618 "data_size": 65536 00:26:07.618 }, 00:26:07.618 { 00:26:07.618 "name": "BaseBdev3", 00:26:07.618 "uuid": "6aa53a52-f8b6-51b6-aabe-e9571cf33ab8", 00:26:07.618 "is_configured": true, 00:26:07.618 "data_offset": 0, 00:26:07.618 "data_size": 65536 00:26:07.618 }, 00:26:07.618 { 00:26:07.618 "name": "BaseBdev4", 00:26:07.618 "uuid": "f081b60c-50a5-5c67-a3dc-431e32bcb10b", 00:26:07.618 "is_configured": true, 00:26:07.618 "data_offset": 0, 00:26:07.618 "data_size": 65536 00:26:07.618 } 00:26:07.618 ] 00:26:07.618 }' 00:26:07.618 00:20:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:07.618 00:20:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:07.618 00:20:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:07.618 00:20:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:07.618 00:20:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:07.876 [2024-07-16 00:20:54.656473] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:07.876 00:20:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:07.876 [2024-07-16 00:20:54.723771] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x232d270 00:26:07.876 [2024-07-16 00:20:54.725341] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:08.135 [2024-07-16 00:20:54.854504] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:08.135 [2024-07-16 00:20:54.854989] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:08.135 [2024-07-16 00:20:55.078907] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:08.135 [2024-07-16 00:20:55.079641] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:08.702 [2024-07-16 00:20:55.428136] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:08.702 [2024-07-16 00:20:55.580953] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:08.960 00:20:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:08.960 00:20:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:08.960 00:20:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:08.960 00:20:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:08.960 00:20:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:08.960 00:20:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.960 00:20:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name 
== "raid_bdev1")' 00:26:08.960 [2024-07-16 00:20:55.855670] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:09.225 00:20:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:09.225 "name": "raid_bdev1", 00:26:09.225 "uuid": "1614a660-664d-4f01-bf51-46428c6caaf8", 00:26:09.225 "strip_size_kb": 0, 00:26:09.225 "state": "online", 00:26:09.225 "raid_level": "raid1", 00:26:09.225 "superblock": false, 00:26:09.225 "num_base_bdevs": 4, 00:26:09.225 "num_base_bdevs_discovered": 4, 00:26:09.225 "num_base_bdevs_operational": 4, 00:26:09.225 "process": { 00:26:09.225 "type": "rebuild", 00:26:09.225 "target": "spare", 00:26:09.225 "progress": { 00:26:09.225 "blocks": 14336, 00:26:09.225 "percent": 21 00:26:09.225 } 00:26:09.225 }, 00:26:09.225 "base_bdevs_list": [ 00:26:09.225 { 00:26:09.225 "name": "spare", 00:26:09.225 "uuid": "eb8195f5-c6bb-5eb1-8c42-1f0a12186199", 00:26:09.225 "is_configured": true, 00:26:09.225 "data_offset": 0, 00:26:09.225 "data_size": 65536 00:26:09.225 }, 00:26:09.225 { 00:26:09.225 "name": "BaseBdev2", 00:26:09.225 "uuid": "89af74bb-6de0-59f0-81c0-4b44343a2157", 00:26:09.225 "is_configured": true, 00:26:09.225 "data_offset": 0, 00:26:09.225 "data_size": 65536 00:26:09.225 }, 00:26:09.225 { 00:26:09.225 "name": "BaseBdev3", 00:26:09.225 "uuid": "6aa53a52-f8b6-51b6-aabe-e9571cf33ab8", 00:26:09.225 "is_configured": true, 00:26:09.225 "data_offset": 0, 00:26:09.225 "data_size": 65536 00:26:09.225 }, 00:26:09.225 { 00:26:09.225 "name": "BaseBdev4", 00:26:09.225 "uuid": "f081b60c-50a5-5c67-a3dc-431e32bcb10b", 00:26:09.225 "is_configured": true, 00:26:09.225 "data_offset": 0, 00:26:09.225 "data_size": 65536 00:26:09.225 } 00:26:09.225 ] 00:26:09.225 }' 00:26:09.225 00:20:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:09.225 00:20:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- 
# [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:09.225 00:20:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:09.225 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:09.225 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:26:09.225 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:26:09.225 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:09.225 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:26:09.225 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:26:09.484 [2024-07-16 00:20:56.238839] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:09.484 [2024-07-16 00:20:56.250230] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:09.484 [2024-07-16 00:20:56.267508] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x2330970 00:26:09.484 [2024-07-16 00:20:56.267533] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x232d270 00:26:09.484 [2024-07-16 00:20:56.269031] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:09.484 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:26:09.484 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:26:09.484 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:09.484 00:20:56 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:09.484 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:09.484 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:09.484 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:09.484 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:09.484 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:09.743 [2024-07-16 00:20:56.517287] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:09.743 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:09.743 "name": "raid_bdev1", 00:26:09.743 "uuid": "1614a660-664d-4f01-bf51-46428c6caaf8", 00:26:09.743 "strip_size_kb": 0, 00:26:09.743 "state": "online", 00:26:09.743 "raid_level": "raid1", 00:26:09.743 "superblock": false, 00:26:09.743 "num_base_bdevs": 4, 00:26:09.743 "num_base_bdevs_discovered": 3, 00:26:09.743 "num_base_bdevs_operational": 3, 00:26:09.743 "process": { 00:26:09.743 "type": "rebuild", 00:26:09.743 "target": "spare", 00:26:09.743 "progress": { 00:26:09.743 "blocks": 22528, 00:26:09.743 "percent": 34 00:26:09.743 } 00:26:09.743 }, 00:26:09.743 "base_bdevs_list": [ 00:26:09.743 { 00:26:09.743 "name": "spare", 00:26:09.743 "uuid": "eb8195f5-c6bb-5eb1-8c42-1f0a12186199", 00:26:09.743 "is_configured": true, 00:26:09.743 "data_offset": 0, 00:26:09.743 "data_size": 65536 00:26:09.743 }, 00:26:09.743 { 00:26:09.743 "name": null, 00:26:09.743 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:09.743 "is_configured": false, 00:26:09.743 "data_offset": 0, 00:26:09.743 "data_size": 65536 00:26:09.743 }, 
00:26:09.743 { 00:26:09.743 "name": "BaseBdev3", 00:26:09.743 "uuid": "6aa53a52-f8b6-51b6-aabe-e9571cf33ab8", 00:26:09.743 "is_configured": true, 00:26:09.743 "data_offset": 0, 00:26:09.743 "data_size": 65536 00:26:09.743 }, 00:26:09.743 { 00:26:09.743 "name": "BaseBdev4", 00:26:09.743 "uuid": "f081b60c-50a5-5c67-a3dc-431e32bcb10b", 00:26:09.743 "is_configured": true, 00:26:09.743 "data_offset": 0, 00:26:09.743 "data_size": 65536 00:26:09.743 } 00:26:09.743 ] 00:26:09.743 }' 00:26:09.743 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:09.743 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:09.743 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:09.743 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:09.743 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=966 00:26:09.743 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:09.743 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:09.743 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:09.743 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:09.743 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:09.743 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:09.743 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:09.743 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] 
| select(.name == "raid_bdev1")' 00:26:10.001 [2024-07-16 00:20:56.861360] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:26:10.001 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:10.001 "name": "raid_bdev1", 00:26:10.001 "uuid": "1614a660-664d-4f01-bf51-46428c6caaf8", 00:26:10.001 "strip_size_kb": 0, 00:26:10.001 "state": "online", 00:26:10.001 "raid_level": "raid1", 00:26:10.001 "superblock": false, 00:26:10.001 "num_base_bdevs": 4, 00:26:10.001 "num_base_bdevs_discovered": 3, 00:26:10.001 "num_base_bdevs_operational": 3, 00:26:10.001 "process": { 00:26:10.001 "type": "rebuild", 00:26:10.001 "target": "spare", 00:26:10.001 "progress": { 00:26:10.001 "blocks": 26624, 00:26:10.001 "percent": 40 00:26:10.001 } 00:26:10.001 }, 00:26:10.001 "base_bdevs_list": [ 00:26:10.001 { 00:26:10.001 "name": "spare", 00:26:10.001 "uuid": "eb8195f5-c6bb-5eb1-8c42-1f0a12186199", 00:26:10.001 "is_configured": true, 00:26:10.001 "data_offset": 0, 00:26:10.001 "data_size": 65536 00:26:10.001 }, 00:26:10.001 { 00:26:10.001 "name": null, 00:26:10.001 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:10.001 "is_configured": false, 00:26:10.001 "data_offset": 0, 00:26:10.001 "data_size": 65536 00:26:10.001 }, 00:26:10.001 { 00:26:10.001 "name": "BaseBdev3", 00:26:10.001 "uuid": "6aa53a52-f8b6-51b6-aabe-e9571cf33ab8", 00:26:10.001 "is_configured": true, 00:26:10.001 "data_offset": 0, 00:26:10.001 "data_size": 65536 00:26:10.001 }, 00:26:10.001 { 00:26:10.001 "name": "BaseBdev4", 00:26:10.001 "uuid": "f081b60c-50a5-5c67-a3dc-431e32bcb10b", 00:26:10.001 "is_configured": true, 00:26:10.001 "data_offset": 0, 00:26:10.001 "data_size": 65536 00:26:10.001 } 00:26:10.001 ] 00:26:10.001 }' 00:26:10.002 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:10.260 [2024-07-16 00:20:56.972300] bdev_raid.c: 
839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:26:10.260 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:10.260 00:20:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:10.260 00:20:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:10.260 00:20:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:10.519 [2024-07-16 00:20:57.224895] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:26:10.519 [2024-07-16 00:20:57.225234] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:26:10.519 [2024-07-16 00:20:57.364737] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:26:11.086 [2024-07-16 00:20:57.837111] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:26:11.343 00:20:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:11.343 00:20:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:11.343 00:20:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:11.343 00:20:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:11.343 00:20:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:11.343 00:20:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:11.343 00:20:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.343 00:20:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:11.600 [2024-07-16 00:20:58.524775] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:26:11.857 00:20:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:11.857 "name": "raid_bdev1", 00:26:11.857 "uuid": "1614a660-664d-4f01-bf51-46428c6caaf8", 00:26:11.857 "strip_size_kb": 0, 00:26:11.857 "state": "online", 00:26:11.857 "raid_level": "raid1", 00:26:11.857 "superblock": false, 00:26:11.857 "num_base_bdevs": 4, 00:26:11.857 "num_base_bdevs_discovered": 3, 00:26:11.857 "num_base_bdevs_operational": 3, 00:26:11.857 "process": { 00:26:11.857 "type": "rebuild", 00:26:11.857 "target": "spare", 00:26:11.857 "progress": { 00:26:11.857 "blocks": 51200, 00:26:11.857 "percent": 78 00:26:11.857 } 00:26:11.857 }, 00:26:11.857 "base_bdevs_list": [ 00:26:11.857 { 00:26:11.857 "name": "spare", 00:26:11.857 "uuid": "eb8195f5-c6bb-5eb1-8c42-1f0a12186199", 00:26:11.857 "is_configured": true, 00:26:11.857 "data_offset": 0, 00:26:11.857 "data_size": 65536 00:26:11.857 }, 00:26:11.857 { 00:26:11.857 "name": null, 00:26:11.857 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:11.857 "is_configured": false, 00:26:11.857 "data_offset": 0, 00:26:11.857 "data_size": 65536 00:26:11.857 }, 00:26:11.857 { 00:26:11.857 "name": "BaseBdev3", 00:26:11.857 "uuid": "6aa53a52-f8b6-51b6-aabe-e9571cf33ab8", 00:26:11.857 "is_configured": true, 00:26:11.857 "data_offset": 0, 00:26:11.857 "data_size": 65536 00:26:11.857 }, 00:26:11.857 { 00:26:11.857 "name": "BaseBdev4", 00:26:11.857 "uuid": "f081b60c-50a5-5c67-a3dc-431e32bcb10b", 00:26:11.857 "is_configured": true, 00:26:11.857 "data_offset": 0, 00:26:11.857 "data_size": 65536 00:26:11.857 } 
00:26:11.857 ] 00:26:11.857 }' 00:26:11.857 00:20:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:11.857 00:20:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:11.857 00:20:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:11.857 [2024-07-16 00:20:58.646497] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:26:11.857 00:20:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:11.857 00:20:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:12.115 [2024-07-16 00:20:58.988599] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:26:12.373 [2024-07-16 00:20:59.090459] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:26:12.373 [2024-07-16 00:20:59.090618] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:26:12.631 [2024-07-16 00:20:59.455667] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:12.631 [2024-07-16 00:20:59.563954] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:12.631 [2024-07-16 00:20:59.567009] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:12.889 00:20:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:12.889 00:20:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:12.889 00:20:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:12.889 00:20:59 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:12.889 00:20:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:12.889 00:20:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:12.889 00:20:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:12.889 00:20:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:13.167 00:20:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:13.167 "name": "raid_bdev1", 00:26:13.167 "uuid": "1614a660-664d-4f01-bf51-46428c6caaf8", 00:26:13.167 "strip_size_kb": 0, 00:26:13.167 "state": "online", 00:26:13.167 "raid_level": "raid1", 00:26:13.167 "superblock": false, 00:26:13.167 "num_base_bdevs": 4, 00:26:13.167 "num_base_bdevs_discovered": 3, 00:26:13.167 "num_base_bdevs_operational": 3, 00:26:13.167 "base_bdevs_list": [ 00:26:13.167 { 00:26:13.167 "name": "spare", 00:26:13.167 "uuid": "eb8195f5-c6bb-5eb1-8c42-1f0a12186199", 00:26:13.167 "is_configured": true, 00:26:13.167 "data_offset": 0, 00:26:13.167 "data_size": 65536 00:26:13.167 }, 00:26:13.167 { 00:26:13.167 "name": null, 00:26:13.167 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:13.167 "is_configured": false, 00:26:13.167 "data_offset": 0, 00:26:13.167 "data_size": 65536 00:26:13.167 }, 00:26:13.167 { 00:26:13.167 "name": "BaseBdev3", 00:26:13.167 "uuid": "6aa53a52-f8b6-51b6-aabe-e9571cf33ab8", 00:26:13.167 "is_configured": true, 00:26:13.167 "data_offset": 0, 00:26:13.167 "data_size": 65536 00:26:13.167 }, 00:26:13.167 { 00:26:13.167 "name": "BaseBdev4", 00:26:13.167 "uuid": "f081b60c-50a5-5c67-a3dc-431e32bcb10b", 00:26:13.167 "is_configured": true, 00:26:13.167 "data_offset": 0, 00:26:13.167 "data_size": 65536 00:26:13.167 } 
00:26:13.167 ] 00:26:13.167 }' 00:26:13.167 00:20:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:13.167 00:20:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:13.167 00:20:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:13.167 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:13.167 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:26:13.167 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:13.167 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:13.167 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:13.167 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:13.167 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:13.167 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:13.167 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:13.425 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:13.425 "name": "raid_bdev1", 00:26:13.425 "uuid": "1614a660-664d-4f01-bf51-46428c6caaf8", 00:26:13.425 "strip_size_kb": 0, 00:26:13.426 "state": "online", 00:26:13.426 "raid_level": "raid1", 00:26:13.426 "superblock": false, 00:26:13.426 "num_base_bdevs": 4, 00:26:13.426 "num_base_bdevs_discovered": 3, 00:26:13.426 "num_base_bdevs_operational": 3, 00:26:13.426 "base_bdevs_list": [ 00:26:13.426 { 00:26:13.426 "name": "spare", 
00:26:13.426 "uuid": "eb8195f5-c6bb-5eb1-8c42-1f0a12186199", 00:26:13.426 "is_configured": true, 00:26:13.426 "data_offset": 0, 00:26:13.426 "data_size": 65536 00:26:13.426 }, 00:26:13.426 { 00:26:13.426 "name": null, 00:26:13.426 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:13.426 "is_configured": false, 00:26:13.426 "data_offset": 0, 00:26:13.426 "data_size": 65536 00:26:13.426 }, 00:26:13.426 { 00:26:13.426 "name": "BaseBdev3", 00:26:13.426 "uuid": "6aa53a52-f8b6-51b6-aabe-e9571cf33ab8", 00:26:13.426 "is_configured": true, 00:26:13.426 "data_offset": 0, 00:26:13.426 "data_size": 65536 00:26:13.426 }, 00:26:13.426 { 00:26:13.426 "name": "BaseBdev4", 00:26:13.426 "uuid": "f081b60c-50a5-5c67-a3dc-431e32bcb10b", 00:26:13.426 "is_configured": true, 00:26:13.426 "data_offset": 0, 00:26:13.426 "data_size": 65536 00:26:13.426 } 00:26:13.426 ] 00:26:13.426 }' 00:26:13.426 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:13.426 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:13.426 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:13.426 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:13.426 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:13.426 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:13.426 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:13.426 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:13.426 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:13.426 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:26:13.426 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:13.426 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:13.426 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:13.426 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:13.426 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:13.426 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:13.684 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:13.684 "name": "raid_bdev1", 00:26:13.684 "uuid": "1614a660-664d-4f01-bf51-46428c6caaf8", 00:26:13.684 "strip_size_kb": 0, 00:26:13.684 "state": "online", 00:26:13.684 "raid_level": "raid1", 00:26:13.684 "superblock": false, 00:26:13.684 "num_base_bdevs": 4, 00:26:13.684 "num_base_bdevs_discovered": 3, 00:26:13.684 "num_base_bdevs_operational": 3, 00:26:13.684 "base_bdevs_list": [ 00:26:13.684 { 00:26:13.684 "name": "spare", 00:26:13.684 "uuid": "eb8195f5-c6bb-5eb1-8c42-1f0a12186199", 00:26:13.684 "is_configured": true, 00:26:13.684 "data_offset": 0, 00:26:13.684 "data_size": 65536 00:26:13.684 }, 00:26:13.684 { 00:26:13.684 "name": null, 00:26:13.684 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:13.684 "is_configured": false, 00:26:13.684 "data_offset": 0, 00:26:13.684 "data_size": 65536 00:26:13.684 }, 00:26:13.684 { 00:26:13.684 "name": "BaseBdev3", 00:26:13.684 "uuid": "6aa53a52-f8b6-51b6-aabe-e9571cf33ab8", 00:26:13.684 "is_configured": true, 00:26:13.684 "data_offset": 0, 00:26:13.684 "data_size": 65536 00:26:13.684 }, 00:26:13.684 { 00:26:13.684 "name": "BaseBdev4", 00:26:13.684 "uuid": 
"f081b60c-50a5-5c67-a3dc-431e32bcb10b", 00:26:13.684 "is_configured": true, 00:26:13.684 "data_offset": 0, 00:26:13.684 "data_size": 65536 00:26:13.684 } 00:26:13.684 ] 00:26:13.684 }' 00:26:13.684 00:21:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:13.684 00:21:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:14.619 00:21:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:14.878 [2024-07-16 00:21:01.730737] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:14.878 [2024-07-16 00:21:01.730768] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:14.878 00:26:14.878 Latency(us) 00:26:14.878 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:14.878 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:26:14.878 raid_bdev1 : 12.05 94.14 282.41 0.00 0.00 14369.40 299.19 119446.48 00:26:14.878 =================================================================================================================== 00:26:14.878 Total : 94.14 282.41 0.00 0.00 14369.40 299.19 119446.48 00:26:14.878 [2024-07-16 00:21:01.782887] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:14.878 [2024-07-16 00:21:01.782915] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:14.878 [2024-07-16 00:21:01.783012] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:14.878 [2024-07-16 00:21:01.783024] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x232a8a0 name raid_bdev1, state offline 00:26:14.878 0 00:26:14.878 00:21:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:14.878 00:21:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:26:15.137 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:15.137 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:15.137 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:26:15.137 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:26:15.137 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:15.137 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:26:15.137 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:15.137 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:15.137 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:15.137 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:15.137 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:15.137 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:15.137 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:26:15.396 /dev/nbd0 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 
00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:15.396 1+0 records in 00:26:15.396 1+0 records out 00:26:15.396 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000301193 s, 13.6 MB/s 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in 
"${base_bdevs[@]:1}" 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:15.396 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:15.654 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:15.654 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:15.654 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:26:15.654 /dev/nbd1 00:26:15.654 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@867 -- # local i 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:15.912 1+0 records in 00:26:15.912 1+0 records out 00:26:15.912 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000301815 s, 13.6 MB/s 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:26:15.912 00:21:02 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:15.912 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 
00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:16.171 00:21:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:26:16.429 /dev/nbd1 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 
00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:16.429 1+0 records in 00:26:16.429 1+0 records out 00:26:16.429 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000297336 s, 13.8 MB/s 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:16.429 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:16.687 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:16.687 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:16.687 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:16.687 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:16.687 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:16.687 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:16.687 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:16.687 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:16.687 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:16.687 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:16.687 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:16.687 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:16.687 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:16.687 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:16.687 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename 
/dev/nbd0 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 3622548 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 3622548 ']' 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 3622548 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3622548 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3622548' 00:26:17.253 killing process with pid 3622548 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 3622548 00:26:17.253 Received 
shutdown signal, test time was about 14.228872 seconds 00:26:17.253 00:26:17.253 Latency(us) 00:26:17.253 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:17.253 =================================================================================================================== 00:26:17.253 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:17.253 [2024-07-16 00:21:03.966716] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:17.253 00:21:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 3622548 00:26:17.253 [2024-07-16 00:21:04.011661] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:17.511 00:21:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:26:17.511 00:26:17.511 real 0m21.498s 00:26:17.511 user 0m34.247s 00:26:17.511 sys 0m3.782s 00:26:17.511 00:21:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:17.511 00:21:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:17.511 ************************************ 00:26:17.511 END TEST raid_rebuild_test_io 00:26:17.511 ************************************ 00:26:17.511 00:21:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:17.511 00:21:04 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:26:17.511 00:21:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:17.511 00:21:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:17.511 00:21:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:17.512 ************************************ 00:26:17.512 START TEST raid_rebuild_test_sb_io 00:26:17.512 ************************************ 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:26:17.512 00:21:04 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:17.512 00:21:04 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=3625724 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 3625724 /var/tmp/spdk-raid.sock 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 3625724 ']' 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:26:17.512 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:17.512 00:21:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:17.512 [2024-07-16 00:21:04.410802] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:26:17.512 [2024-07-16 00:21:04.410872] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3625724 ] 00:26:17.512 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:17.512 Zero copy mechanism will not be used. 
00:26:17.770 [2024-07-16 00:21:04.541437] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:17.770 [2024-07-16 00:21:04.653086] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:17.770 [2024-07-16 00:21:04.715947] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:17.770 [2024-07-16 00:21:04.715978] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:18.704 00:21:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:18.704 00:21:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:26:18.704 00:21:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:18.704 00:21:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:19.272 BaseBdev1_malloc 00:26:19.272 00:21:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:19.530 [2024-07-16 00:21:06.359329] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:19.530 [2024-07-16 00:21:06.359378] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:19.530 [2024-07-16 00:21:06.359403] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c58d40 00:26:19.530 [2024-07-16 00:21:06.359416] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:19.530 [2024-07-16 00:21:06.361153] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:19.530 [2024-07-16 00:21:06.361182] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:19.530 
BaseBdev1 00:26:19.530 00:21:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:19.530 00:21:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:19.795 BaseBdev2_malloc 00:26:19.795 00:21:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:20.124 [2024-07-16 00:21:06.853526] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:20.124 [2024-07-16 00:21:06.853571] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:20.124 [2024-07-16 00:21:06.853597] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c59860 00:26:20.124 [2024-07-16 00:21:06.853610] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:20.124 [2024-07-16 00:21:06.855152] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:20.124 [2024-07-16 00:21:06.855179] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:20.124 BaseBdev2 00:26:20.124 00:21:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:20.124 00:21:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:26:20.383 BaseBdev3_malloc 00:26:20.383 00:21:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:26:20.641 [2024-07-16 
00:21:07.347446] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:26:20.641 [2024-07-16 00:21:07.347494] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:20.641 [2024-07-16 00:21:07.347517] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e068f0 00:26:20.641 [2024-07-16 00:21:07.347530] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:20.641 [2024-07-16 00:21:07.349128] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:20.641 [2024-07-16 00:21:07.349156] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:26:20.641 BaseBdev3 00:26:20.641 00:21:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:20.641 00:21:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:26:20.900 BaseBdev4_malloc 00:26:20.900 00:21:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:26:20.900 [2024-07-16 00:21:07.846643] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:26:20.900 [2024-07-16 00:21:07.846691] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:20.900 [2024-07-16 00:21:07.846713] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e05ad0 00:26:20.900 [2024-07-16 00:21:07.846726] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:20.900 [2024-07-16 00:21:07.848333] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:20.900 [2024-07-16 00:21:07.848361] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:26:20.900 BaseBdev4 00:26:21.159 00:21:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:26:21.159 spare_malloc 00:26:21.417 00:21:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:21.417 spare_delay 00:26:21.417 00:21:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:21.675 [2024-07-16 00:21:08.574383] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:21.675 [2024-07-16 00:21:08.574430] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:21.675 [2024-07-16 00:21:08.574453] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e0a5b0 00:26:21.675 [2024-07-16 00:21:08.574465] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:21.675 [2024-07-16 00:21:08.576068] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:21.675 [2024-07-16 00:21:08.576096] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:21.675 spare 00:26:21.675 00:21:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:26:21.934 [2024-07-16 00:21:08.807035] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:21.934 [2024-07-16 
00:21:08.808355] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:21.934 [2024-07-16 00:21:08.808409] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:21.934 [2024-07-16 00:21:08.808455] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:21.934 [2024-07-16 00:21:08.808655] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d898a0 00:26:21.934 [2024-07-16 00:21:08.808667] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:21.934 [2024-07-16 00:21:08.808872] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e03e10 00:26:21.934 [2024-07-16 00:21:08.809033] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d898a0 00:26:21.934 [2024-07-16 00:21:08.809044] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d898a0 00:26:21.934 [2024-07-16 00:21:08.809142] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:21.934 00:21:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:26:21.934 00:21:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:21.934 00:21:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:21.934 00:21:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:21.934 00:21:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:21.934 00:21:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:21.934 00:21:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:21.934 00:21:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 
-- # local num_base_bdevs 00:26:21.934 00:21:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:21.934 00:21:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:21.934 00:21:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:21.934 00:21:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:22.503 00:21:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:22.503 "name": "raid_bdev1", 00:26:22.503 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:22.503 "strip_size_kb": 0, 00:26:22.503 "state": "online", 00:26:22.503 "raid_level": "raid1", 00:26:22.503 "superblock": true, 00:26:22.503 "num_base_bdevs": 4, 00:26:22.503 "num_base_bdevs_discovered": 4, 00:26:22.503 "num_base_bdevs_operational": 4, 00:26:22.503 "base_bdevs_list": [ 00:26:22.503 { 00:26:22.503 "name": "BaseBdev1", 00:26:22.503 "uuid": "3c79271d-b72a-56fd-b7e8-b130b7089fb8", 00:26:22.503 "is_configured": true, 00:26:22.503 "data_offset": 2048, 00:26:22.503 "data_size": 63488 00:26:22.503 }, 00:26:22.503 { 00:26:22.503 "name": "BaseBdev2", 00:26:22.503 "uuid": "be625fab-2117-5204-ac35-9e625195da60", 00:26:22.503 "is_configured": true, 00:26:22.503 "data_offset": 2048, 00:26:22.503 "data_size": 63488 00:26:22.503 }, 00:26:22.503 { 00:26:22.503 "name": "BaseBdev3", 00:26:22.503 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:22.503 "is_configured": true, 00:26:22.503 "data_offset": 2048, 00:26:22.503 "data_size": 63488 00:26:22.503 }, 00:26:22.503 { 00:26:22.503 "name": "BaseBdev4", 00:26:22.503 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:22.503 "is_configured": true, 00:26:22.503 "data_offset": 2048, 00:26:22.503 "data_size": 63488 00:26:22.503 } 00:26:22.503 ] 
00:26:22.503 }' 00:26:22.503 00:21:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:22.503 00:21:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:23.440 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:23.440 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:23.699 [2024-07-16 00:21:10.403548] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:23.699 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:26:23.699 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:23.699 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:23.958 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:26:23.958 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:26:23.958 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:23.958 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:26:23.958 [2024-07-16 00:21:10.794417] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c58670 00:26:23.958 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:23.958 Zero copy mechanism will not be used. 
00:26:23.958 Running I/O for 60 seconds... 00:26:24.217 [2024-07-16 00:21:10.916884] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:24.217 [2024-07-16 00:21:10.935933] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1c58670 00:26:24.217 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:24.217 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:24.217 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:24.217 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:24.217 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:24.217 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:24.217 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:24.217 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:24.217 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:24.217 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:24.217 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:24.217 00:21:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:24.476 00:21:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:24.476 "name": "raid_bdev1", 00:26:24.476 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:24.476 "strip_size_kb": 0, 00:26:24.476 "state": 
"online", 00:26:24.476 "raid_level": "raid1", 00:26:24.476 "superblock": true, 00:26:24.476 "num_base_bdevs": 4, 00:26:24.476 "num_base_bdevs_discovered": 3, 00:26:24.476 "num_base_bdevs_operational": 3, 00:26:24.476 "base_bdevs_list": [ 00:26:24.476 { 00:26:24.476 "name": null, 00:26:24.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:24.476 "is_configured": false, 00:26:24.476 "data_offset": 2048, 00:26:24.476 "data_size": 63488 00:26:24.476 }, 00:26:24.476 { 00:26:24.476 "name": "BaseBdev2", 00:26:24.476 "uuid": "be625fab-2117-5204-ac35-9e625195da60", 00:26:24.476 "is_configured": true, 00:26:24.476 "data_offset": 2048, 00:26:24.476 "data_size": 63488 00:26:24.476 }, 00:26:24.476 { 00:26:24.476 "name": "BaseBdev3", 00:26:24.476 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:24.476 "is_configured": true, 00:26:24.476 "data_offset": 2048, 00:26:24.476 "data_size": 63488 00:26:24.476 }, 00:26:24.476 { 00:26:24.476 "name": "BaseBdev4", 00:26:24.476 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:24.476 "is_configured": true, 00:26:24.476 "data_offset": 2048, 00:26:24.476 "data_size": 63488 00:26:24.476 } 00:26:24.476 ] 00:26:24.476 }' 00:26:24.476 00:21:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:24.476 00:21:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:25.042 00:21:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:25.301 [2024-07-16 00:21:12.050312] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:25.301 00:21:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:25.301 [2024-07-16 00:21:12.102562] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d8bba0 00:26:25.301 [2024-07-16 00:21:12.104998] 
bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:25.301 [2024-07-16 00:21:12.231822] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:25.301 [2024-07-16 00:21:12.233097] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:25.559 [2024-07-16 00:21:12.443464] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:25.559 [2024-07-16 00:21:12.443609] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:26.126 [2024-07-16 00:21:12.815180] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:26.126 [2024-07-16 00:21:13.058358] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:26.385 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:26.385 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:26.385 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:26.385 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:26.385 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:26.385 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.385 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:26.643 00:21:13 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:26.643 "name": "raid_bdev1", 00:26:26.643 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:26.643 "strip_size_kb": 0, 00:26:26.643 "state": "online", 00:26:26.643 "raid_level": "raid1", 00:26:26.643 "superblock": true, 00:26:26.643 "num_base_bdevs": 4, 00:26:26.643 "num_base_bdevs_discovered": 4, 00:26:26.643 "num_base_bdevs_operational": 4, 00:26:26.643 "process": { 00:26:26.643 "type": "rebuild", 00:26:26.643 "target": "spare", 00:26:26.643 "progress": { 00:26:26.643 "blocks": 10240, 00:26:26.643 "percent": 16 00:26:26.643 } 00:26:26.643 }, 00:26:26.643 "base_bdevs_list": [ 00:26:26.643 { 00:26:26.643 "name": "spare", 00:26:26.643 "uuid": "2bca1816-9a20-57ac-bbb1-01704255e840", 00:26:26.643 "is_configured": true, 00:26:26.643 "data_offset": 2048, 00:26:26.643 "data_size": 63488 00:26:26.643 }, 00:26:26.643 { 00:26:26.643 "name": "BaseBdev2", 00:26:26.643 "uuid": "be625fab-2117-5204-ac35-9e625195da60", 00:26:26.643 "is_configured": true, 00:26:26.643 "data_offset": 2048, 00:26:26.643 "data_size": 63488 00:26:26.643 }, 00:26:26.643 { 00:26:26.643 "name": "BaseBdev3", 00:26:26.643 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:26.643 "is_configured": true, 00:26:26.643 "data_offset": 2048, 00:26:26.643 "data_size": 63488 00:26:26.643 }, 00:26:26.643 { 00:26:26.643 "name": "BaseBdev4", 00:26:26.643 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:26.643 "is_configured": true, 00:26:26.643 "data_offset": 2048, 00:26:26.643 "data_size": 63488 00:26:26.643 } 00:26:26.643 ] 00:26:26.643 }' 00:26:26.643 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:26.643 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:26.643 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:26.643 00:21:13 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:26.643 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:26.643 [2024-07-16 00:21:13.505896] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:26.902 [2024-07-16 00:21:13.668233] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:26.902 [2024-07-16 00:21:13.739245] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:26.902 [2024-07-16 00:21:13.756516] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:26.902 [2024-07-16 00:21:13.771950] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:26.902 [2024-07-16 00:21:13.771983] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:26.902 [2024-07-16 00:21:13.771994] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:26.902 [2024-07-16 00:21:13.800889] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1c58670 00:26:26.902 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:26.902 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:26.902 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:26.902 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:26.902 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:26.902 00:21:13 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:26.902 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:26.902 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:26.902 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:26.902 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:26.902 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.902 00:21:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:27.161 00:21:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:27.161 "name": "raid_bdev1", 00:26:27.161 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:27.161 "strip_size_kb": 0, 00:26:27.161 "state": "online", 00:26:27.161 "raid_level": "raid1", 00:26:27.161 "superblock": true, 00:26:27.161 "num_base_bdevs": 4, 00:26:27.161 "num_base_bdevs_discovered": 3, 00:26:27.161 "num_base_bdevs_operational": 3, 00:26:27.161 "base_bdevs_list": [ 00:26:27.161 { 00:26:27.161 "name": null, 00:26:27.161 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:27.161 "is_configured": false, 00:26:27.161 "data_offset": 2048, 00:26:27.161 "data_size": 63488 00:26:27.161 }, 00:26:27.161 { 00:26:27.161 "name": "BaseBdev2", 00:26:27.162 "uuid": "be625fab-2117-5204-ac35-9e625195da60", 00:26:27.162 "is_configured": true, 00:26:27.162 "data_offset": 2048, 00:26:27.162 "data_size": 63488 00:26:27.162 }, 00:26:27.162 { 00:26:27.162 "name": "BaseBdev3", 00:26:27.162 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:27.162 "is_configured": true, 00:26:27.162 "data_offset": 2048, 00:26:27.162 
"data_size": 63488 00:26:27.162 }, 00:26:27.162 { 00:26:27.162 "name": "BaseBdev4", 00:26:27.162 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:27.162 "is_configured": true, 00:26:27.162 "data_offset": 2048, 00:26:27.162 "data_size": 63488 00:26:27.162 } 00:26:27.162 ] 00:26:27.162 }' 00:26:27.162 00:21:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:27.162 00:21:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:28.100 00:21:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:28.100 00:21:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:28.100 00:21:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:28.100 00:21:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:28.100 00:21:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:28.100 00:21:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:28.100 00:21:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:28.100 00:21:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:28.100 "name": "raid_bdev1", 00:26:28.100 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:28.100 "strip_size_kb": 0, 00:26:28.100 "state": "online", 00:26:28.100 "raid_level": "raid1", 00:26:28.100 "superblock": true, 00:26:28.100 "num_base_bdevs": 4, 00:26:28.100 "num_base_bdevs_discovered": 3, 00:26:28.100 "num_base_bdevs_operational": 3, 00:26:28.100 "base_bdevs_list": [ 00:26:28.100 { 00:26:28.100 "name": null, 00:26:28.100 "uuid": "00000000-0000-0000-0000-000000000000", 
00:26:28.100 "is_configured": false, 00:26:28.100 "data_offset": 2048, 00:26:28.100 "data_size": 63488 00:26:28.100 }, 00:26:28.100 { 00:26:28.100 "name": "BaseBdev2", 00:26:28.100 "uuid": "be625fab-2117-5204-ac35-9e625195da60", 00:26:28.100 "is_configured": true, 00:26:28.100 "data_offset": 2048, 00:26:28.100 "data_size": 63488 00:26:28.100 }, 00:26:28.100 { 00:26:28.100 "name": "BaseBdev3", 00:26:28.100 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:28.100 "is_configured": true, 00:26:28.100 "data_offset": 2048, 00:26:28.100 "data_size": 63488 00:26:28.100 }, 00:26:28.100 { 00:26:28.100 "name": "BaseBdev4", 00:26:28.100 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:28.100 "is_configured": true, 00:26:28.100 "data_offset": 2048, 00:26:28.100 "data_size": 63488 00:26:28.100 } 00:26:28.100 ] 00:26:28.100 }' 00:26:28.100 00:21:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:28.100 00:21:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:28.100 00:21:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:28.359 00:21:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:28.359 00:21:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:28.359 [2024-07-16 00:21:15.299237] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:28.619 00:21:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:28.619 [2024-07-16 00:21:15.371711] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dfead0 00:26:28.619 [2024-07-16 00:21:15.373233] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:28.619 
[2024-07-16 00:21:15.484306] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:28.619 [2024-07-16 00:21:15.484715] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:28.878 [2024-07-16 00:21:15.607216] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:28.878 [2024-07-16 00:21:15.607787] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:29.446 [2024-07-16 00:21:16.341086] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:29.446 00:21:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:29.446 00:21:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:29.446 00:21:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:29.446 00:21:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:29.446 00:21:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:29.446 00:21:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:29.446 00:21:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:29.706 [2024-07-16 00:21:16.577243] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:29.706 00:21:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:29.706 "name": "raid_bdev1", 
00:26:29.706 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:29.706 "strip_size_kb": 0, 00:26:29.706 "state": "online", 00:26:29.706 "raid_level": "raid1", 00:26:29.706 "superblock": true, 00:26:29.706 "num_base_bdevs": 4, 00:26:29.706 "num_base_bdevs_discovered": 4, 00:26:29.706 "num_base_bdevs_operational": 4, 00:26:29.706 "process": { 00:26:29.706 "type": "rebuild", 00:26:29.706 "target": "spare", 00:26:29.707 "progress": { 00:26:29.707 "blocks": 16384, 00:26:29.707 "percent": 25 00:26:29.707 } 00:26:29.707 }, 00:26:29.707 "base_bdevs_list": [ 00:26:29.707 { 00:26:29.707 "name": "spare", 00:26:29.707 "uuid": "2bca1816-9a20-57ac-bbb1-01704255e840", 00:26:29.707 "is_configured": true, 00:26:29.707 "data_offset": 2048, 00:26:29.707 "data_size": 63488 00:26:29.707 }, 00:26:29.707 { 00:26:29.707 "name": "BaseBdev2", 00:26:29.707 "uuid": "be625fab-2117-5204-ac35-9e625195da60", 00:26:29.707 "is_configured": true, 00:26:29.707 "data_offset": 2048, 00:26:29.707 "data_size": 63488 00:26:29.707 }, 00:26:29.707 { 00:26:29.707 "name": "BaseBdev3", 00:26:29.707 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:29.707 "is_configured": true, 00:26:29.707 "data_offset": 2048, 00:26:29.707 "data_size": 63488 00:26:29.707 }, 00:26:29.707 { 00:26:29.707 "name": "BaseBdev4", 00:26:29.707 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:29.707 "is_configured": true, 00:26:29.707 "data_offset": 2048, 00:26:29.707 "data_size": 63488 00:26:29.707 } 00:26:29.707 ] 00:26:29.707 }' 00:26:29.707 00:21:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:29.966 00:21:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:29.966 00:21:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:29.966 00:21:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:29.966 00:21:16 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:26:29.966 00:21:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:26:29.966 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:26:29.966 00:21:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:26:29.966 00:21:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:29.966 00:21:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:26:29.966 00:21:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:26:29.966 [2024-07-16 00:21:16.837055] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:29.966 [2024-07-16 00:21:16.883604] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:30.535 [2024-07-16 00:21:17.232832] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1c58670 00:26:30.535 [2024-07-16 00:21:17.232866] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1dfead0 00:26:30.535 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:26:30.535 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:26:30.536 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:30.536 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:30.536 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:30.536 00:21:17 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:30.536 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:30.536 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:30.536 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:31.103 [2024-07-16 00:21:17.797804] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:26:31.103 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:31.103 "name": "raid_bdev1", 00:26:31.103 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:31.103 "strip_size_kb": 0, 00:26:31.103 "state": "online", 00:26:31.103 "raid_level": "raid1", 00:26:31.103 "superblock": true, 00:26:31.103 "num_base_bdevs": 4, 00:26:31.103 "num_base_bdevs_discovered": 3, 00:26:31.103 "num_base_bdevs_operational": 3, 00:26:31.103 "process": { 00:26:31.103 "type": "rebuild", 00:26:31.103 "target": "spare", 00:26:31.103 "progress": { 00:26:31.103 "blocks": 30720, 00:26:31.103 "percent": 48 00:26:31.103 } 00:26:31.103 }, 00:26:31.103 "base_bdevs_list": [ 00:26:31.103 { 00:26:31.103 "name": "spare", 00:26:31.103 "uuid": "2bca1816-9a20-57ac-bbb1-01704255e840", 00:26:31.103 "is_configured": true, 00:26:31.103 "data_offset": 2048, 00:26:31.103 "data_size": 63488 00:26:31.103 }, 00:26:31.103 { 00:26:31.103 "name": null, 00:26:31.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:31.103 "is_configured": false, 00:26:31.103 "data_offset": 2048, 00:26:31.103 "data_size": 63488 00:26:31.103 }, 00:26:31.103 { 00:26:31.103 "name": "BaseBdev3", 00:26:31.103 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:31.103 "is_configured": true, 00:26:31.103 
"data_offset": 2048, 00:26:31.103 "data_size": 63488 00:26:31.103 }, 00:26:31.103 { 00:26:31.103 "name": "BaseBdev4", 00:26:31.103 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:31.103 "is_configured": true, 00:26:31.103 "data_offset": 2048, 00:26:31.103 "data_size": 63488 00:26:31.103 } 00:26:31.103 ] 00:26:31.103 }' 00:26:31.103 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:31.103 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:31.103 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:31.103 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:31.103 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=987 00:26:31.103 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:31.103 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:31.103 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:31.103 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:31.103 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:31.103 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:31.103 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:31.104 00:21:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:31.362 00:21:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 
-- # raid_bdev_info='{ 00:26:31.362 "name": "raid_bdev1", 00:26:31.362 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:31.362 "strip_size_kb": 0, 00:26:31.362 "state": "online", 00:26:31.362 "raid_level": "raid1", 00:26:31.362 "superblock": true, 00:26:31.362 "num_base_bdevs": 4, 00:26:31.362 "num_base_bdevs_discovered": 3, 00:26:31.362 "num_base_bdevs_operational": 3, 00:26:31.362 "process": { 00:26:31.362 "type": "rebuild", 00:26:31.362 "target": "spare", 00:26:31.362 "progress": { 00:26:31.362 "blocks": 36864, 00:26:31.362 "percent": 58 00:26:31.362 } 00:26:31.362 }, 00:26:31.362 "base_bdevs_list": [ 00:26:31.362 { 00:26:31.362 "name": "spare", 00:26:31.362 "uuid": "2bca1816-9a20-57ac-bbb1-01704255e840", 00:26:31.362 "is_configured": true, 00:26:31.362 "data_offset": 2048, 00:26:31.362 "data_size": 63488 00:26:31.362 }, 00:26:31.362 { 00:26:31.362 "name": null, 00:26:31.362 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:31.362 "is_configured": false, 00:26:31.362 "data_offset": 2048, 00:26:31.362 "data_size": 63488 00:26:31.362 }, 00:26:31.362 { 00:26:31.362 "name": "BaseBdev3", 00:26:31.362 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:31.362 "is_configured": true, 00:26:31.362 "data_offset": 2048, 00:26:31.362 "data_size": 63488 00:26:31.362 }, 00:26:31.362 { 00:26:31.362 "name": "BaseBdev4", 00:26:31.362 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:31.362 "is_configured": true, 00:26:31.362 "data_offset": 2048, 00:26:31.362 "data_size": 63488 00:26:31.362 } 00:26:31.362 ] 00:26:31.362 }' 00:26:31.362 00:21:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:31.362 00:21:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:31.362 00:21:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:31.362 [2024-07-16 00:21:18.151827] bdev_raid.c: 839:raid_bdev_submit_rw_request: 
*DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:26:31.362 [2024-07-16 00:21:18.152265] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:26:31.362 00:21:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:31.362 00:21:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:32.298 [2024-07-16 00:21:18.987260] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:26:32.298 00:21:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:32.298 00:21:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:32.298 00:21:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:32.298 00:21:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:32.298 00:21:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:32.298 00:21:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:32.298 00:21:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:32.298 00:21:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:32.558 00:21:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:32.558 "name": "raid_bdev1", 00:26:32.558 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:32.558 "strip_size_kb": 0, 00:26:32.558 "state": "online", 00:26:32.558 "raid_level": "raid1", 00:26:32.558 "superblock": true, 00:26:32.558 "num_base_bdevs": 4, 
00:26:32.558 "num_base_bdevs_discovered": 3, 00:26:32.558 "num_base_bdevs_operational": 3, 00:26:32.558 "process": { 00:26:32.558 "type": "rebuild", 00:26:32.558 "target": "spare", 00:26:32.558 "progress": { 00:26:32.558 "blocks": 55296, 00:26:32.558 "percent": 87 00:26:32.558 } 00:26:32.558 }, 00:26:32.558 "base_bdevs_list": [ 00:26:32.558 { 00:26:32.558 "name": "spare", 00:26:32.558 "uuid": "2bca1816-9a20-57ac-bbb1-01704255e840", 00:26:32.558 "is_configured": true, 00:26:32.558 "data_offset": 2048, 00:26:32.558 "data_size": 63488 00:26:32.558 }, 00:26:32.558 { 00:26:32.558 "name": null, 00:26:32.558 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:32.558 "is_configured": false, 00:26:32.558 "data_offset": 2048, 00:26:32.558 "data_size": 63488 00:26:32.558 }, 00:26:32.558 { 00:26:32.558 "name": "BaseBdev3", 00:26:32.558 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:32.558 "is_configured": true, 00:26:32.558 "data_offset": 2048, 00:26:32.558 "data_size": 63488 00:26:32.558 }, 00:26:32.558 { 00:26:32.558 "name": "BaseBdev4", 00:26:32.558 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:32.558 "is_configured": true, 00:26:32.558 "data_offset": 2048, 00:26:32.558 "data_size": 63488 00:26:32.558 } 00:26:32.558 ] 00:26:32.558 }' 00:26:32.558 00:21:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:32.558 00:21:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:32.558 00:21:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:32.558 [2024-07-16 00:21:19.448872] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:26:32.558 [2024-07-16 00:21:19.449115] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:26:32.558 00:21:19 
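The polling step traced repeatedly above (bdev_raid.sh@187-190) fetches every raid bdev over the RPC socket, isolates `raid_bdev1` with `jq`, then reads the rebuild fields with a `// "none"` fallback. A sketch of the jq half against a trimmed sample of the `bdev_raid_get_bdevs` output captured in this log:

```shell
# Trimmed sample of the bdev_raid_get_bdevs output shown above.
bdevs='[{"name":"raid_bdev1","state":"online",
        "process":{"type":"rebuild","target":"spare"}}]'

# Isolate the bdev of interest, exactly as bdev_raid.sh@187 does.
raid_bdev_info=$(echo "$bdevs" | jq -r '.[] | select(.name == "raid_bdev1")')

# The // operator supplies a default once .process disappears post-rebuild.
ptype=$(echo "$raid_bdev_info" | jq -r '.process.type // "none"')
ptarget=$(echo "$raid_bdev_info" | jq -r '.process.target // "none"')
echo "$ptype $ptarget"

# After the rebuild completes the .process object is gone, so the
# fallback is what lets the loop's [[ none == rebuild ]] check fail.
echo '{"name":"raid_bdev1"}' | jq -r '.process.type // "none"'
```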
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:32.558 00:21:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:33.126 [2024-07-16 00:21:19.783085] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:33.126 [2024-07-16 00:21:19.891359] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:33.126 [2024-07-16 00:21:19.893743] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:33.693 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:33.693 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:33.693 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:33.693 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:33.693 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:33.693 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:33.693 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:33.693 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:33.952 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:33.952 "name": "raid_bdev1", 00:26:33.952 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:33.952 "strip_size_kb": 0, 00:26:33.952 "state": "online", 00:26:33.952 "raid_level": "raid1", 00:26:33.952 "superblock": true, 00:26:33.952 "num_base_bdevs": 4, 00:26:33.952 "num_base_bdevs_discovered": 3, 
00:26:33.952 "num_base_bdevs_operational": 3, 00:26:33.952 "base_bdevs_list": [ 00:26:33.952 { 00:26:33.952 "name": "spare", 00:26:33.952 "uuid": "2bca1816-9a20-57ac-bbb1-01704255e840", 00:26:33.952 "is_configured": true, 00:26:33.952 "data_offset": 2048, 00:26:33.952 "data_size": 63488 00:26:33.952 }, 00:26:33.952 { 00:26:33.952 "name": null, 00:26:33.952 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:33.952 "is_configured": false, 00:26:33.952 "data_offset": 2048, 00:26:33.952 "data_size": 63488 00:26:33.952 }, 00:26:33.952 { 00:26:33.952 "name": "BaseBdev3", 00:26:33.952 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:33.952 "is_configured": true, 00:26:33.952 "data_offset": 2048, 00:26:33.952 "data_size": 63488 00:26:33.952 }, 00:26:33.952 { 00:26:33.952 "name": "BaseBdev4", 00:26:33.952 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:33.952 "is_configured": true, 00:26:33.952 "data_offset": 2048, 00:26:33.952 "data_size": 63488 00:26:33.952 } 00:26:33.952 ] 00:26:33.952 }' 00:26:33.952 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:33.952 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:33.952 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:33.952 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:33.952 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:26:33.952 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:33.952 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:33.952 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:33.952 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@184 -- # local target=none 00:26:33.952 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:33.952 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:33.952 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:34.211 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:34.211 "name": "raid_bdev1", 00:26:34.211 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:34.211 "strip_size_kb": 0, 00:26:34.211 "state": "online", 00:26:34.211 "raid_level": "raid1", 00:26:34.211 "superblock": true, 00:26:34.211 "num_base_bdevs": 4, 00:26:34.211 "num_base_bdevs_discovered": 3, 00:26:34.211 "num_base_bdevs_operational": 3, 00:26:34.211 "base_bdevs_list": [ 00:26:34.211 { 00:26:34.211 "name": "spare", 00:26:34.211 "uuid": "2bca1816-9a20-57ac-bbb1-01704255e840", 00:26:34.211 "is_configured": true, 00:26:34.211 "data_offset": 2048, 00:26:34.211 "data_size": 63488 00:26:34.211 }, 00:26:34.211 { 00:26:34.211 "name": null, 00:26:34.211 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:34.211 "is_configured": false, 00:26:34.211 "data_offset": 2048, 00:26:34.211 "data_size": 63488 00:26:34.211 }, 00:26:34.211 { 00:26:34.211 "name": "BaseBdev3", 00:26:34.211 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:34.211 "is_configured": true, 00:26:34.211 "data_offset": 2048, 00:26:34.211 "data_size": 63488 00:26:34.211 }, 00:26:34.211 { 00:26:34.211 "name": "BaseBdev4", 00:26:34.211 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:34.212 "is_configured": true, 00:26:34.212 "data_offset": 2048, 00:26:34.212 "data_size": 63488 00:26:34.212 } 00:26:34.212 ] 00:26:34.212 }' 00:26:34.212 00:21:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:26:34.212 00:21:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:34.212 00:21:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:34.212 00:21:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:34.212 00:21:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:34.212 00:21:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:34.212 00:21:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:34.212 00:21:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:34.212 00:21:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:34.212 00:21:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:34.212 00:21:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:34.212 00:21:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:34.212 00:21:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:34.212 00:21:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:34.212 00:21:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:34.212 00:21:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:34.471 00:21:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:34.471 "name": "raid_bdev1", 00:26:34.471 "uuid": 
"1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:34.471 "strip_size_kb": 0, 00:26:34.471 "state": "online", 00:26:34.471 "raid_level": "raid1", 00:26:34.471 "superblock": true, 00:26:34.471 "num_base_bdevs": 4, 00:26:34.471 "num_base_bdevs_discovered": 3, 00:26:34.471 "num_base_bdevs_operational": 3, 00:26:34.471 "base_bdevs_list": [ 00:26:34.471 { 00:26:34.471 "name": "spare", 00:26:34.471 "uuid": "2bca1816-9a20-57ac-bbb1-01704255e840", 00:26:34.471 "is_configured": true, 00:26:34.471 "data_offset": 2048, 00:26:34.471 "data_size": 63488 00:26:34.471 }, 00:26:34.471 { 00:26:34.471 "name": null, 00:26:34.471 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:34.471 "is_configured": false, 00:26:34.471 "data_offset": 2048, 00:26:34.471 "data_size": 63488 00:26:34.471 }, 00:26:34.471 { 00:26:34.471 "name": "BaseBdev3", 00:26:34.471 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:34.471 "is_configured": true, 00:26:34.471 "data_offset": 2048, 00:26:34.471 "data_size": 63488 00:26:34.471 }, 00:26:34.471 { 00:26:34.471 "name": "BaseBdev4", 00:26:34.471 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:34.471 "is_configured": true, 00:26:34.471 "data_offset": 2048, 00:26:34.471 "data_size": 63488 00:26:34.471 } 00:26:34.471 ] 00:26:34.471 }' 00:26:34.471 00:21:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:34.471 00:21:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:35.037 00:21:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:35.296 [2024-07-16 00:21:22.154593] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:35.296 [2024-07-16 00:21:22.154627] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:35.296 00:26:35.296 Latency(us) 00:26:35.296 Device Information : 
runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:35.296 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:26:35.296 raid_bdev1 : 11.39 87.71 263.12 0.00 0.00 15948.33 290.28 127652.73 00:26:35.296 =================================================================================================================== 00:26:35.296 Total : 87.71 263.12 0.00 0.00 15948.33 290.28 127652.73 00:26:35.296 [2024-07-16 00:21:22.218690] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:35.296 [2024-07-16 00:21:22.218717] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:35.296 [2024-07-16 00:21:22.218808] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:35.296 [2024-07-16 00:21:22.218827] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d898a0 name raid_bdev1, state offline 00:26:35.296 0 00:26:35.554 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:35.554 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:26:35.813 00:21:22 
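Earlier in the trace (bdev_raid.sh@705-710) the rebuild wait is bounded with bash's built-in `SECONDS` counter, re-checking the process state once per second. A minimal sketch of that loop shape; the polling body below is a stand-in for the real RPC/jq query:

```shell
# SECONDS counts seconds since the shell started; the log shows
# timeout=987, i.e. SECONDS plus a grace period at loop entry.
timeout=$((SECONDS + 5))

process_type=rebuild       # stand-in for: jq -r '.process.type // "none"'
polls=0
while (( SECONDS < timeout )); do
    (( polls++ ))
    [[ $process_type == rebuild ]] || break
    process_type=none      # pretend the rebuild finished on this pass
    sleep 1
done
echo "polled $polls time(s), final state: $process_type"
```

In the real script the loop either breaks when `.process.type` stops reporting `rebuild` (as happens above once the `Finished rebuild` notice lands) or the `SECONDS < timeout` guard expires and the test fails.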
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:26:35.813 /dev/nbd0 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:35.813 1+0 records in 00:26:35.813 1+0 records out 00:26:35.813 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000211676 s, 19.4 MB/s 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # 
bdev_list=('BaseBdev3') 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:35.813 00:21:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:26:36.072 /dev/nbd1 00:26:36.072 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:36.072 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:36.072 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:36.072 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:26:36.072 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:36.072 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:36.072 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:36.072 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:26:36.072 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:36.072 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:36.072 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:36.072 1+0 records in 00:26:36.072 1+0 records out 00:26:36.072 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000319244 s, 12.8 MB/s 00:26:36.072 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:36.329 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:26:36.329 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:36.329 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:36.329 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:26:36.329 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:36.329 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:36.329 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:36.329 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:36.329 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:36.329 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:36.329 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:36.329 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:36.329 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:36.330 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:36.586 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:36.586 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:36.586 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:36.586 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:36.586 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:36.586 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:36.586 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:36.586 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:36.586 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:36.586 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:26:36.586 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:26:36.586 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:36.586 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:26:36.586 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:36.586 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:36.586 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:36.586 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:36.586 00:21:23 
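The integrity check at bdev_raid.sh@730 above compares the exported spare against each surviving base bdev with `cmp -i 1048576`, skipping the first 1 MiB of both devices so the superblock region (the `data_offset` of 2048 blocks reported in the JSON) is excluded. A sketch with ordinary temp files standing in for the /dev/nbd* devices:

```shell
a=$(mktemp)
b=$(mktemp)
head -c 2097152 /dev/urandom > "$a"   # 2 MiB of payload
cp "$a" "$b"

# Corrupt only the first 16 bytes of b: entirely inside the skipped region.
dd if=/dev/zero of="$b" bs=1 count=16 conv=notrunc 2>/dev/null

# -i / --ignore-initial skips 1048576 bytes of both inputs before comparing,
# so the header difference is invisible and only the payload is checked.
if cmp -i 1048576 "$a" "$b"; then payload_ok=1; else payload_ok=0; fi
echo "payload_ok=$payload_ok"
rm -f "$a" "$b"
```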
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:36.586 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:36.586 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:26:36.844 /dev/nbd1 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:36.844 1+0 records in 00:26:36.844 1+0 records out 00:26:36.844 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000216465 s, 18.9 MB/s 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:36.844 00:21:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:37.130 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:37.130 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:37.130 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 
00:26:37.130 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:37.130 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:37.130 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:37.130 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:37.130 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:37.130 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:37.130 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:37.130 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:37.130 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:37.130 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:37.130 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:37.130 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:37.394 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:37.394 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:37.394 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:37.394 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:37.394 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:37.394 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 
/proc/partitions 00:26:37.394 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:37.394 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:37.394 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:26:37.394 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:37.651 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:37.909 [2024-07-16 00:21:24.776665] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:37.909 [2024-07-16 00:21:24.776714] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:37.909 [2024-07-16 00:21:24.776736] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c57ae0 00:26:37.909 [2024-07-16 00:21:24.776749] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:37.909 [2024-07-16 00:21:24.778395] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:37.909 [2024-07-16 00:21:24.778424] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:37.909 [2024-07-16 00:21:24.778505] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:37.909 [2024-07-16 00:21:24.778530] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:37.909 [2024-07-16 00:21:24.778637] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:37.909 [2024-07-16 00:21:24.778710] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:37.909 spare 00:26:37.909 00:21:24 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:37.909 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:37.909 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:37.909 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:37.909 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:37.909 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:37.909 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:37.909 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:37.909 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:37.909 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:37.909 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:37.909 00:21:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:38.167 [2024-07-16 00:21:24.879026] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d8bf90 00:26:38.167 [2024-07-16 00:21:24.879043] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:38.167 [2024-07-16 00:21:24.879236] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d89f10 00:26:38.167 [2024-07-16 00:21:24.879390] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d8bf90 00:26:38.167 [2024-07-16 00:21:24.879401] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d8bf90 00:26:38.167 [2024-07-16 00:21:24.879506] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:38.167 00:21:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:38.167 "name": "raid_bdev1", 00:26:38.167 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:38.167 "strip_size_kb": 0, 00:26:38.167 "state": "online", 00:26:38.167 "raid_level": "raid1", 00:26:38.167 "superblock": true, 00:26:38.167 "num_base_bdevs": 4, 00:26:38.167 "num_base_bdevs_discovered": 3, 00:26:38.167 "num_base_bdevs_operational": 3, 00:26:38.167 "base_bdevs_list": [ 00:26:38.167 { 00:26:38.167 "name": "spare", 00:26:38.167 "uuid": "2bca1816-9a20-57ac-bbb1-01704255e840", 00:26:38.167 "is_configured": true, 00:26:38.167 "data_offset": 2048, 00:26:38.167 "data_size": 63488 00:26:38.167 }, 00:26:38.167 { 00:26:38.167 "name": null, 00:26:38.167 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:38.167 "is_configured": false, 00:26:38.167 "data_offset": 2048, 00:26:38.167 "data_size": 63488 00:26:38.167 }, 00:26:38.167 { 00:26:38.167 "name": "BaseBdev3", 00:26:38.167 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:38.167 "is_configured": true, 00:26:38.167 "data_offset": 2048, 00:26:38.167 "data_size": 63488 00:26:38.167 }, 00:26:38.167 { 00:26:38.167 "name": "BaseBdev4", 00:26:38.167 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:38.167 "is_configured": true, 00:26:38.167 "data_offset": 2048, 00:26:38.167 "data_size": 63488 00:26:38.167 } 00:26:38.167 ] 00:26:38.167 }' 00:26:38.167 00:21:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:38.167 00:21:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:38.733 00:21:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:38.733 
00:21:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:38.733 00:21:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:38.733 00:21:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:38.733 00:21:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:38.733 00:21:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:38.733 00:21:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:38.991 00:21:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:38.991 "name": "raid_bdev1", 00:26:38.991 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:38.991 "strip_size_kb": 0, 00:26:38.991 "state": "online", 00:26:38.991 "raid_level": "raid1", 00:26:38.991 "superblock": true, 00:26:38.991 "num_base_bdevs": 4, 00:26:38.991 "num_base_bdevs_discovered": 3, 00:26:38.991 "num_base_bdevs_operational": 3, 00:26:38.991 "base_bdevs_list": [ 00:26:38.991 { 00:26:38.991 "name": "spare", 00:26:38.991 "uuid": "2bca1816-9a20-57ac-bbb1-01704255e840", 00:26:38.991 "is_configured": true, 00:26:38.991 "data_offset": 2048, 00:26:38.991 "data_size": 63488 00:26:38.991 }, 00:26:38.991 { 00:26:38.991 "name": null, 00:26:38.991 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:38.991 "is_configured": false, 00:26:38.991 "data_offset": 2048, 00:26:38.991 "data_size": 63488 00:26:38.991 }, 00:26:38.991 { 00:26:38.991 "name": "BaseBdev3", 00:26:38.991 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:38.991 "is_configured": true, 00:26:38.991 "data_offset": 2048, 00:26:38.991 "data_size": 63488 00:26:38.991 }, 00:26:38.991 { 00:26:38.991 "name": "BaseBdev4", 00:26:38.991 "uuid": 
"a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:38.991 "is_configured": true, 00:26:38.991 "data_offset": 2048, 00:26:38.991 "data_size": 63488 00:26:38.991 } 00:26:38.991 ] 00:26:38.991 }' 00:26:38.991 00:21:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:39.249 00:21:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:39.249 00:21:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:39.249 00:21:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:39.249 00:21:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:39.249 00:21:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:39.506 00:21:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:26:39.506 00:21:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:39.764 [2024-07-16 00:21:26.477525] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:39.764 00:21:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:39.764 00:21:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:39.764 00:21:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:39.764 00:21:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:39.764 00:21:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:39.764 00:21:26 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:39.764 00:21:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:39.764 00:21:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:39.764 00:21:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:39.764 00:21:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:39.764 00:21:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:39.764 00:21:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:40.022 00:21:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:40.022 "name": "raid_bdev1", 00:26:40.022 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:40.022 "strip_size_kb": 0, 00:26:40.022 "state": "online", 00:26:40.022 "raid_level": "raid1", 00:26:40.022 "superblock": true, 00:26:40.022 "num_base_bdevs": 4, 00:26:40.022 "num_base_bdevs_discovered": 2, 00:26:40.022 "num_base_bdevs_operational": 2, 00:26:40.022 "base_bdevs_list": [ 00:26:40.022 { 00:26:40.022 "name": null, 00:26:40.022 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:40.022 "is_configured": false, 00:26:40.022 "data_offset": 2048, 00:26:40.022 "data_size": 63488 00:26:40.022 }, 00:26:40.022 { 00:26:40.022 "name": null, 00:26:40.022 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:40.022 "is_configured": false, 00:26:40.022 "data_offset": 2048, 00:26:40.022 "data_size": 63488 00:26:40.022 }, 00:26:40.022 { 00:26:40.022 "name": "BaseBdev3", 00:26:40.022 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:40.022 "is_configured": true, 00:26:40.022 "data_offset": 2048, 00:26:40.022 "data_size": 
63488 00:26:40.022 }, 00:26:40.022 { 00:26:40.022 "name": "BaseBdev4", 00:26:40.022 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:40.022 "is_configured": true, 00:26:40.022 "data_offset": 2048, 00:26:40.022 "data_size": 63488 00:26:40.022 } 00:26:40.022 ] 00:26:40.022 }' 00:26:40.022 00:21:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:40.022 00:21:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:40.587 00:21:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:40.845 [2024-07-16 00:21:27.572634] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:40.845 [2024-07-16 00:21:27.572778] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:40.845 [2024-07-16 00:21:27.572795] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:40.845 [2024-07-16 00:21:27.572823] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:40.845 [2024-07-16 00:21:27.577256] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d8f750 00:26:40.845 [2024-07-16 00:21:27.579629] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:40.845 00:21:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:26:41.780 00:21:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:41.780 00:21:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:41.780 00:21:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:41.780 00:21:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:41.780 00:21:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:41.780 00:21:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:41.780 00:21:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:42.039 00:21:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:42.039 "name": "raid_bdev1", 00:26:42.039 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:42.039 "strip_size_kb": 0, 00:26:42.039 "state": "online", 00:26:42.039 "raid_level": "raid1", 00:26:42.039 "superblock": true, 00:26:42.039 "num_base_bdevs": 4, 00:26:42.039 "num_base_bdevs_discovered": 3, 00:26:42.039 "num_base_bdevs_operational": 3, 00:26:42.039 "process": { 00:26:42.039 "type": "rebuild", 00:26:42.039 "target": "spare", 00:26:42.039 "progress": { 00:26:42.039 "blocks": 24576, 
00:26:42.039 "percent": 38 00:26:42.039 } 00:26:42.039 }, 00:26:42.039 "base_bdevs_list": [ 00:26:42.039 { 00:26:42.039 "name": "spare", 00:26:42.039 "uuid": "2bca1816-9a20-57ac-bbb1-01704255e840", 00:26:42.039 "is_configured": true, 00:26:42.039 "data_offset": 2048, 00:26:42.039 "data_size": 63488 00:26:42.039 }, 00:26:42.039 { 00:26:42.039 "name": null, 00:26:42.039 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:42.039 "is_configured": false, 00:26:42.039 "data_offset": 2048, 00:26:42.039 "data_size": 63488 00:26:42.039 }, 00:26:42.039 { 00:26:42.039 "name": "BaseBdev3", 00:26:42.039 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:42.039 "is_configured": true, 00:26:42.039 "data_offset": 2048, 00:26:42.039 "data_size": 63488 00:26:42.039 }, 00:26:42.039 { 00:26:42.039 "name": "BaseBdev4", 00:26:42.039 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:42.039 "is_configured": true, 00:26:42.039 "data_offset": 2048, 00:26:42.039 "data_size": 63488 00:26:42.039 } 00:26:42.039 ] 00:26:42.039 }' 00:26:42.039 00:21:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:42.039 00:21:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:42.039 00:21:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:42.039 00:21:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:42.039 00:21:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:42.297 [2024-07-16 00:21:29.166798] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:42.297 [2024-07-16 00:21:29.192151] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:42.297 [2024-07-16 00:21:29.192196] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:42.297 [2024-07-16 00:21:29.192212] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:42.297 [2024-07-16 00:21:29.192220] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:42.297 00:21:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:42.297 00:21:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:42.298 00:21:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:42.298 00:21:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:42.298 00:21:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:42.298 00:21:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:42.298 00:21:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:42.298 00:21:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:42.298 00:21:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:42.298 00:21:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:42.298 00:21:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:42.298 00:21:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:42.555 00:21:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:42.555 "name": "raid_bdev1", 00:26:42.555 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 
00:26:42.555 "strip_size_kb": 0, 00:26:42.555 "state": "online", 00:26:42.556 "raid_level": "raid1", 00:26:42.556 "superblock": true, 00:26:42.556 "num_base_bdevs": 4, 00:26:42.556 "num_base_bdevs_discovered": 2, 00:26:42.556 "num_base_bdevs_operational": 2, 00:26:42.556 "base_bdevs_list": [ 00:26:42.556 { 00:26:42.556 "name": null, 00:26:42.556 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:42.556 "is_configured": false, 00:26:42.556 "data_offset": 2048, 00:26:42.556 "data_size": 63488 00:26:42.556 }, 00:26:42.556 { 00:26:42.556 "name": null, 00:26:42.556 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:42.556 "is_configured": false, 00:26:42.556 "data_offset": 2048, 00:26:42.556 "data_size": 63488 00:26:42.556 }, 00:26:42.556 { 00:26:42.556 "name": "BaseBdev3", 00:26:42.556 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:42.556 "is_configured": true, 00:26:42.556 "data_offset": 2048, 00:26:42.556 "data_size": 63488 00:26:42.556 }, 00:26:42.556 { 00:26:42.556 "name": "BaseBdev4", 00:26:42.556 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:42.556 "is_configured": true, 00:26:42.556 "data_offset": 2048, 00:26:42.556 "data_size": 63488 00:26:42.556 } 00:26:42.556 ] 00:26:42.556 }' 00:26:42.556 00:21:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:42.556 00:21:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:43.490 00:21:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:43.490 [2024-07-16 00:21:30.315488] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:43.490 [2024-07-16 00:21:30.315540] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:43.490 [2024-07-16 00:21:30.315564] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created 
at: 0x0x1e09ee0 00:26:43.490 [2024-07-16 00:21:30.315577] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:43.490 [2024-07-16 00:21:30.315963] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:43.490 [2024-07-16 00:21:30.315982] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:43.490 [2024-07-16 00:21:30.316067] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:43.490 [2024-07-16 00:21:30.316078] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:43.490 [2024-07-16 00:21:30.316090] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:43.490 [2024-07-16 00:21:30.316109] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:43.490 [2024-07-16 00:21:30.320576] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e01930 00:26:43.490 spare 00:26:43.490 [2024-07-16 00:21:30.322009] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:43.490 00:21:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:26:44.424 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:44.424 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:44.424 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:44.424 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:44.424 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:44.424 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:44.424 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:44.682 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:44.682 "name": "raid_bdev1", 00:26:44.682 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:44.682 "strip_size_kb": 0, 00:26:44.682 "state": "online", 00:26:44.682 "raid_level": "raid1", 00:26:44.682 "superblock": true, 00:26:44.682 "num_base_bdevs": 4, 00:26:44.682 "num_base_bdevs_discovered": 3, 00:26:44.682 "num_base_bdevs_operational": 3, 00:26:44.682 "process": { 00:26:44.682 "type": "rebuild", 00:26:44.682 "target": "spare", 00:26:44.682 "progress": { 00:26:44.682 "blocks": 24576, 00:26:44.682 "percent": 38 00:26:44.682 } 00:26:44.682 }, 00:26:44.682 "base_bdevs_list": [ 00:26:44.682 { 00:26:44.682 "name": "spare", 00:26:44.682 "uuid": "2bca1816-9a20-57ac-bbb1-01704255e840", 00:26:44.682 "is_configured": true, 00:26:44.682 "data_offset": 2048, 00:26:44.682 "data_size": 63488 00:26:44.682 }, 00:26:44.682 { 00:26:44.682 "name": null, 00:26:44.682 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:44.682 "is_configured": false, 00:26:44.682 "data_offset": 2048, 00:26:44.682 "data_size": 63488 00:26:44.682 }, 00:26:44.682 { 00:26:44.682 "name": "BaseBdev3", 00:26:44.682 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:44.682 "is_configured": true, 00:26:44.682 "data_offset": 2048, 00:26:44.682 "data_size": 63488 00:26:44.682 }, 00:26:44.682 { 00:26:44.682 "name": "BaseBdev4", 00:26:44.682 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:44.682 "is_configured": true, 00:26:44.682 "data_offset": 2048, 00:26:44.682 "data_size": 63488 00:26:44.682 } 00:26:44.682 ] 00:26:44.682 }' 00:26:44.682 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:26:44.940 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:44.940 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:44.940 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:44.941 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:45.199 [2024-07-16 00:21:31.906317] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:45.199 [2024-07-16 00:21:31.934328] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:45.199 [2024-07-16 00:21:31.934373] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:45.199 [2024-07-16 00:21:31.934390] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:45.199 [2024-07-16 00:21:31.934398] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:45.199 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:45.199 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:45.199 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:45.199 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:45.199 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:45.199 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:45.199 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:26:45.199 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:45.199 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:45.199 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:45.199 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:45.199 00:21:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:45.457 00:21:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:45.457 "name": "raid_bdev1", 00:26:45.457 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:45.457 "strip_size_kb": 0, 00:26:45.457 "state": "online", 00:26:45.457 "raid_level": "raid1", 00:26:45.457 "superblock": true, 00:26:45.457 "num_base_bdevs": 4, 00:26:45.457 "num_base_bdevs_discovered": 2, 00:26:45.457 "num_base_bdevs_operational": 2, 00:26:45.457 "base_bdevs_list": [ 00:26:45.457 { 00:26:45.457 "name": null, 00:26:45.457 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:45.457 "is_configured": false, 00:26:45.457 "data_offset": 2048, 00:26:45.457 "data_size": 63488 00:26:45.457 }, 00:26:45.457 { 00:26:45.457 "name": null, 00:26:45.457 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:45.457 "is_configured": false, 00:26:45.457 "data_offset": 2048, 00:26:45.457 "data_size": 63488 00:26:45.457 }, 00:26:45.457 { 00:26:45.457 "name": "BaseBdev3", 00:26:45.457 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:45.457 "is_configured": true, 00:26:45.457 "data_offset": 2048, 00:26:45.457 "data_size": 63488 00:26:45.457 }, 00:26:45.457 { 00:26:45.457 "name": "BaseBdev4", 00:26:45.457 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:45.457 "is_configured": true, 00:26:45.457 "data_offset": 2048, 
00:26:45.457 "data_size": 63488 00:26:45.457 } 00:26:45.457 ] 00:26:45.457 }' 00:26:45.457 00:21:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:45.457 00:21:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:46.024 00:21:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:46.024 00:21:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:46.024 00:21:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:46.024 00:21:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:46.024 00:21:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:46.024 00:21:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:46.024 00:21:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:46.282 00:21:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:46.282 "name": "raid_bdev1", 00:26:46.282 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:46.282 "strip_size_kb": 0, 00:26:46.282 "state": "online", 00:26:46.282 "raid_level": "raid1", 00:26:46.282 "superblock": true, 00:26:46.282 "num_base_bdevs": 4, 00:26:46.282 "num_base_bdevs_discovered": 2, 00:26:46.282 "num_base_bdevs_operational": 2, 00:26:46.282 "base_bdevs_list": [ 00:26:46.282 { 00:26:46.282 "name": null, 00:26:46.282 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:46.282 "is_configured": false, 00:26:46.282 "data_offset": 2048, 00:26:46.282 "data_size": 63488 00:26:46.282 }, 00:26:46.282 { 00:26:46.282 "name": null, 00:26:46.282 "uuid": "00000000-0000-0000-0000-000000000000", 
00:26:46.282 "is_configured": false, 00:26:46.282 "data_offset": 2048, 00:26:46.282 "data_size": 63488 00:26:46.282 }, 00:26:46.282 { 00:26:46.282 "name": "BaseBdev3", 00:26:46.282 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:46.282 "is_configured": true, 00:26:46.282 "data_offset": 2048, 00:26:46.282 "data_size": 63488 00:26:46.282 }, 00:26:46.282 { 00:26:46.282 "name": "BaseBdev4", 00:26:46.282 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:46.282 "is_configured": true, 00:26:46.282 "data_offset": 2048, 00:26:46.282 "data_size": 63488 00:26:46.282 } 00:26:46.282 ] 00:26:46.282 }' 00:26:46.282 00:21:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:46.282 00:21:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:46.282 00:21:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:46.282 00:21:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:46.282 00:21:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:46.540 00:21:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:46.798 [2024-07-16 00:21:33.664031] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:46.798 [2024-07-16 00:21:33.664078] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:46.798 [2024-07-16 00:21:33.664098] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d89cc0 00:26:46.798 [2024-07-16 00:21:33.664111] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:46.798 
[2024-07-16 00:21:33.664454] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:46.798 [2024-07-16 00:21:33.664472] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:46.798 [2024-07-16 00:21:33.664535] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:46.798 [2024-07-16 00:21:33.664547] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:46.798 [2024-07-16 00:21:33.664558] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:46.798 BaseBdev1 00:26:46.798 00:21:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:26:48.173 00:21:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:48.173 00:21:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:48.173 00:21:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:48.173 00:21:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:48.173 00:21:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:48.173 00:21:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:48.173 00:21:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:48.173 00:21:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:48.173 00:21:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:48.173 00:21:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:48.173 00:21:34 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:48.173 00:21:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:48.173 00:21:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:48.173 "name": "raid_bdev1", 00:26:48.173 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:48.173 "strip_size_kb": 0, 00:26:48.173 "state": "online", 00:26:48.173 "raid_level": "raid1", 00:26:48.173 "superblock": true, 00:26:48.173 "num_base_bdevs": 4, 00:26:48.173 "num_base_bdevs_discovered": 2, 00:26:48.173 "num_base_bdevs_operational": 2, 00:26:48.173 "base_bdevs_list": [ 00:26:48.173 { 00:26:48.173 "name": null, 00:26:48.173 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:48.173 "is_configured": false, 00:26:48.173 "data_offset": 2048, 00:26:48.173 "data_size": 63488 00:26:48.173 }, 00:26:48.173 { 00:26:48.173 "name": null, 00:26:48.173 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:48.173 "is_configured": false, 00:26:48.173 "data_offset": 2048, 00:26:48.173 "data_size": 63488 00:26:48.173 }, 00:26:48.173 { 00:26:48.173 "name": "BaseBdev3", 00:26:48.173 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:48.173 "is_configured": true, 00:26:48.173 "data_offset": 2048, 00:26:48.173 "data_size": 63488 00:26:48.173 }, 00:26:48.173 { 00:26:48.173 "name": "BaseBdev4", 00:26:48.173 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:48.173 "is_configured": true, 00:26:48.173 "data_offset": 2048, 00:26:48.173 "data_size": 63488 00:26:48.173 } 00:26:48.173 ] 00:26:48.173 }' 00:26:48.173 00:21:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:48.173 00:21:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:48.739 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:26:48.739 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:48.739 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:48.739 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:48.739 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:48.739 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:48.739 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:48.997 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:48.997 "name": "raid_bdev1", 00:26:48.997 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:48.997 "strip_size_kb": 0, 00:26:48.997 "state": "online", 00:26:48.997 "raid_level": "raid1", 00:26:48.997 "superblock": true, 00:26:48.997 "num_base_bdevs": 4, 00:26:48.997 "num_base_bdevs_discovered": 2, 00:26:48.997 "num_base_bdevs_operational": 2, 00:26:48.997 "base_bdevs_list": [ 00:26:48.997 { 00:26:48.997 "name": null, 00:26:48.997 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:48.997 "is_configured": false, 00:26:48.997 "data_offset": 2048, 00:26:48.997 "data_size": 63488 00:26:48.997 }, 00:26:48.997 { 00:26:48.997 "name": null, 00:26:48.997 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:48.997 "is_configured": false, 00:26:48.997 "data_offset": 2048, 00:26:48.997 "data_size": 63488 00:26:48.997 }, 00:26:48.997 { 00:26:48.997 "name": "BaseBdev3", 00:26:48.997 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:48.997 "is_configured": true, 00:26:48.997 "data_offset": 2048, 00:26:48.997 "data_size": 63488 00:26:48.997 }, 00:26:48.997 { 
00:26:48.997 "name": "BaseBdev4", 00:26:48.997 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:48.997 "is_configured": true, 00:26:48.997 "data_offset": 2048, 00:26:48.997 "data_size": 63488 00:26:48.997 } 00:26:48.997 ] 00:26:48.997 }' 00:26:48.997 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:48.997 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:48.997 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:48.997 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:48.997 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:48.997 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:26:48.997 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:48.997 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:48.997 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:48.997 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:49.255 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:49.255 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:49.255 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:49.255 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:49.255 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:49.255 00:21:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:49.255 [2024-07-16 00:21:36.106828] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:49.255 [2024-07-16 00:21:36.106954] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:49.255 [2024-07-16 00:21:36.106970] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:49.255 request: 00:26:49.255 { 00:26:49.255 "base_bdev": "BaseBdev1", 00:26:49.255 "raid_bdev": "raid_bdev1", 00:26:49.255 "method": "bdev_raid_add_base_bdev", 00:26:49.255 "req_id": 1 00:26:49.255 } 00:26:49.255 Got JSON-RPC error response 00:26:49.255 response: 00:26:49.255 { 00:26:49.255 "code": -22, 00:26:49.255 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:49.255 } 00:26:49.255 00:21:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:26:49.255 00:21:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:49.256 00:21:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:49.256 00:21:36 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:49.256 00:21:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:26:50.631 00:21:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:50.631 00:21:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:50.631 00:21:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:50.631 00:21:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:50.631 00:21:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:50.631 00:21:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:50.631 00:21:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:50.631 00:21:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:50.631 00:21:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:50.631 00:21:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:50.631 00:21:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:50.631 00:21:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:50.631 00:21:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:50.631 "name": "raid_bdev1", 00:26:50.631 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:50.631 "strip_size_kb": 0, 00:26:50.631 "state": "online", 00:26:50.631 "raid_level": "raid1", 00:26:50.631 "superblock": true, 00:26:50.631 "num_base_bdevs": 4, 00:26:50.631 
"num_base_bdevs_discovered": 2, 00:26:50.631 "num_base_bdevs_operational": 2, 00:26:50.631 "base_bdevs_list": [ 00:26:50.631 { 00:26:50.631 "name": null, 00:26:50.631 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:50.631 "is_configured": false, 00:26:50.631 "data_offset": 2048, 00:26:50.631 "data_size": 63488 00:26:50.631 }, 00:26:50.631 { 00:26:50.631 "name": null, 00:26:50.631 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:50.631 "is_configured": false, 00:26:50.631 "data_offset": 2048, 00:26:50.631 "data_size": 63488 00:26:50.631 }, 00:26:50.631 { 00:26:50.631 "name": "BaseBdev3", 00:26:50.631 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:50.631 "is_configured": true, 00:26:50.631 "data_offset": 2048, 00:26:50.631 "data_size": 63488 00:26:50.631 }, 00:26:50.631 { 00:26:50.631 "name": "BaseBdev4", 00:26:50.631 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:50.631 "is_configured": true, 00:26:50.631 "data_offset": 2048, 00:26:50.631 "data_size": 63488 00:26:50.631 } 00:26:50.631 ] 00:26:50.631 }' 00:26:50.631 00:21:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:50.631 00:21:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:51.198 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:51.198 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:51.198 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:51.198 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:51.198 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:51.198 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:26:51.198 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:51.456 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:51.456 "name": "raid_bdev1", 00:26:51.456 "uuid": "1fb8bc07-207f-472b-be4f-fd2471b1f3da", 00:26:51.456 "strip_size_kb": 0, 00:26:51.456 "state": "online", 00:26:51.456 "raid_level": "raid1", 00:26:51.456 "superblock": true, 00:26:51.456 "num_base_bdevs": 4, 00:26:51.456 "num_base_bdevs_discovered": 2, 00:26:51.456 "num_base_bdevs_operational": 2, 00:26:51.456 "base_bdevs_list": [ 00:26:51.456 { 00:26:51.456 "name": null, 00:26:51.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:51.456 "is_configured": false, 00:26:51.456 "data_offset": 2048, 00:26:51.456 "data_size": 63488 00:26:51.456 }, 00:26:51.456 { 00:26:51.456 "name": null, 00:26:51.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:51.456 "is_configured": false, 00:26:51.456 "data_offset": 2048, 00:26:51.456 "data_size": 63488 00:26:51.456 }, 00:26:51.456 { 00:26:51.456 "name": "BaseBdev3", 00:26:51.456 "uuid": "18722745-fc27-5257-94eb-59f9790f8f6a", 00:26:51.456 "is_configured": true, 00:26:51.456 "data_offset": 2048, 00:26:51.456 "data_size": 63488 00:26:51.456 }, 00:26:51.456 { 00:26:51.456 "name": "BaseBdev4", 00:26:51.456 "uuid": "a14fce5c-0c59-55f4-8260-99b53076d543", 00:26:51.456 "is_configured": true, 00:26:51.456 "data_offset": 2048, 00:26:51.456 "data_size": 63488 00:26:51.456 } 00:26:51.456 ] 00:26:51.456 }' 00:26:51.456 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:51.456 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:51.456 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:51.456 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:51.456 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 3625724 00:26:51.456 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 3625724 ']' 00:26:51.456 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 3625724 00:26:51.456 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:26:51.456 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:51.456 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3625724 00:26:51.456 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:51.456 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:51.456 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3625724' 00:26:51.456 killing process with pid 3625724 00:26:51.456 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 3625724 00:26:51.456 Received shutdown signal, test time was about 27.537433 seconds 00:26:51.456 00:26:51.456 Latency(us) 00:26:51.456 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:51.456 =================================================================================================================== 00:26:51.456 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:51.456 [2024-07-16 00:21:38.401451] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:51.456 [2024-07-16 00:21:38.401547] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:51.456 [2024-07-16 00:21:38.401609] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in 
destruct 00:26:51.456 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 3625724 00:26:51.456 [2024-07-16 00:21:38.401623] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d8bf90 name raid_bdev1, state offline 00:26:51.715 [2024-07-16 00:21:38.443579] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:51.979 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:26:51.979 00:26:51.979 real 0m34.332s 00:26:51.979 user 0m54.344s 00:26:51.979 sys 0m5.468s 00:26:51.979 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:51.979 00:21:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:51.979 ************************************ 00:26:51.979 END TEST raid_rebuild_test_sb_io 00:26:51.979 ************************************ 00:26:51.979 00:21:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:51.979 00:21:38 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:26:51.979 00:21:38 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:26:51.979 00:21:38 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:26:51.979 00:21:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:26:51.979 00:21:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:51.979 00:21:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:51.979 ************************************ 00:26:51.979 START TEST raid_state_function_test_sb_4k 00:26:51.979 ************************************ 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:26:51.979 00:21:38 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=3630910 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3630910' 00:26:51.979 Process raid pid: 3630910 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 3630910 /var/tmp/spdk-raid.sock 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 3630910 ']' 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:51.979 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:51.979 00:21:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:51.979 [2024-07-16 00:21:38.837907] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:26:51.979 [2024-07-16 00:21:38.837990] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:26:52.243 [2024-07-16 00:21:38.968387] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:52.243 [2024-07-16 00:21:39.069617] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:52.243 [2024-07-16 00:21:39.134300] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:52.243 [2024-07-16 00:21:39.134339] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:52.809 00:21:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:52.809 00:21:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:26:52.809 00:21:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:53.067 [2024-07-16 00:21:39.949106] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:53.067 [2024-07-16 00:21:39.949152] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:53.067 [2024-07-16 00:21:39.949163] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:53.067 [2024-07-16 00:21:39.949175] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:53.067 00:21:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:53.067 00:21:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:53.067 00:21:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:53.067 00:21:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:53.067 00:21:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:53.067 00:21:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:53.067 00:21:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:53.067 00:21:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:53.067 00:21:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:53.067 00:21:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:53.067 00:21:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:53.067 00:21:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:53.325 00:21:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:53.325 "name": "Existed_Raid", 00:26:53.325 "uuid": "a7f99dd8-27fb-4739-b8a4-1f9863ed11f0", 00:26:53.325 "strip_size_kb": 0, 00:26:53.325 "state": "configuring", 00:26:53.325 "raid_level": "raid1", 00:26:53.325 "superblock": true, 00:26:53.325 
"num_base_bdevs": 2, 00:26:53.325 "num_base_bdevs_discovered": 0, 00:26:53.325 "num_base_bdevs_operational": 2, 00:26:53.325 "base_bdevs_list": [ 00:26:53.325 { 00:26:53.325 "name": "BaseBdev1", 00:26:53.325 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:53.325 "is_configured": false, 00:26:53.325 "data_offset": 0, 00:26:53.325 "data_size": 0 00:26:53.325 }, 00:26:53.325 { 00:26:53.325 "name": "BaseBdev2", 00:26:53.325 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:53.325 "is_configured": false, 00:26:53.325 "data_offset": 0, 00:26:53.325 "data_size": 0 00:26:53.325 } 00:26:53.325 ] 00:26:53.325 }' 00:26:53.325 00:21:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:53.325 00:21:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:53.890 00:21:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:54.148 [2024-07-16 00:21:41.063949] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:54.148 [2024-07-16 00:21:41.063985] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d52a80 name Existed_Raid, state configuring 00:26:54.148 00:21:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:54.405 [2024-07-16 00:21:41.312608] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:54.405 [2024-07-16 00:21:41.312636] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:54.405 [2024-07-16 00:21:41.312646] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:54.405 [2024-07-16 
00:21:41.312658] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:54.405 00:21:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:26:54.666 [2024-07-16 00:21:41.563165] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:54.666 BaseBdev1 00:26:54.666 00:21:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:26:54.666 00:21:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:26:54.666 00:21:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:54.666 00:21:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:26:54.666 00:21:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:54.666 00:21:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:54.666 00:21:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:54.958 00:21:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:26:55.217 [ 00:26:55.217 { 00:26:55.217 "name": "BaseBdev1", 00:26:55.217 "aliases": [ 00:26:55.217 "2ed352f0-2068-41dd-99c0-b968c5feeb1c" 00:26:55.217 ], 00:26:55.217 "product_name": "Malloc disk", 00:26:55.217 "block_size": 4096, 00:26:55.217 "num_blocks": 8192, 00:26:55.217 "uuid": "2ed352f0-2068-41dd-99c0-b968c5feeb1c", 00:26:55.217 "assigned_rate_limits": { 
00:26:55.217 "rw_ios_per_sec": 0, 00:26:55.217 "rw_mbytes_per_sec": 0, 00:26:55.217 "r_mbytes_per_sec": 0, 00:26:55.217 "w_mbytes_per_sec": 0 00:26:55.217 }, 00:26:55.217 "claimed": true, 00:26:55.217 "claim_type": "exclusive_write", 00:26:55.217 "zoned": false, 00:26:55.217 "supported_io_types": { 00:26:55.217 "read": true, 00:26:55.217 "write": true, 00:26:55.217 "unmap": true, 00:26:55.217 "flush": true, 00:26:55.217 "reset": true, 00:26:55.217 "nvme_admin": false, 00:26:55.217 "nvme_io": false, 00:26:55.217 "nvme_io_md": false, 00:26:55.217 "write_zeroes": true, 00:26:55.217 "zcopy": true, 00:26:55.217 "get_zone_info": false, 00:26:55.217 "zone_management": false, 00:26:55.217 "zone_append": false, 00:26:55.217 "compare": false, 00:26:55.217 "compare_and_write": false, 00:26:55.217 "abort": true, 00:26:55.217 "seek_hole": false, 00:26:55.217 "seek_data": false, 00:26:55.217 "copy": true, 00:26:55.217 "nvme_iov_md": false 00:26:55.217 }, 00:26:55.217 "memory_domains": [ 00:26:55.217 { 00:26:55.217 "dma_device_id": "system", 00:26:55.217 "dma_device_type": 1 00:26:55.217 }, 00:26:55.217 { 00:26:55.217 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:55.217 "dma_device_type": 2 00:26:55.217 } 00:26:55.217 ], 00:26:55.217 "driver_specific": {} 00:26:55.217 } 00:26:55.217 ] 00:26:55.217 00:21:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:26:55.217 00:21:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:55.217 00:21:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:55.217 00:21:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:55.217 00:21:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:55.217 00:21:42 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:55.217 00:21:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:55.217 00:21:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:55.217 00:21:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:55.217 00:21:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:55.217 00:21:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:55.217 00:21:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:55.217 00:21:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:55.475 00:21:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:55.475 "name": "Existed_Raid", 00:26:55.475 "uuid": "3d0b7abd-ff56-4df3-8cc5-0c7a23cce53a", 00:26:55.475 "strip_size_kb": 0, 00:26:55.475 "state": "configuring", 00:26:55.475 "raid_level": "raid1", 00:26:55.475 "superblock": true, 00:26:55.475 "num_base_bdevs": 2, 00:26:55.475 "num_base_bdevs_discovered": 1, 00:26:55.475 "num_base_bdevs_operational": 2, 00:26:55.475 "base_bdevs_list": [ 00:26:55.475 { 00:26:55.475 "name": "BaseBdev1", 00:26:55.475 "uuid": "2ed352f0-2068-41dd-99c0-b968c5feeb1c", 00:26:55.475 "is_configured": true, 00:26:55.475 "data_offset": 256, 00:26:55.475 "data_size": 7936 00:26:55.475 }, 00:26:55.475 { 00:26:55.475 "name": "BaseBdev2", 00:26:55.475 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:55.475 "is_configured": false, 00:26:55.475 "data_offset": 0, 00:26:55.475 "data_size": 0 00:26:55.475 } 00:26:55.475 ] 00:26:55.475 }' 00:26:55.475 00:21:42 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:55.475 00:21:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:56.041 00:21:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:56.298 [2024-07-16 00:21:43.167422] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:56.298 [2024-07-16 00:21:43.167466] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d52350 name Existed_Raid, state configuring 00:26:56.298 00:21:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:56.556 [2024-07-16 00:21:43.416116] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:56.556 [2024-07-16 00:21:43.417624] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:56.556 [2024-07-16 00:21:43.417657] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:56.556 00:21:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:26:56.556 00:21:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:56.556 00:21:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:56.556 00:21:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:56.556 00:21:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:56.556 00:21:43 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:56.556 00:21:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:56.556 00:21:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:56.556 00:21:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:56.556 00:21:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:56.556 00:21:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:56.556 00:21:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:56.556 00:21:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:56.556 00:21:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:56.815 00:21:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:56.815 "name": "Existed_Raid", 00:26:56.815 "uuid": "ddd70b26-8c3d-46b6-a6aa-d10f2f89c07b", 00:26:56.815 "strip_size_kb": 0, 00:26:56.815 "state": "configuring", 00:26:56.815 "raid_level": "raid1", 00:26:56.815 "superblock": true, 00:26:56.815 "num_base_bdevs": 2, 00:26:56.815 "num_base_bdevs_discovered": 1, 00:26:56.815 "num_base_bdevs_operational": 2, 00:26:56.815 "base_bdevs_list": [ 00:26:56.815 { 00:26:56.815 "name": "BaseBdev1", 00:26:56.815 "uuid": "2ed352f0-2068-41dd-99c0-b968c5feeb1c", 00:26:56.815 "is_configured": true, 00:26:56.815 "data_offset": 256, 00:26:56.815 "data_size": 7936 00:26:56.815 }, 00:26:56.815 { 00:26:56.815 "name": "BaseBdev2", 00:26:56.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:56.815 
"is_configured": false, 00:26:56.815 "data_offset": 0, 00:26:56.815 "data_size": 0 00:26:56.815 } 00:26:56.815 ] 00:26:56.815 }' 00:26:56.815 00:21:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:56.815 00:21:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:57.382 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:26:57.640 [2024-07-16 00:21:44.482323] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:57.640 [2024-07-16 00:21:44.482481] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d53000 00:26:57.640 [2024-07-16 00:21:44.482496] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:57.640 [2024-07-16 00:21:44.482671] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c6d0c0 00:26:57.640 [2024-07-16 00:21:44.482791] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d53000 00:26:57.640 [2024-07-16 00:21:44.482802] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1d53000 00:26:57.641 [2024-07-16 00:21:44.482894] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:57.641 BaseBdev2 00:26:57.641 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:26:57.641 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:26:57.641 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:57.641 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:26:57.641 00:21:44 
bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:57.641 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:57.641 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:57.899 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:26:57.899 [ 00:26:57.899 { 00:26:57.899 "name": "BaseBdev2", 00:26:57.899 "aliases": [ 00:26:57.899 "8ed7a5e5-f36f-4e63-a230-d702812ca53f" 00:26:57.899 ], 00:26:57.899 "product_name": "Malloc disk", 00:26:57.899 "block_size": 4096, 00:26:57.899 "num_blocks": 8192, 00:26:57.899 "uuid": "8ed7a5e5-f36f-4e63-a230-d702812ca53f", 00:26:57.899 "assigned_rate_limits": { 00:26:57.899 "rw_ios_per_sec": 0, 00:26:57.899 "rw_mbytes_per_sec": 0, 00:26:57.899 "r_mbytes_per_sec": 0, 00:26:57.899 "w_mbytes_per_sec": 0 00:26:57.899 }, 00:26:57.899 "claimed": true, 00:26:57.899 "claim_type": "exclusive_write", 00:26:57.899 "zoned": false, 00:26:57.899 "supported_io_types": { 00:26:57.899 "read": true, 00:26:57.899 "write": true, 00:26:57.899 "unmap": true, 00:26:57.899 "flush": true, 00:26:57.899 "reset": true, 00:26:57.899 "nvme_admin": false, 00:26:57.899 "nvme_io": false, 00:26:57.899 "nvme_io_md": false, 00:26:57.899 "write_zeroes": true, 00:26:57.899 "zcopy": true, 00:26:57.899 "get_zone_info": false, 00:26:57.899 "zone_management": false, 00:26:57.899 "zone_append": false, 00:26:57.899 "compare": false, 00:26:57.899 "compare_and_write": false, 00:26:57.899 "abort": true, 00:26:57.899 "seek_hole": false, 00:26:57.899 "seek_data": false, 00:26:57.899 "copy": true, 00:26:57.899 "nvme_iov_md": false 00:26:57.899 }, 00:26:57.899 
"memory_domains": [ 00:26:57.899 { 00:26:57.899 "dma_device_id": "system", 00:26:57.899 "dma_device_type": 1 00:26:57.899 }, 00:26:57.899 { 00:26:57.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:57.899 "dma_device_type": 2 00:26:57.899 } 00:26:57.899 ], 00:26:57.899 "driver_specific": {} 00:26:57.899 } 00:26:57.899 ] 00:26:58.158 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:26:58.158 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:26:58.158 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:58.158 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:26:58.158 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:58.158 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:58.158 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:58.158 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:58.158 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:58.158 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:58.158 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:58.158 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:58.158 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:58.158 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:58.158 00:21:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:58.158 00:21:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:58.158 "name": "Existed_Raid", 00:26:58.158 "uuid": "ddd70b26-8c3d-46b6-a6aa-d10f2f89c07b", 00:26:58.158 "strip_size_kb": 0, 00:26:58.158 "state": "online", 00:26:58.158 "raid_level": "raid1", 00:26:58.158 "superblock": true, 00:26:58.158 "num_base_bdevs": 2, 00:26:58.158 "num_base_bdevs_discovered": 2, 00:26:58.158 "num_base_bdevs_operational": 2, 00:26:58.158 "base_bdevs_list": [ 00:26:58.158 { 00:26:58.158 "name": "BaseBdev1", 00:26:58.158 "uuid": "2ed352f0-2068-41dd-99c0-b968c5feeb1c", 00:26:58.158 "is_configured": true, 00:26:58.158 "data_offset": 256, 00:26:58.158 "data_size": 7936 00:26:58.158 }, 00:26:58.158 { 00:26:58.158 "name": "BaseBdev2", 00:26:58.158 "uuid": "8ed7a5e5-f36f-4e63-a230-d702812ca53f", 00:26:58.158 "is_configured": true, 00:26:58.158 "data_offset": 256, 00:26:58.158 "data_size": 7936 00:26:58.158 } 00:26:58.158 ] 00:26:58.158 }' 00:26:58.158 00:21:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:58.158 00:21:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:58.725 00:21:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:26:58.725 00:21:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:26:58.725 00:21:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:58.725 00:21:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:58.725 00:21:45 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:58.725 00:21:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:26:58.725 00:21:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:26:58.725 00:21:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:58.985 [2024-07-16 00:21:45.846222] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:58.985 00:21:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:58.985 "name": "Existed_Raid", 00:26:58.985 "aliases": [ 00:26:58.985 "ddd70b26-8c3d-46b6-a6aa-d10f2f89c07b" 00:26:58.985 ], 00:26:58.985 "product_name": "Raid Volume", 00:26:58.985 "block_size": 4096, 00:26:58.985 "num_blocks": 7936, 00:26:58.985 "uuid": "ddd70b26-8c3d-46b6-a6aa-d10f2f89c07b", 00:26:58.985 "assigned_rate_limits": { 00:26:58.985 "rw_ios_per_sec": 0, 00:26:58.985 "rw_mbytes_per_sec": 0, 00:26:58.985 "r_mbytes_per_sec": 0, 00:26:58.985 "w_mbytes_per_sec": 0 00:26:58.985 }, 00:26:58.985 "claimed": false, 00:26:58.985 "zoned": false, 00:26:58.985 "supported_io_types": { 00:26:58.985 "read": true, 00:26:58.985 "write": true, 00:26:58.985 "unmap": false, 00:26:58.985 "flush": false, 00:26:58.985 "reset": true, 00:26:58.985 "nvme_admin": false, 00:26:58.985 "nvme_io": false, 00:26:58.985 "nvme_io_md": false, 00:26:58.985 "write_zeroes": true, 00:26:58.985 "zcopy": false, 00:26:58.985 "get_zone_info": false, 00:26:58.985 "zone_management": false, 00:26:58.985 "zone_append": false, 00:26:58.985 "compare": false, 00:26:58.985 "compare_and_write": false, 00:26:58.985 "abort": false, 00:26:58.985 "seek_hole": false, 00:26:58.985 "seek_data": false, 00:26:58.985 "copy": false, 00:26:58.985 "nvme_iov_md": false 00:26:58.985 
}, 00:26:58.985 "memory_domains": [ 00:26:58.985 { 00:26:58.985 "dma_device_id": "system", 00:26:58.985 "dma_device_type": 1 00:26:58.985 }, 00:26:58.985 { 00:26:58.985 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:58.985 "dma_device_type": 2 00:26:58.985 }, 00:26:58.985 { 00:26:58.985 "dma_device_id": "system", 00:26:58.985 "dma_device_type": 1 00:26:58.985 }, 00:26:58.985 { 00:26:58.985 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:58.985 "dma_device_type": 2 00:26:58.985 } 00:26:58.985 ], 00:26:58.985 "driver_specific": { 00:26:58.985 "raid": { 00:26:58.985 "uuid": "ddd70b26-8c3d-46b6-a6aa-d10f2f89c07b", 00:26:58.985 "strip_size_kb": 0, 00:26:58.985 "state": "online", 00:26:58.985 "raid_level": "raid1", 00:26:58.985 "superblock": true, 00:26:58.985 "num_base_bdevs": 2, 00:26:58.985 "num_base_bdevs_discovered": 2, 00:26:58.985 "num_base_bdevs_operational": 2, 00:26:58.985 "base_bdevs_list": [ 00:26:58.985 { 00:26:58.985 "name": "BaseBdev1", 00:26:58.985 "uuid": "2ed352f0-2068-41dd-99c0-b968c5feeb1c", 00:26:58.985 "is_configured": true, 00:26:58.985 "data_offset": 256, 00:26:58.985 "data_size": 7936 00:26:58.985 }, 00:26:58.985 { 00:26:58.985 "name": "BaseBdev2", 00:26:58.985 "uuid": "8ed7a5e5-f36f-4e63-a230-d702812ca53f", 00:26:58.985 "is_configured": true, 00:26:58.985 "data_offset": 256, 00:26:58.985 "data_size": 7936 00:26:58.985 } 00:26:58.985 ] 00:26:58.985 } 00:26:58.985 } 00:26:58.985 }' 00:26:58.985 00:21:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:58.985 00:21:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:26:58.985 BaseBdev2' 00:26:58.985 00:21:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:58.985 00:21:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:26:58.985 00:21:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:59.244 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:59.244 "name": "BaseBdev1", 00:26:59.244 "aliases": [ 00:26:59.244 "2ed352f0-2068-41dd-99c0-b968c5feeb1c" 00:26:59.244 ], 00:26:59.244 "product_name": "Malloc disk", 00:26:59.244 "block_size": 4096, 00:26:59.244 "num_blocks": 8192, 00:26:59.244 "uuid": "2ed352f0-2068-41dd-99c0-b968c5feeb1c", 00:26:59.244 "assigned_rate_limits": { 00:26:59.244 "rw_ios_per_sec": 0, 00:26:59.244 "rw_mbytes_per_sec": 0, 00:26:59.244 "r_mbytes_per_sec": 0, 00:26:59.244 "w_mbytes_per_sec": 0 00:26:59.244 }, 00:26:59.244 "claimed": true, 00:26:59.244 "claim_type": "exclusive_write", 00:26:59.244 "zoned": false, 00:26:59.244 "supported_io_types": { 00:26:59.244 "read": true, 00:26:59.244 "write": true, 00:26:59.244 "unmap": true, 00:26:59.244 "flush": true, 00:26:59.244 "reset": true, 00:26:59.244 "nvme_admin": false, 00:26:59.244 "nvme_io": false, 00:26:59.244 "nvme_io_md": false, 00:26:59.244 "write_zeroes": true, 00:26:59.244 "zcopy": true, 00:26:59.244 "get_zone_info": false, 00:26:59.244 "zone_management": false, 00:26:59.244 "zone_append": false, 00:26:59.244 "compare": false, 00:26:59.244 "compare_and_write": false, 00:26:59.244 "abort": true, 00:26:59.244 "seek_hole": false, 00:26:59.244 "seek_data": false, 00:26:59.244 "copy": true, 00:26:59.244 "nvme_iov_md": false 00:26:59.244 }, 00:26:59.244 "memory_domains": [ 00:26:59.244 { 00:26:59.244 "dma_device_id": "system", 00:26:59.244 "dma_device_type": 1 00:26:59.244 }, 00:26:59.244 { 00:26:59.244 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:59.244 "dma_device_type": 2 00:26:59.244 } 00:26:59.244 ], 00:26:59.244 "driver_specific": {} 00:26:59.244 }' 00:26:59.244 00:21:46 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:59.502 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:59.502 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:59.502 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:59.502 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:59.502 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:59.502 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:59.502 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:59.761 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:59.761 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:59.761 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:59.761 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:59.761 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:59.761 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:26:59.761 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:00.019 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:00.019 "name": "BaseBdev2", 00:27:00.019 "aliases": [ 00:27:00.019 "8ed7a5e5-f36f-4e63-a230-d702812ca53f" 00:27:00.019 ], 00:27:00.019 "product_name": "Malloc 
disk", 00:27:00.019 "block_size": 4096, 00:27:00.019 "num_blocks": 8192, 00:27:00.019 "uuid": "8ed7a5e5-f36f-4e63-a230-d702812ca53f", 00:27:00.019 "assigned_rate_limits": { 00:27:00.019 "rw_ios_per_sec": 0, 00:27:00.019 "rw_mbytes_per_sec": 0, 00:27:00.019 "r_mbytes_per_sec": 0, 00:27:00.020 "w_mbytes_per_sec": 0 00:27:00.020 }, 00:27:00.020 "claimed": true, 00:27:00.020 "claim_type": "exclusive_write", 00:27:00.020 "zoned": false, 00:27:00.020 "supported_io_types": { 00:27:00.020 "read": true, 00:27:00.020 "write": true, 00:27:00.020 "unmap": true, 00:27:00.020 "flush": true, 00:27:00.020 "reset": true, 00:27:00.020 "nvme_admin": false, 00:27:00.020 "nvme_io": false, 00:27:00.020 "nvme_io_md": false, 00:27:00.020 "write_zeroes": true, 00:27:00.020 "zcopy": true, 00:27:00.020 "get_zone_info": false, 00:27:00.020 "zone_management": false, 00:27:00.020 "zone_append": false, 00:27:00.020 "compare": false, 00:27:00.020 "compare_and_write": false, 00:27:00.020 "abort": true, 00:27:00.020 "seek_hole": false, 00:27:00.020 "seek_data": false, 00:27:00.020 "copy": true, 00:27:00.020 "nvme_iov_md": false 00:27:00.020 }, 00:27:00.020 "memory_domains": [ 00:27:00.020 { 00:27:00.020 "dma_device_id": "system", 00:27:00.020 "dma_device_type": 1 00:27:00.020 }, 00:27:00.020 { 00:27:00.020 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:00.020 "dma_device_type": 2 00:27:00.020 } 00:27:00.020 ], 00:27:00.020 "driver_specific": {} 00:27:00.020 }' 00:27:00.020 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:00.020 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:00.020 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:00.020 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:00.020 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:27:00.020 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:00.020 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:00.020 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:00.278 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:00.278 00:21:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:00.278 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:00.278 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:00.278 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:00.537 [2024-07-16 00:21:47.305875] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:00.537 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:00.537 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:27:00.537 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:00.537 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:27:00.537 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:27:00.537 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:27:00.537 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:00.537 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 
-- # local expected_state=online 00:27:00.537 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:00.537 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:00.537 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:00.537 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:00.537 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:00.537 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:00.537 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:00.537 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:00.537 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:00.796 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:00.796 "name": "Existed_Raid", 00:27:00.796 "uuid": "ddd70b26-8c3d-46b6-a6aa-d10f2f89c07b", 00:27:00.796 "strip_size_kb": 0, 00:27:00.796 "state": "online", 00:27:00.796 "raid_level": "raid1", 00:27:00.796 "superblock": true, 00:27:00.796 "num_base_bdevs": 2, 00:27:00.796 "num_base_bdevs_discovered": 1, 00:27:00.796 "num_base_bdevs_operational": 1, 00:27:00.796 "base_bdevs_list": [ 00:27:00.796 { 00:27:00.796 "name": null, 00:27:00.796 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:00.796 "is_configured": false, 00:27:00.796 "data_offset": 256, 00:27:00.796 "data_size": 7936 00:27:00.796 }, 00:27:00.796 { 00:27:00.796 "name": "BaseBdev2", 00:27:00.796 "uuid": 
"8ed7a5e5-f36f-4e63-a230-d702812ca53f", 00:27:00.796 "is_configured": true, 00:27:00.796 "data_offset": 256, 00:27:00.796 "data_size": 7936 00:27:00.796 } 00:27:00.796 ] 00:27:00.796 }' 00:27:00.796 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:00.796 00:21:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:01.364 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:01.364 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:01.364 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:01.364 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:01.622 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:01.622 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:01.622 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:01.881 [2024-07-16 00:21:48.674504] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:01.881 [2024-07-16 00:21:48.674590] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:01.881 [2024-07-16 00:21:48.685453] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:01.881 [2024-07-16 00:21:48.685490] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:01.881 [2024-07-16 00:21:48.685502] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x1d53000 name Existed_Raid, state offline 00:27:01.881 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:01.881 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:01.881 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:01.881 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:02.140 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:02.140 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:02.140 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:27:02.140 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 3630910 00:27:02.140 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 3630910 ']' 00:27:02.140 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 3630910 00:27:02.140 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:27:02.140 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:02.140 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3630910 00:27:02.140 00:21:48 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:02.140 00:21:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:02.140 00:21:49 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 3630910' 00:27:02.140 killing process with pid 3630910 00:27:02.140 00:21:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # kill 3630910 00:27:02.140 [2024-07-16 00:21:49.002471] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:02.140 00:21:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@972 -- # wait 3630910 00:27:02.140 [2024-07-16 00:21:49.003341] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:02.399 00:21:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:27:02.399 00:27:02.399 real 0m10.443s 00:27:02.399 user 0m18.568s 00:27:02.399 sys 0m1.977s 00:27:02.399 00:21:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:02.399 00:21:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:02.399 ************************************ 00:27:02.399 END TEST raid_state_function_test_sb_4k 00:27:02.399 ************************************ 00:27:02.399 00:21:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:02.399 00:21:49 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:27:02.399 00:21:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:27:02.399 00:21:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:02.399 00:21:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:02.399 ************************************ 00:27:02.399 START TEST raid_superblock_test_4k 00:27:02.399 ************************************ 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:27:02.399 00:21:49 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=3632530 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 3632530 /var/tmp/spdk-raid.sock 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@829 -- # '[' -z 3632530 ']' 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:02.399 00:21:49 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:02.399 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:02.399 00:21:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:02.658 [2024-07-16 00:21:49.356429] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:27:02.658 [2024-07-16 00:21:49.356495] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3632530 ] 00:27:02.658 [2024-07-16 00:21:49.502299] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:02.917 [2024-07-16 00:21:49.638947] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:02.917 [2024-07-16 00:21:49.706024] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:02.917 [2024-07-16 00:21:49.706065] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:03.486 00:21:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:03.486 00:21:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:27:03.486 00:21:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:27:03.486 00:21:50 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:03.486 00:21:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:27:03.486 00:21:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:27:03.486 00:21:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:27:03.486 00:21:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:03.486 00:21:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:03.486 00:21:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:03.486 00:21:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:27:03.744 malloc1 00:27:03.744 00:21:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:04.003 [2024-07-16 00:21:50.876768] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:04.003 [2024-07-16 00:21:50.876818] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:04.003 [2024-07-16 00:21:50.876837] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14de570 00:27:04.003 [2024-07-16 00:21:50.876850] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:04.003 [2024-07-16 00:21:50.878406] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:04.003 [2024-07-16 00:21:50.878435] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: pt1 00:27:04.003 pt1 00:27:04.003 00:21:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:04.003 00:21:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:04.003 00:21:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:27:04.003 00:21:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:27:04.003 00:21:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:27:04.003 00:21:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:04.003 00:21:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:04.003 00:21:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:04.003 00:21:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:27:04.261 malloc2 00:27:04.261 00:21:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:04.520 [2024-07-16 00:21:51.370745] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:04.520 [2024-07-16 00:21:51.370787] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:04.520 [2024-07-16 00:21:51.370803] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14df970 00:27:04.520 [2024-07-16 00:21:51.370815] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:04.520 [2024-07-16 00:21:51.372251] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:04.520 [2024-07-16 00:21:51.372278] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:04.520 pt2 00:27:04.520 00:21:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:04.520 00:21:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:04.520 00:21:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:27:04.779 [2024-07-16 00:21:51.615401] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:04.779 [2024-07-16 00:21:51.616548] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:04.779 [2024-07-16 00:21:51.616692] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1682270 00:27:04.779 [2024-07-16 00:21:51.616705] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:04.779 [2024-07-16 00:21:51.616881] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14d60e0 00:27:04.779 [2024-07-16 00:21:51.617026] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1682270 00:27:04.779 [2024-07-16 00:21:51.617036] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1682270 00:27:04.779 [2024-07-16 00:21:51.617125] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:04.779 00:21:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:04.779 00:21:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:04.779 00:21:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:27:04.779 00:21:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:04.779 00:21:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:04.779 00:21:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:04.779 00:21:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:04.779 00:21:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:04.779 00:21:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:04.779 00:21:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:04.779 00:21:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:04.779 00:21:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:05.038 00:21:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:05.038 "name": "raid_bdev1", 00:27:05.038 "uuid": "c15b52cd-fa83-4da4-b304-f1e353d6ea4d", 00:27:05.038 "strip_size_kb": 0, 00:27:05.038 "state": "online", 00:27:05.039 "raid_level": "raid1", 00:27:05.039 "superblock": true, 00:27:05.039 "num_base_bdevs": 2, 00:27:05.039 "num_base_bdevs_discovered": 2, 00:27:05.039 "num_base_bdevs_operational": 2, 00:27:05.039 "base_bdevs_list": [ 00:27:05.039 { 00:27:05.039 "name": "pt1", 00:27:05.039 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:05.039 "is_configured": true, 00:27:05.039 "data_offset": 256, 00:27:05.039 "data_size": 7936 00:27:05.039 }, 00:27:05.039 { 00:27:05.039 "name": "pt2", 00:27:05.039 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:05.039 "is_configured": true, 00:27:05.039 "data_offset": 
256, 00:27:05.039 "data_size": 7936 00:27:05.039 } 00:27:05.039 ] 00:27:05.039 }' 00:27:05.039 00:21:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:05.039 00:21:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:05.607 00:21:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:27:05.607 00:21:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:05.607 00:21:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:05.607 00:21:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:05.607 00:21:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:05.607 00:21:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:05.607 00:21:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:05.607 00:21:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:05.866 [2024-07-16 00:21:52.638328] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:05.866 00:21:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:05.866 "name": "raid_bdev1", 00:27:05.866 "aliases": [ 00:27:05.866 "c15b52cd-fa83-4da4-b304-f1e353d6ea4d" 00:27:05.866 ], 00:27:05.866 "product_name": "Raid Volume", 00:27:05.866 "block_size": 4096, 00:27:05.866 "num_blocks": 7936, 00:27:05.866 "uuid": "c15b52cd-fa83-4da4-b304-f1e353d6ea4d", 00:27:05.866 "assigned_rate_limits": { 00:27:05.866 "rw_ios_per_sec": 0, 00:27:05.866 "rw_mbytes_per_sec": 0, 00:27:05.866 "r_mbytes_per_sec": 0, 00:27:05.866 "w_mbytes_per_sec": 0 00:27:05.866 }, 00:27:05.866 "claimed": false, 
00:27:05.866 "zoned": false, 00:27:05.866 "supported_io_types": { 00:27:05.866 "read": true, 00:27:05.866 "write": true, 00:27:05.866 "unmap": false, 00:27:05.866 "flush": false, 00:27:05.866 "reset": true, 00:27:05.866 "nvme_admin": false, 00:27:05.866 "nvme_io": false, 00:27:05.866 "nvme_io_md": false, 00:27:05.866 "write_zeroes": true, 00:27:05.866 "zcopy": false, 00:27:05.866 "get_zone_info": false, 00:27:05.866 "zone_management": false, 00:27:05.866 "zone_append": false, 00:27:05.866 "compare": false, 00:27:05.866 "compare_and_write": false, 00:27:05.866 "abort": false, 00:27:05.866 "seek_hole": false, 00:27:05.866 "seek_data": false, 00:27:05.866 "copy": false, 00:27:05.866 "nvme_iov_md": false 00:27:05.867 }, 00:27:05.867 "memory_domains": [ 00:27:05.867 { 00:27:05.867 "dma_device_id": "system", 00:27:05.867 "dma_device_type": 1 00:27:05.867 }, 00:27:05.867 { 00:27:05.867 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:05.867 "dma_device_type": 2 00:27:05.867 }, 00:27:05.867 { 00:27:05.867 "dma_device_id": "system", 00:27:05.867 "dma_device_type": 1 00:27:05.867 }, 00:27:05.867 { 00:27:05.867 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:05.867 "dma_device_type": 2 00:27:05.867 } 00:27:05.867 ], 00:27:05.867 "driver_specific": { 00:27:05.867 "raid": { 00:27:05.867 "uuid": "c15b52cd-fa83-4da4-b304-f1e353d6ea4d", 00:27:05.867 "strip_size_kb": 0, 00:27:05.867 "state": "online", 00:27:05.867 "raid_level": "raid1", 00:27:05.867 "superblock": true, 00:27:05.867 "num_base_bdevs": 2, 00:27:05.867 "num_base_bdevs_discovered": 2, 00:27:05.867 "num_base_bdevs_operational": 2, 00:27:05.867 "base_bdevs_list": [ 00:27:05.867 { 00:27:05.867 "name": "pt1", 00:27:05.867 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:05.867 "is_configured": true, 00:27:05.867 "data_offset": 256, 00:27:05.867 "data_size": 7936 00:27:05.867 }, 00:27:05.867 { 00:27:05.867 "name": "pt2", 00:27:05.867 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:05.867 "is_configured": true, 
00:27:05.867 "data_offset": 256, 00:27:05.867 "data_size": 7936 00:27:05.867 } 00:27:05.867 ] 00:27:05.867 } 00:27:05.867 } 00:27:05.867 }' 00:27:05.867 00:21:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:05.867 00:21:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:05.867 pt2' 00:27:05.867 00:21:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:05.867 00:21:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:05.867 00:21:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:06.126 00:21:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:06.126 "name": "pt1", 00:27:06.126 "aliases": [ 00:27:06.126 "00000000-0000-0000-0000-000000000001" 00:27:06.126 ], 00:27:06.126 "product_name": "passthru", 00:27:06.126 "block_size": 4096, 00:27:06.126 "num_blocks": 8192, 00:27:06.126 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:06.126 "assigned_rate_limits": { 00:27:06.126 "rw_ios_per_sec": 0, 00:27:06.126 "rw_mbytes_per_sec": 0, 00:27:06.126 "r_mbytes_per_sec": 0, 00:27:06.126 "w_mbytes_per_sec": 0 00:27:06.126 }, 00:27:06.126 "claimed": true, 00:27:06.126 "claim_type": "exclusive_write", 00:27:06.126 "zoned": false, 00:27:06.126 "supported_io_types": { 00:27:06.126 "read": true, 00:27:06.126 "write": true, 00:27:06.126 "unmap": true, 00:27:06.126 "flush": true, 00:27:06.126 "reset": true, 00:27:06.126 "nvme_admin": false, 00:27:06.126 "nvme_io": false, 00:27:06.126 "nvme_io_md": false, 00:27:06.126 "write_zeroes": true, 00:27:06.126 "zcopy": true, 00:27:06.126 "get_zone_info": false, 00:27:06.126 "zone_management": false, 00:27:06.126 "zone_append": false, 
00:27:06.126 "compare": false, 00:27:06.126 "compare_and_write": false, 00:27:06.126 "abort": true, 00:27:06.126 "seek_hole": false, 00:27:06.126 "seek_data": false, 00:27:06.126 "copy": true, 00:27:06.126 "nvme_iov_md": false 00:27:06.126 }, 00:27:06.126 "memory_domains": [ 00:27:06.126 { 00:27:06.126 "dma_device_id": "system", 00:27:06.126 "dma_device_type": 1 00:27:06.126 }, 00:27:06.126 { 00:27:06.126 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:06.126 "dma_device_type": 2 00:27:06.126 } 00:27:06.126 ], 00:27:06.126 "driver_specific": { 00:27:06.126 "passthru": { 00:27:06.126 "name": "pt1", 00:27:06.126 "base_bdev_name": "malloc1" 00:27:06.126 } 00:27:06.126 } 00:27:06.126 }' 00:27:06.126 00:21:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:06.126 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:06.126 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:06.126 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:06.384 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:06.384 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:06.384 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:06.384 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:06.384 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:06.384 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:06.384 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:06.642 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:06.642 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- 
# for name in $base_bdev_names 00:27:06.642 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:06.642 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:06.899 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:06.899 "name": "pt2", 00:27:06.899 "aliases": [ 00:27:06.899 "00000000-0000-0000-0000-000000000002" 00:27:06.899 ], 00:27:06.899 "product_name": "passthru", 00:27:06.899 "block_size": 4096, 00:27:06.899 "num_blocks": 8192, 00:27:06.899 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:06.899 "assigned_rate_limits": { 00:27:06.899 "rw_ios_per_sec": 0, 00:27:06.899 "rw_mbytes_per_sec": 0, 00:27:06.899 "r_mbytes_per_sec": 0, 00:27:06.899 "w_mbytes_per_sec": 0 00:27:06.899 }, 00:27:06.899 "claimed": true, 00:27:06.899 "claim_type": "exclusive_write", 00:27:06.899 "zoned": false, 00:27:06.899 "supported_io_types": { 00:27:06.899 "read": true, 00:27:06.899 "write": true, 00:27:06.899 "unmap": true, 00:27:06.899 "flush": true, 00:27:06.899 "reset": true, 00:27:06.899 "nvme_admin": false, 00:27:06.899 "nvme_io": false, 00:27:06.899 "nvme_io_md": false, 00:27:06.899 "write_zeroes": true, 00:27:06.899 "zcopy": true, 00:27:06.899 "get_zone_info": false, 00:27:06.899 "zone_management": false, 00:27:06.899 "zone_append": false, 00:27:06.899 "compare": false, 00:27:06.899 "compare_and_write": false, 00:27:06.899 "abort": true, 00:27:06.899 "seek_hole": false, 00:27:06.899 "seek_data": false, 00:27:06.899 "copy": true, 00:27:06.899 "nvme_iov_md": false 00:27:06.899 }, 00:27:06.899 "memory_domains": [ 00:27:06.899 { 00:27:06.899 "dma_device_id": "system", 00:27:06.899 "dma_device_type": 1 00:27:06.899 }, 00:27:06.899 { 00:27:06.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:06.899 "dma_device_type": 2 00:27:06.899 } 00:27:06.899 ], 00:27:06.899 
"driver_specific": { 00:27:06.899 "passthru": { 00:27:06.899 "name": "pt2", 00:27:06.899 "base_bdev_name": "malloc2" 00:27:06.899 } 00:27:06.899 } 00:27:06.899 }' 00:27:06.899 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:06.899 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:06.899 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:06.899 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:06.899 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:06.900 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:06.900 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:06.900 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:06.900 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:06.900 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:07.158 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:07.158 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:07.158 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:27:07.158 00:21:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:07.158 [2024-07-16 00:21:54.082146] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:07.158 00:21:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=c15b52cd-fa83-4da4-b304-f1e353d6ea4d 00:27:07.416 00:21:54 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z c15b52cd-fa83-4da4-b304-f1e353d6ea4d ']' 00:27:07.416 00:21:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:07.416 [2024-07-16 00:21:54.334574] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:07.416 [2024-07-16 00:21:54.334592] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:07.416 [2024-07-16 00:21:54.334641] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:07.416 [2024-07-16 00:21:54.334695] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:07.416 [2024-07-16 00:21:54.334706] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1682270 name raid_bdev1, state offline 00:27:07.674 00:21:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:07.674 00:21:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:27:07.674 00:21:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:27:07.674 00:21:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:27:07.674 00:21:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:07.674 00:21:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:07.933 00:21:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:07.933 00:21:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:08.192 00:21:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:27:08.192 00:21:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:27:08.451 00:21:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:27:08.451 00:21:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:08.451 00:21:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:27:08.451 00:21:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:08.451 00:21:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:08.451 00:21:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:08.451 00:21:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:08.451 00:21:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:08.451 00:21:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:08.451 00:21:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- 
# case "$(type -t "$arg")" in 00:27:08.451 00:21:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:08.451 00:21:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:08.451 00:21:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:08.711 [2024-07-16 00:21:55.585849] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:27:08.711 [2024-07-16 00:21:55.587308] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:27:08.711 [2024-07-16 00:21:55.587366] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:27:08.711 [2024-07-16 00:21:55.587409] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:27:08.711 [2024-07-16 00:21:55.587428] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:08.711 [2024-07-16 00:21:55.587437] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1681ff0 name raid_bdev1, state configuring 00:27:08.711 request: 00:27:08.711 { 00:27:08.711 "name": "raid_bdev1", 00:27:08.711 "raid_level": "raid1", 00:27:08.711 "base_bdevs": [ 00:27:08.711 "malloc1", 00:27:08.711 "malloc2" 00:27:08.711 ], 00:27:08.711 "superblock": false, 00:27:08.711 "method": "bdev_raid_create", 00:27:08.711 "req_id": 1 00:27:08.711 } 00:27:08.711 Got JSON-RPC error response 00:27:08.711 response: 00:27:08.711 { 00:27:08.711 "code": -17, 00:27:08.711 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:27:08.711 } 00:27:08.711 00:21:55 
bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # es=1 00:27:08.711 00:21:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:08.711 00:21:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:08.711 00:21:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:08.711 00:21:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:08.711 00:21:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:27:08.970 00:21:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:27:08.971 00:21:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:27:08.971 00:21:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:09.230 [2024-07-16 00:21:56.087104] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:09.230 [2024-07-16 00:21:56.087146] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:09.230 [2024-07-16 00:21:56.087166] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14de7a0 00:27:09.230 [2024-07-16 00:21:56.087178] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:09.230 [2024-07-16 00:21:56.088749] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:09.230 [2024-07-16 00:21:56.088776] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:09.230 [2024-07-16 00:21:56.088840] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 
00:27:09.230 [2024-07-16 00:21:56.088866] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:09.230 pt1 00:27:09.230 00:21:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:27:09.230 00:21:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:09.230 00:21:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:09.230 00:21:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:09.230 00:21:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:09.230 00:21:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:09.230 00:21:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:09.230 00:21:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:09.230 00:21:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:09.230 00:21:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:09.230 00:21:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:09.230 00:21:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:09.490 00:21:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:09.490 "name": "raid_bdev1", 00:27:09.490 "uuid": "c15b52cd-fa83-4da4-b304-f1e353d6ea4d", 00:27:09.490 "strip_size_kb": 0, 00:27:09.490 "state": "configuring", 00:27:09.490 "raid_level": "raid1", 00:27:09.490 "superblock": true, 00:27:09.490 "num_base_bdevs": 2, 00:27:09.490 
"num_base_bdevs_discovered": 1, 00:27:09.490 "num_base_bdevs_operational": 2, 00:27:09.490 "base_bdevs_list": [ 00:27:09.490 { 00:27:09.490 "name": "pt1", 00:27:09.490 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:09.490 "is_configured": true, 00:27:09.490 "data_offset": 256, 00:27:09.490 "data_size": 7936 00:27:09.490 }, 00:27:09.490 { 00:27:09.490 "name": null, 00:27:09.490 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:09.490 "is_configured": false, 00:27:09.490 "data_offset": 256, 00:27:09.490 "data_size": 7936 00:27:09.490 } 00:27:09.490 ] 00:27:09.490 }' 00:27:09.490 00:21:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:09.490 00:21:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:10.058 00:21:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:27:10.058 00:21:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:27:10.058 00:21:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:10.058 00:21:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:10.316 [2024-07-16 00:21:57.190107] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:10.316 [2024-07-16 00:21:57.190156] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:10.317 [2024-07-16 00:21:57.190172] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16766f0 00:27:10.317 [2024-07-16 00:21:57.190185] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:10.317 [2024-07-16 00:21:57.190548] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:10.317 [2024-07-16 00:21:57.190567] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:10.317 [2024-07-16 00:21:57.190628] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:10.317 [2024-07-16 00:21:57.190648] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:10.317 [2024-07-16 00:21:57.190745] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1677590 00:27:10.317 [2024-07-16 00:21:57.190755] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:10.317 [2024-07-16 00:21:57.190924] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14d8540 00:27:10.317 [2024-07-16 00:21:57.191062] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1677590 00:27:10.317 [2024-07-16 00:21:57.191072] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1677590 00:27:10.317 [2024-07-16 00:21:57.191167] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:10.317 pt2 00:27:10.317 00:21:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:27:10.317 00:21:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:10.317 00:21:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:10.317 00:21:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:10.317 00:21:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:10.317 00:21:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:10.317 00:21:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:10.317 00:21:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:27:10.317 00:21:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:10.317 00:21:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:10.317 00:21:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:10.317 00:21:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:10.317 00:21:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:10.317 00:21:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:10.576 00:21:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:10.576 "name": "raid_bdev1", 00:27:10.576 "uuid": "c15b52cd-fa83-4da4-b304-f1e353d6ea4d", 00:27:10.576 "strip_size_kb": 0, 00:27:10.576 "state": "online", 00:27:10.576 "raid_level": "raid1", 00:27:10.576 "superblock": true, 00:27:10.576 "num_base_bdevs": 2, 00:27:10.576 "num_base_bdevs_discovered": 2, 00:27:10.576 "num_base_bdevs_operational": 2, 00:27:10.576 "base_bdevs_list": [ 00:27:10.576 { 00:27:10.576 "name": "pt1", 00:27:10.576 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:10.576 "is_configured": true, 00:27:10.576 "data_offset": 256, 00:27:10.576 "data_size": 7936 00:27:10.576 }, 00:27:10.576 { 00:27:10.576 "name": "pt2", 00:27:10.576 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:10.576 "is_configured": true, 00:27:10.576 "data_offset": 256, 00:27:10.576 "data_size": 7936 00:27:10.576 } 00:27:10.576 ] 00:27:10.576 }' 00:27:10.576 00:21:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:10.576 00:21:57 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:11.176 00:21:58 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:27:11.176 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:11.176 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:11.176 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:11.176 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:11.176 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:11.176 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:11.176 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:11.434 [2024-07-16 00:21:58.305320] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:11.434 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:11.434 "name": "raid_bdev1", 00:27:11.434 "aliases": [ 00:27:11.434 "c15b52cd-fa83-4da4-b304-f1e353d6ea4d" 00:27:11.434 ], 00:27:11.434 "product_name": "Raid Volume", 00:27:11.434 "block_size": 4096, 00:27:11.434 "num_blocks": 7936, 00:27:11.434 "uuid": "c15b52cd-fa83-4da4-b304-f1e353d6ea4d", 00:27:11.434 "assigned_rate_limits": { 00:27:11.434 "rw_ios_per_sec": 0, 00:27:11.434 "rw_mbytes_per_sec": 0, 00:27:11.434 "r_mbytes_per_sec": 0, 00:27:11.434 "w_mbytes_per_sec": 0 00:27:11.434 }, 00:27:11.434 "claimed": false, 00:27:11.434 "zoned": false, 00:27:11.435 "supported_io_types": { 00:27:11.435 "read": true, 00:27:11.435 "write": true, 00:27:11.435 "unmap": false, 00:27:11.435 "flush": false, 00:27:11.435 "reset": true, 00:27:11.435 "nvme_admin": false, 00:27:11.435 "nvme_io": false, 00:27:11.435 "nvme_io_md": false, 
00:27:11.435 "write_zeroes": true, 00:27:11.435 "zcopy": false, 00:27:11.435 "get_zone_info": false, 00:27:11.435 "zone_management": false, 00:27:11.435 "zone_append": false, 00:27:11.435 "compare": false, 00:27:11.435 "compare_and_write": false, 00:27:11.435 "abort": false, 00:27:11.435 "seek_hole": false, 00:27:11.435 "seek_data": false, 00:27:11.435 "copy": false, 00:27:11.435 "nvme_iov_md": false 00:27:11.435 }, 00:27:11.435 "memory_domains": [ 00:27:11.435 { 00:27:11.435 "dma_device_id": "system", 00:27:11.435 "dma_device_type": 1 00:27:11.435 }, 00:27:11.435 { 00:27:11.435 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:11.435 "dma_device_type": 2 00:27:11.435 }, 00:27:11.435 { 00:27:11.435 "dma_device_id": "system", 00:27:11.435 "dma_device_type": 1 00:27:11.435 }, 00:27:11.435 { 00:27:11.435 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:11.435 "dma_device_type": 2 00:27:11.435 } 00:27:11.435 ], 00:27:11.435 "driver_specific": { 00:27:11.435 "raid": { 00:27:11.435 "uuid": "c15b52cd-fa83-4da4-b304-f1e353d6ea4d", 00:27:11.435 "strip_size_kb": 0, 00:27:11.435 "state": "online", 00:27:11.435 "raid_level": "raid1", 00:27:11.435 "superblock": true, 00:27:11.435 "num_base_bdevs": 2, 00:27:11.435 "num_base_bdevs_discovered": 2, 00:27:11.435 "num_base_bdevs_operational": 2, 00:27:11.435 "base_bdevs_list": [ 00:27:11.435 { 00:27:11.435 "name": "pt1", 00:27:11.435 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:11.435 "is_configured": true, 00:27:11.435 "data_offset": 256, 00:27:11.435 "data_size": 7936 00:27:11.435 }, 00:27:11.435 { 00:27:11.435 "name": "pt2", 00:27:11.435 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:11.435 "is_configured": true, 00:27:11.435 "data_offset": 256, 00:27:11.435 "data_size": 7936 00:27:11.435 } 00:27:11.435 ] 00:27:11.435 } 00:27:11.435 } 00:27:11.435 }' 00:27:11.435 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == 
true).name' 00:27:11.435 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:11.435 pt2' 00:27:11.435 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:11.435 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:11.435 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:11.693 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:11.693 "name": "pt1", 00:27:11.693 "aliases": [ 00:27:11.693 "00000000-0000-0000-0000-000000000001" 00:27:11.693 ], 00:27:11.693 "product_name": "passthru", 00:27:11.693 "block_size": 4096, 00:27:11.693 "num_blocks": 8192, 00:27:11.693 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:11.693 "assigned_rate_limits": { 00:27:11.693 "rw_ios_per_sec": 0, 00:27:11.693 "rw_mbytes_per_sec": 0, 00:27:11.693 "r_mbytes_per_sec": 0, 00:27:11.693 "w_mbytes_per_sec": 0 00:27:11.693 }, 00:27:11.693 "claimed": true, 00:27:11.693 "claim_type": "exclusive_write", 00:27:11.693 "zoned": false, 00:27:11.693 "supported_io_types": { 00:27:11.693 "read": true, 00:27:11.693 "write": true, 00:27:11.693 "unmap": true, 00:27:11.693 "flush": true, 00:27:11.693 "reset": true, 00:27:11.693 "nvme_admin": false, 00:27:11.693 "nvme_io": false, 00:27:11.693 "nvme_io_md": false, 00:27:11.693 "write_zeroes": true, 00:27:11.693 "zcopy": true, 00:27:11.693 "get_zone_info": false, 00:27:11.693 "zone_management": false, 00:27:11.693 "zone_append": false, 00:27:11.693 "compare": false, 00:27:11.693 "compare_and_write": false, 00:27:11.693 "abort": true, 00:27:11.693 "seek_hole": false, 00:27:11.693 "seek_data": false, 00:27:11.693 "copy": true, 00:27:11.693 "nvme_iov_md": false 00:27:11.693 }, 00:27:11.693 "memory_domains": [ 00:27:11.693 { 00:27:11.693 
"dma_device_id": "system", 00:27:11.693 "dma_device_type": 1 00:27:11.693 }, 00:27:11.693 { 00:27:11.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:11.693 "dma_device_type": 2 00:27:11.693 } 00:27:11.693 ], 00:27:11.693 "driver_specific": { 00:27:11.693 "passthru": { 00:27:11.693 "name": "pt1", 00:27:11.693 "base_bdev_name": "malloc1" 00:27:11.693 } 00:27:11.693 } 00:27:11.693 }' 00:27:11.693 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:11.951 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:11.951 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:11.951 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:11.951 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:11.951 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:11.951 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:11.951 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:11.951 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:11.951 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:12.209 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:12.209 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:12.209 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:12.209 00:21:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:12.209 00:21:58 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:12.466 00:21:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:12.466 "name": "pt2", 00:27:12.466 "aliases": [ 00:27:12.466 "00000000-0000-0000-0000-000000000002" 00:27:12.466 ], 00:27:12.466 "product_name": "passthru", 00:27:12.466 "block_size": 4096, 00:27:12.466 "num_blocks": 8192, 00:27:12.466 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:12.466 "assigned_rate_limits": { 00:27:12.466 "rw_ios_per_sec": 0, 00:27:12.466 "rw_mbytes_per_sec": 0, 00:27:12.466 "r_mbytes_per_sec": 0, 00:27:12.466 "w_mbytes_per_sec": 0 00:27:12.466 }, 00:27:12.466 "claimed": true, 00:27:12.466 "claim_type": "exclusive_write", 00:27:12.466 "zoned": false, 00:27:12.466 "supported_io_types": { 00:27:12.466 "read": true, 00:27:12.466 "write": true, 00:27:12.466 "unmap": true, 00:27:12.466 "flush": true, 00:27:12.466 "reset": true, 00:27:12.466 "nvme_admin": false, 00:27:12.466 "nvme_io": false, 00:27:12.466 "nvme_io_md": false, 00:27:12.466 "write_zeroes": true, 00:27:12.466 "zcopy": true, 00:27:12.466 "get_zone_info": false, 00:27:12.466 "zone_management": false, 00:27:12.466 "zone_append": false, 00:27:12.466 "compare": false, 00:27:12.466 "compare_and_write": false, 00:27:12.466 "abort": true, 00:27:12.466 "seek_hole": false, 00:27:12.466 "seek_data": false, 00:27:12.466 "copy": true, 00:27:12.466 "nvme_iov_md": false 00:27:12.466 }, 00:27:12.466 "memory_domains": [ 00:27:12.466 { 00:27:12.466 "dma_device_id": "system", 00:27:12.466 "dma_device_type": 1 00:27:12.466 }, 00:27:12.466 { 00:27:12.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:12.466 "dma_device_type": 2 00:27:12.466 } 00:27:12.466 ], 00:27:12.466 "driver_specific": { 00:27:12.466 "passthru": { 00:27:12.466 "name": "pt2", 00:27:12.466 "base_bdev_name": "malloc2" 00:27:12.466 } 00:27:12.466 } 00:27:12.466 }' 00:27:12.466 00:21:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:12.466 00:21:59 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:12.466 00:21:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:12.466 00:21:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:12.466 00:21:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:12.723 00:21:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:12.723 00:21:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:12.723 00:21:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:12.723 00:21:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:12.723 00:21:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:12.723 00:21:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:12.723 00:21:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:12.723 00:21:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:12.723 00:21:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:27:12.981 [2024-07-16 00:21:59.833352] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:12.981 00:21:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' c15b52cd-fa83-4da4-b304-f1e353d6ea4d '!=' c15b52cd-fa83-4da4-b304-f1e353d6ea4d ']' 00:27:12.981 00:21:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:27:12.981 00:21:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:12.981 00:21:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # 
return 0 00:27:12.981 00:21:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:13.239 [2024-07-16 00:22:00.085807] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:27:13.239 00:22:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:13.239 00:22:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:13.239 00:22:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:13.239 00:22:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:13.239 00:22:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:13.239 00:22:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:13.239 00:22:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:13.239 00:22:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:13.240 00:22:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:13.240 00:22:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:13.240 00:22:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:13.240 00:22:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:13.509 00:22:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:13.509 "name": "raid_bdev1", 00:27:13.509 "uuid": "c15b52cd-fa83-4da4-b304-f1e353d6ea4d", 00:27:13.509 
"strip_size_kb": 0, 00:27:13.509 "state": "online", 00:27:13.509 "raid_level": "raid1", 00:27:13.509 "superblock": true, 00:27:13.509 "num_base_bdevs": 2, 00:27:13.509 "num_base_bdevs_discovered": 1, 00:27:13.509 "num_base_bdevs_operational": 1, 00:27:13.509 "base_bdevs_list": [ 00:27:13.509 { 00:27:13.509 "name": null, 00:27:13.509 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:13.509 "is_configured": false, 00:27:13.509 "data_offset": 256, 00:27:13.509 "data_size": 7936 00:27:13.509 }, 00:27:13.509 { 00:27:13.509 "name": "pt2", 00:27:13.509 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:13.509 "is_configured": true, 00:27:13.509 "data_offset": 256, 00:27:13.509 "data_size": 7936 00:27:13.509 } 00:27:13.509 ] 00:27:13.509 }' 00:27:13.509 00:22:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:13.509 00:22:00 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:14.073 00:22:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:14.331 [2024-07-16 00:22:01.180665] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:14.331 [2024-07-16 00:22:01.180693] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:14.331 [2024-07-16 00:22:01.180745] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:14.331 [2024-07-16 00:22:01.180784] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:14.331 [2024-07-16 00:22:01.180795] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1677590 name raid_bdev1, state offline 00:27:14.331 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:27:14.331 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:27:14.588 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:27:14.588 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:27:14.588 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:27:14.588 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:14.588 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:14.845 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:27:14.845 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:14.845 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:27:14.845 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:27:14.845 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:27:14.845 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:15.103 [2024-07-16 00:22:01.934641] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:15.103 [2024-07-16 00:22:01.934687] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:15.103 [2024-07-16 00:22:01.934704] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14df160 00:27:15.103 [2024-07-16 00:22:01.934717] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:15.103 [2024-07-16 
00:22:01.936313] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:15.103 [2024-07-16 00:22:01.936341] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:15.103 [2024-07-16 00:22:01.936401] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:15.103 [2024-07-16 00:22:01.936427] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:15.103 [2024-07-16 00:22:01.936509] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14d5380 00:27:15.103 [2024-07-16 00:22:01.936519] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:15.103 [2024-07-16 00:22:01.936682] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14d6a80 00:27:15.103 [2024-07-16 00:22:01.936801] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14d5380 00:27:15.103 [2024-07-16 00:22:01.936811] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14d5380 00:27:15.103 [2024-07-16 00:22:01.936903] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:15.103 pt2 00:27:15.103 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:15.103 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:15.103 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:15.103 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:15.103 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:15.103 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:15.103 00:22:01 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:15.103 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:15.103 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:15.103 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:15.103 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:15.103 00:22:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:15.362 00:22:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:15.362 "name": "raid_bdev1", 00:27:15.362 "uuid": "c15b52cd-fa83-4da4-b304-f1e353d6ea4d", 00:27:15.362 "strip_size_kb": 0, 00:27:15.362 "state": "online", 00:27:15.362 "raid_level": "raid1", 00:27:15.362 "superblock": true, 00:27:15.362 "num_base_bdevs": 2, 00:27:15.362 "num_base_bdevs_discovered": 1, 00:27:15.362 "num_base_bdevs_operational": 1, 00:27:15.362 "base_bdevs_list": [ 00:27:15.362 { 00:27:15.362 "name": null, 00:27:15.362 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:15.362 "is_configured": false, 00:27:15.362 "data_offset": 256, 00:27:15.362 "data_size": 7936 00:27:15.362 }, 00:27:15.362 { 00:27:15.362 "name": "pt2", 00:27:15.362 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:15.362 "is_configured": true, 00:27:15.362 "data_offset": 256, 00:27:15.362 "data_size": 7936 00:27:15.362 } 00:27:15.362 ] 00:27:15.362 }' 00:27:15.362 00:22:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:15.362 00:22:02 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:15.930 00:22:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:16.188 [2024-07-16 00:22:03.029621] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:16.188 [2024-07-16 00:22:03.029647] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:16.188 [2024-07-16 00:22:03.029700] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:16.188 [2024-07-16 00:22:03.029742] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:16.188 [2024-07-16 00:22:03.029754] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14d5380 name raid_bdev1, state offline 00:27:16.188 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:16.188 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:27:16.446 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:27:16.446 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:27:16.446 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:27:16.446 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:16.707 [2024-07-16 00:22:03.526921] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:16.707 [2024-07-16 00:22:03.526974] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:16.707 [2024-07-16 00:22:03.526993] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 
0x0x1681520 00:27:16.707 [2024-07-16 00:22:03.527005] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:16.707 [2024-07-16 00:22:03.528618] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:16.707 [2024-07-16 00:22:03.528645] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:16.707 [2024-07-16 00:22:03.528708] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:16.707 [2024-07-16 00:22:03.528733] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:16.707 [2024-07-16 00:22:03.528830] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:27:16.707 [2024-07-16 00:22:03.528842] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:16.707 [2024-07-16 00:22:03.528855] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14d63f0 name raid_bdev1, state configuring 00:27:16.707 [2024-07-16 00:22:03.528884] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:16.707 [2024-07-16 00:22:03.528950] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14d82b0 00:27:16.707 [2024-07-16 00:22:03.528961] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:16.707 [2024-07-16 00:22:03.529124] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14d5350 00:27:16.707 [2024-07-16 00:22:03.529245] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14d82b0 00:27:16.707 [2024-07-16 00:22:03.529254] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14d82b0 00:27:16.707 [2024-07-16 00:22:03.529352] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:16.707 pt1 00:27:16.707 00:22:03 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:27:16.707 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:16.707 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:16.707 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:16.707 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:16.707 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:16.707 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:16.707 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:16.707 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:16.707 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:16.707 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:16.707 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:16.707 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:16.966 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:16.966 "name": "raid_bdev1", 00:27:16.966 "uuid": "c15b52cd-fa83-4da4-b304-f1e353d6ea4d", 00:27:16.966 "strip_size_kb": 0, 00:27:16.966 "state": "online", 00:27:16.966 "raid_level": "raid1", 00:27:16.966 "superblock": true, 00:27:16.966 "num_base_bdevs": 2, 00:27:16.966 "num_base_bdevs_discovered": 1, 00:27:16.966 "num_base_bdevs_operational": 
1, 00:27:16.966 "base_bdevs_list": [ 00:27:16.966 { 00:27:16.966 "name": null, 00:27:16.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:16.966 "is_configured": false, 00:27:16.966 "data_offset": 256, 00:27:16.966 "data_size": 7936 00:27:16.966 }, 00:27:16.966 { 00:27:16.966 "name": "pt2", 00:27:16.966 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:16.966 "is_configured": true, 00:27:16.966 "data_offset": 256, 00:27:16.966 "data_size": 7936 00:27:16.966 } 00:27:16.966 ] 00:27:16.966 }' 00:27:16.966 00:22:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:16.966 00:22:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:17.532 00:22:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:27:17.532 00:22:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:27:17.791 00:22:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:27:17.791 00:22:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:17.791 00:22:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:27:18.050 [2024-07-16 00:22:04.826576] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:18.050 00:22:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' c15b52cd-fa83-4da4-b304-f1e353d6ea4d '!=' c15b52cd-fa83-4da4-b304-f1e353d6ea4d ']' 00:27:18.050 00:22:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 3632530 00:27:18.050 00:22:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- # '[' -z 3632530 ']' 00:27:18.050 00:22:04 
bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 3632530 00:27:18.050 00:22:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:27:18.050 00:22:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:18.050 00:22:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3632530 00:27:18.050 00:22:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:18.050 00:22:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:18.050 00:22:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3632530' 00:27:18.050 killing process with pid 3632530 00:27:18.050 00:22:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 3632530 00:27:18.050 [2024-07-16 00:22:04.901198] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:18.050 [2024-07-16 00:22:04.901250] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:18.050 [2024-07-16 00:22:04.901291] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:18.050 [2024-07-16 00:22:04.901302] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14d82b0 name raid_bdev1, state offline 00:27:18.050 00:22:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 3632530 00:27:18.050 [2024-07-16 00:22:04.917892] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:18.309 00:22:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:27:18.309 00:27:18.309 real 0m15.841s 00:27:18.309 user 0m28.622s 00:27:18.309 sys 0m3.041s 00:27:18.309 00:22:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:18.309 
00:22:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:18.309 ************************************ 00:27:18.309 END TEST raid_superblock_test_4k 00:27:18.309 ************************************ 00:27:18.309 00:22:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:18.309 00:22:05 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:27:18.309 00:22:05 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:27:18.309 00:22:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:27:18.309 00:22:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:18.309 00:22:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:18.309 ************************************ 00:27:18.309 START TEST raid_rebuild_test_sb_4k 00:27:18.309 ************************************ 00:27:18.309 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:27:18.309 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:27:18.309 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:27:18.309 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:27:18.309 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:27:18.309 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:27:18.309 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:27:18.309 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:18.309 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:27:18.309 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 
-- # (( i++ )) 00:27:18.309 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:18.309 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:27:18.309 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:18.309 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:18.309 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:18.309 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:27:18.309 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:27:18.309 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:27:18.310 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:27:18.310 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:27:18.310 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:27:18.310 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:27:18.310 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:27:18.310 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:27:18.310 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:27:18.310 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=3634778 00:27:18.310 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 3634778 /var/tmp/spdk-raid.sock 00:27:18.310 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:18.310 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 3634778 ']' 00:27:18.310 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:18.310 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:18.310 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:18.310 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:18.310 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:18.310 00:22:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:18.569 [2024-07-16 00:22:05.295107] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:27:18.569 [2024-07-16 00:22:05.295166] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3634778 ] 00:27:18.569 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:18.569 Zero copy mechanism will not be used. 
00:27:18.569 [2024-07-16 00:22:05.409148] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:18.569 [2024-07-16 00:22:05.514219] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:18.827 [2024-07-16 00:22:05.577470] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:18.828 [2024-07-16 00:22:05.577514] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:19.394 00:22:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:19.394 00:22:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:27:19.394 00:22:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:19.394 00:22:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:27:19.653 BaseBdev1_malloc 00:27:19.653 00:22:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:19.911 [2024-07-16 00:22:06.715242] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:19.911 [2024-07-16 00:22:06.715292] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:19.911 [2024-07-16 00:22:06.715320] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa79d40 00:27:19.911 [2024-07-16 00:22:06.715333] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:19.911 [2024-07-16 00:22:06.717025] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:19.911 [2024-07-16 00:22:06.717055] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:19.911 
BaseBdev1 00:27:19.911 00:22:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:19.911 00:22:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:27:20.169 BaseBdev2_malloc 00:27:20.169 00:22:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:20.427 [2024-07-16 00:22:07.241635] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:20.427 [2024-07-16 00:22:07.241683] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:20.427 [2024-07-16 00:22:07.241705] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa7a860 00:27:20.427 [2024-07-16 00:22:07.241718] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:20.427 [2024-07-16 00:22:07.243170] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:20.427 [2024-07-16 00:22:07.243199] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:20.427 BaseBdev2 00:27:20.427 00:22:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:27:20.685 spare_malloc 00:27:20.685 00:22:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:20.943 spare_delay 00:27:20.943 00:22:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:21.202 [2024-07-16 00:22:08.024405] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:21.202 [2024-07-16 00:22:08.024452] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:21.202 [2024-07-16 00:22:08.024472] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc28ec0 00:27:21.202 [2024-07-16 00:22:08.024485] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:21.202 [2024-07-16 00:22:08.025914] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:21.202 [2024-07-16 00:22:08.025950] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:21.202 spare 00:27:21.202 00:22:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:27:21.460 [2024-07-16 00:22:08.273088] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:21.460 [2024-07-16 00:22:08.274259] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:21.460 [2024-07-16 00:22:08.274420] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc2a070 00:27:21.460 [2024-07-16 00:22:08.274433] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:21.460 [2024-07-16 00:22:08.274616] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc23490 00:27:21.460 [2024-07-16 00:22:08.274750] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc2a070 00:27:21.461 [2024-07-16 00:22:08.274760] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0xc2a070 00:27:21.461 [2024-07-16 00:22:08.274851] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:21.461 00:22:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:21.461 00:22:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:21.461 00:22:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:21.461 00:22:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:21.461 00:22:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:21.461 00:22:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:21.461 00:22:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:21.461 00:22:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:21.461 00:22:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:21.461 00:22:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:21.461 00:22:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:21.461 00:22:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:21.720 00:22:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:21.720 "name": "raid_bdev1", 00:27:21.720 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:21.720 "strip_size_kb": 0, 00:27:21.720 "state": "online", 00:27:21.720 "raid_level": "raid1", 00:27:21.720 "superblock": true, 00:27:21.720 "num_base_bdevs": 2, 00:27:21.720 
"num_base_bdevs_discovered": 2, 00:27:21.720 "num_base_bdevs_operational": 2, 00:27:21.720 "base_bdevs_list": [ 00:27:21.720 { 00:27:21.720 "name": "BaseBdev1", 00:27:21.720 "uuid": "fdf6ef36-f83b-53c5-8da4-e404c5517244", 00:27:21.720 "is_configured": true, 00:27:21.720 "data_offset": 256, 00:27:21.720 "data_size": 7936 00:27:21.720 }, 00:27:21.720 { 00:27:21.720 "name": "BaseBdev2", 00:27:21.720 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:21.720 "is_configured": true, 00:27:21.720 "data_offset": 256, 00:27:21.720 "data_size": 7936 00:27:21.720 } 00:27:21.720 ] 00:27:21.720 }' 00:27:21.720 00:22:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:21.720 00:22:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:22.286 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:27:22.286 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:22.544 [2024-07-16 00:22:09.344151] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:22.544 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:27:22.544 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:22.544 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:22.802 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:27:22.802 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:27:22.802 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:27:22.802 
00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:27:22.802 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:27:22.802 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:22.802 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:27:22.802 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:22.802 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:22.802 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:22.802 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:27:22.802 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:22.802 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:22.802 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:27:23.061 [2024-07-16 00:22:09.865326] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc23490 00:27:23.061 /dev/nbd0 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:23.061 00:22:09 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:23.061 1+0 records in 00:27:23.061 1+0 records out 00:27:23.061 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000282129 s, 14.5 MB/s 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:27:23.061 00:22:09 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:27:23.996 7936+0 records in 00:27:23.996 7936+0 records out 00:27:23.996 32505856 bytes (33 MB, 31 MiB) copied, 0.73765 s, 44.1 MB/s 00:27:23.996 00:22:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:23.996 00:22:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:23.996 00:22:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:23.996 00:22:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:23.996 00:22:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:27:23.996 00:22:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:23.996 00:22:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:23.996 [2024-07-16 00:22:10.933349] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:23.996 00:22:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:23.996 00:22:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:23.996 00:22:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:23.996 00:22:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:23.996 00:22:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:23.996 00:22:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:23.996 00:22:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:23.996 00:22:10 bdev_raid.raid_rebuild_test_sb_4k 
-- bdev/nbd_common.sh@45 -- # return 0 00:27:24.254 00:22:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:24.254 [2024-07-16 00:22:11.170042] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:24.254 00:22:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:24.254 00:22:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:24.254 00:22:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:24.254 00:22:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:24.255 00:22:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:24.255 00:22:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:24.255 00:22:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:24.255 00:22:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:24.255 00:22:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:24.255 00:22:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:24.513 00:22:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:24.513 00:22:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:24.771 00:22:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:24.771 "name": "raid_bdev1", 00:27:24.771 "uuid": 
"aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:24.771 "strip_size_kb": 0, 00:27:24.771 "state": "online", 00:27:24.771 "raid_level": "raid1", 00:27:24.771 "superblock": true, 00:27:24.771 "num_base_bdevs": 2, 00:27:24.771 "num_base_bdevs_discovered": 1, 00:27:24.771 "num_base_bdevs_operational": 1, 00:27:24.771 "base_bdevs_list": [ 00:27:24.771 { 00:27:24.771 "name": null, 00:27:24.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:24.771 "is_configured": false, 00:27:24.771 "data_offset": 256, 00:27:24.771 "data_size": 7936 00:27:24.771 }, 00:27:24.771 { 00:27:24.771 "name": "BaseBdev2", 00:27:24.771 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:24.771 "is_configured": true, 00:27:24.771 "data_offset": 256, 00:27:24.771 "data_size": 7936 00:27:24.771 } 00:27:24.771 ] 00:27:24.771 }' 00:27:24.771 00:22:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:24.771 00:22:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:25.338 00:22:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:25.338 [2024-07-16 00:22:12.289002] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:25.596 [2024-07-16 00:22:12.294243] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc29ce0 00:27:25.596 [2024-07-16 00:22:12.296506] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:25.596 00:22:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:27:26.530 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:26.530 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:26.530 00:22:13 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:26.530 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:26.530 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:26.530 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:26.530 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:26.786 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:26.786 "name": "raid_bdev1", 00:27:26.786 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:26.786 "strip_size_kb": 0, 00:27:26.786 "state": "online", 00:27:26.786 "raid_level": "raid1", 00:27:26.786 "superblock": true, 00:27:26.786 "num_base_bdevs": 2, 00:27:26.786 "num_base_bdevs_discovered": 2, 00:27:26.786 "num_base_bdevs_operational": 2, 00:27:26.786 "process": { 00:27:26.786 "type": "rebuild", 00:27:26.786 "target": "spare", 00:27:26.786 "progress": { 00:27:26.786 "blocks": 3072, 00:27:26.786 "percent": 38 00:27:26.786 } 00:27:26.786 }, 00:27:26.786 "base_bdevs_list": [ 00:27:26.786 { 00:27:26.786 "name": "spare", 00:27:26.787 "uuid": "0379fe44-b7ea-5153-aac3-2cee89dca318", 00:27:26.787 "is_configured": true, 00:27:26.787 "data_offset": 256, 00:27:26.787 "data_size": 7936 00:27:26.787 }, 00:27:26.787 { 00:27:26.787 "name": "BaseBdev2", 00:27:26.787 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:26.787 "is_configured": true, 00:27:26.787 "data_offset": 256, 00:27:26.787 "data_size": 7936 00:27:26.787 } 00:27:26.787 ] 00:27:26.787 }' 00:27:26.787 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:26.787 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:26.787 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:26.787 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:26.787 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:27.044 [2024-07-16 00:22:13.846293] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:27.044 [2024-07-16 00:22:13.909234] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:27.044 [2024-07-16 00:22:13.909282] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:27.044 [2024-07-16 00:22:13.909298] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:27.044 [2024-07-16 00:22:13.909306] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:27.044 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:27.044 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:27.044 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:27.044 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:27.044 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:27.044 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:27.044 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:27.044 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:27.044 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:27.044 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:27.044 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:27.044 00:22:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:27.301 00:22:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:27.301 "name": "raid_bdev1", 00:27:27.301 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:27.301 "strip_size_kb": 0, 00:27:27.301 "state": "online", 00:27:27.301 "raid_level": "raid1", 00:27:27.301 "superblock": true, 00:27:27.301 "num_base_bdevs": 2, 00:27:27.301 "num_base_bdevs_discovered": 1, 00:27:27.301 "num_base_bdevs_operational": 1, 00:27:27.301 "base_bdevs_list": [ 00:27:27.301 { 00:27:27.301 "name": null, 00:27:27.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:27.301 "is_configured": false, 00:27:27.301 "data_offset": 256, 00:27:27.301 "data_size": 7936 00:27:27.301 }, 00:27:27.301 { 00:27:27.301 "name": "BaseBdev2", 00:27:27.301 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:27.301 "is_configured": true, 00:27:27.301 "data_offset": 256, 00:27:27.301 "data_size": 7936 00:27:27.301 } 00:27:27.301 ] 00:27:27.301 }' 00:27:27.301 00:22:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:27.301 00:22:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:27.904 00:22:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:27.904 00:22:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:27:27.904 00:22:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:27.904 00:22:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:27.904 00:22:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:27.904 00:22:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:27.904 00:22:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:28.162 00:22:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:28.162 "name": "raid_bdev1", 00:27:28.162 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:28.162 "strip_size_kb": 0, 00:27:28.162 "state": "online", 00:27:28.162 "raid_level": "raid1", 00:27:28.162 "superblock": true, 00:27:28.162 "num_base_bdevs": 2, 00:27:28.162 "num_base_bdevs_discovered": 1, 00:27:28.162 "num_base_bdevs_operational": 1, 00:27:28.162 "base_bdevs_list": [ 00:27:28.162 { 00:27:28.162 "name": null, 00:27:28.162 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:28.162 "is_configured": false, 00:27:28.162 "data_offset": 256, 00:27:28.162 "data_size": 7936 00:27:28.162 }, 00:27:28.162 { 00:27:28.162 "name": "BaseBdev2", 00:27:28.162 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:28.162 "is_configured": true, 00:27:28.162 "data_offset": 256, 00:27:28.162 "data_size": 7936 00:27:28.162 } 00:27:28.162 ] 00:27:28.162 }' 00:27:28.162 00:22:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:28.162 00:22:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:28.162 00:22:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
00:27:28.162 00:22:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:28.162 00:22:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:28.420 [2024-07-16 00:22:15.301741] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:28.420 [2024-07-16 00:22:15.307490] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc29ce0 00:27:28.420 [2024-07-16 00:22:15.309019] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:28.420 00:22:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:29.793 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:29.793 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:29.793 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:29.793 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:29.794 "name": "raid_bdev1", 00:27:29.794 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:29.794 "strip_size_kb": 0, 00:27:29.794 "state": "online", 00:27:29.794 
"raid_level": "raid1", 00:27:29.794 "superblock": true, 00:27:29.794 "num_base_bdevs": 2, 00:27:29.794 "num_base_bdevs_discovered": 2, 00:27:29.794 "num_base_bdevs_operational": 2, 00:27:29.794 "process": { 00:27:29.794 "type": "rebuild", 00:27:29.794 "target": "spare", 00:27:29.794 "progress": { 00:27:29.794 "blocks": 3072, 00:27:29.794 "percent": 38 00:27:29.794 } 00:27:29.794 }, 00:27:29.794 "base_bdevs_list": [ 00:27:29.794 { 00:27:29.794 "name": "spare", 00:27:29.794 "uuid": "0379fe44-b7ea-5153-aac3-2cee89dca318", 00:27:29.794 "is_configured": true, 00:27:29.794 "data_offset": 256, 00:27:29.794 "data_size": 7936 00:27:29.794 }, 00:27:29.794 { 00:27:29.794 "name": "BaseBdev2", 00:27:29.794 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:29.794 "is_configured": true, 00:27:29.794 "data_offset": 256, 00:27:29.794 "data_size": 7936 00:27:29.794 } 00:27:29.794 ] 00:27:29.794 }' 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:27:29.794 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=1046 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:29.794 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:30.052 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:30.052 "name": "raid_bdev1", 00:27:30.052 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:30.052 "strip_size_kb": 0, 00:27:30.052 "state": "online", 00:27:30.052 "raid_level": "raid1", 00:27:30.052 "superblock": true, 00:27:30.052 "num_base_bdevs": 2, 00:27:30.052 "num_base_bdevs_discovered": 2, 00:27:30.052 "num_base_bdevs_operational": 2, 00:27:30.052 "process": { 00:27:30.052 "type": "rebuild", 00:27:30.052 "target": "spare", 00:27:30.052 "progress": { 00:27:30.052 "blocks": 3840, 00:27:30.052 "percent": 48 00:27:30.052 } 00:27:30.052 }, 00:27:30.052 "base_bdevs_list": [ 00:27:30.052 { 00:27:30.052 "name": "spare", 00:27:30.052 "uuid": "0379fe44-b7ea-5153-aac3-2cee89dca318", 00:27:30.052 "is_configured": 
true, 00:27:30.052 "data_offset": 256, 00:27:30.052 "data_size": 7936 00:27:30.052 }, 00:27:30.052 { 00:27:30.052 "name": "BaseBdev2", 00:27:30.052 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:30.052 "is_configured": true, 00:27:30.052 "data_offset": 256, 00:27:30.052 "data_size": 7936 00:27:30.052 } 00:27:30.052 ] 00:27:30.052 }' 00:27:30.052 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:30.052 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:30.052 00:22:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:30.311 00:22:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:30.311 00:22:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:31.246 00:22:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:31.246 00:22:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:31.246 00:22:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:31.246 00:22:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:31.246 00:22:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:31.246 00:22:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:31.246 00:22:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.246 00:22:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:31.504 00:22:18 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:31.504 "name": "raid_bdev1", 00:27:31.504 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:31.504 "strip_size_kb": 0, 00:27:31.504 "state": "online", 00:27:31.504 "raid_level": "raid1", 00:27:31.504 "superblock": true, 00:27:31.504 "num_base_bdevs": 2, 00:27:31.504 "num_base_bdevs_discovered": 2, 00:27:31.504 "num_base_bdevs_operational": 2, 00:27:31.504 "process": { 00:27:31.504 "type": "rebuild", 00:27:31.504 "target": "spare", 00:27:31.504 "progress": { 00:27:31.504 "blocks": 7424, 00:27:31.504 "percent": 93 00:27:31.504 } 00:27:31.504 }, 00:27:31.504 "base_bdevs_list": [ 00:27:31.504 { 00:27:31.504 "name": "spare", 00:27:31.504 "uuid": "0379fe44-b7ea-5153-aac3-2cee89dca318", 00:27:31.504 "is_configured": true, 00:27:31.504 "data_offset": 256, 00:27:31.504 "data_size": 7936 00:27:31.504 }, 00:27:31.504 { 00:27:31.504 "name": "BaseBdev2", 00:27:31.504 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:31.504 "is_configured": true, 00:27:31.504 "data_offset": 256, 00:27:31.504 "data_size": 7936 00:27:31.504 } 00:27:31.504 ] 00:27:31.504 }' 00:27:31.504 00:22:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:31.504 00:22:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:31.504 00:22:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:31.504 00:22:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:31.504 00:22:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:31.504 [2024-07-16 00:22:18.433892] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:31.504 [2024-07-16 00:22:18.433979] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:31.504 [2024-07-16 00:22:18.434067] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:32.462 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:32.462 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:32.462 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:32.462 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:32.462 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:32.462 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:32.462 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:32.462 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:32.720 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:32.720 "name": "raid_bdev1", 00:27:32.720 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:32.720 "strip_size_kb": 0, 00:27:32.720 "state": "online", 00:27:32.720 "raid_level": "raid1", 00:27:32.720 "superblock": true, 00:27:32.720 "num_base_bdevs": 2, 00:27:32.720 "num_base_bdevs_discovered": 2, 00:27:32.720 "num_base_bdevs_operational": 2, 00:27:32.720 "base_bdevs_list": [ 00:27:32.720 { 00:27:32.720 "name": "spare", 00:27:32.720 "uuid": "0379fe44-b7ea-5153-aac3-2cee89dca318", 00:27:32.720 "is_configured": true, 00:27:32.720 "data_offset": 256, 00:27:32.720 "data_size": 7936 00:27:32.720 }, 00:27:32.720 { 00:27:32.720 "name": "BaseBdev2", 00:27:32.720 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:32.720 "is_configured": true, 00:27:32.720 "data_offset": 256, 00:27:32.720 
"data_size": 7936 00:27:32.720 } 00:27:32.720 ] 00:27:32.720 }' 00:27:32.720 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:32.979 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:32.979 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:32.979 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:32.979 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:27:32.979 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:32.979 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:32.979 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:32.979 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:32.979 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:32.979 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:32.979 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:33.237 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:33.237 "name": "raid_bdev1", 00:27:33.237 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:33.237 "strip_size_kb": 0, 00:27:33.237 "state": "online", 00:27:33.237 "raid_level": "raid1", 00:27:33.237 "superblock": true, 00:27:33.237 "num_base_bdevs": 2, 00:27:33.237 "num_base_bdevs_discovered": 2, 00:27:33.237 "num_base_bdevs_operational": 2, 00:27:33.237 
"base_bdevs_list": [ 00:27:33.237 { 00:27:33.237 "name": "spare", 00:27:33.237 "uuid": "0379fe44-b7ea-5153-aac3-2cee89dca318", 00:27:33.237 "is_configured": true, 00:27:33.237 "data_offset": 256, 00:27:33.237 "data_size": 7936 00:27:33.237 }, 00:27:33.237 { 00:27:33.237 "name": "BaseBdev2", 00:27:33.237 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:33.237 "is_configured": true, 00:27:33.237 "data_offset": 256, 00:27:33.237 "data_size": 7936 00:27:33.237 } 00:27:33.237 ] 00:27:33.237 }' 00:27:33.237 00:22:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:33.237 00:22:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:33.237 00:22:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:33.237 00:22:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:33.237 00:22:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:33.237 00:22:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:33.237 00:22:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:33.237 00:22:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:33.237 00:22:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:33.237 00:22:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:33.237 00:22:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:33.237 00:22:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:33.237 00:22:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:27:33.237 00:22:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:33.237 00:22:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:33.237 00:22:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:33.496 00:22:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:33.496 "name": "raid_bdev1", 00:27:33.496 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:33.496 "strip_size_kb": 0, 00:27:33.496 "state": "online", 00:27:33.496 "raid_level": "raid1", 00:27:33.496 "superblock": true, 00:27:33.496 "num_base_bdevs": 2, 00:27:33.496 "num_base_bdevs_discovered": 2, 00:27:33.496 "num_base_bdevs_operational": 2, 00:27:33.496 "base_bdevs_list": [ 00:27:33.496 { 00:27:33.496 "name": "spare", 00:27:33.496 "uuid": "0379fe44-b7ea-5153-aac3-2cee89dca318", 00:27:33.496 "is_configured": true, 00:27:33.496 "data_offset": 256, 00:27:33.496 "data_size": 7936 00:27:33.496 }, 00:27:33.496 { 00:27:33.496 "name": "BaseBdev2", 00:27:33.496 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:33.496 "is_configured": true, 00:27:33.496 "data_offset": 256, 00:27:33.496 "data_size": 7936 00:27:33.496 } 00:27:33.496 ] 00:27:33.496 }' 00:27:33.496 00:22:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:33.496 00:22:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:34.064 00:22:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:34.322 [2024-07-16 00:22:21.170315] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:34.322 [2024-07-16 00:22:21.170345] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:27:34.322 [2024-07-16 00:22:21.170405] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:34.322 [2024-07-16 00:22:21.170459] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:34.322 [2024-07-16 00:22:21.170471] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc2a070 name raid_bdev1, state offline 00:27:34.322 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:34.322 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:27:34.581 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:27:34.581 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:27:34.581 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:27:34.581 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:27:34.581 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:34.581 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:27:34.581 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:34.581 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:34.581 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:34.581 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:27:34.581 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 
0 )) 00:27:34.581 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:34.581 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:27:34.840 /dev/nbd0 00:27:34.840 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:34.840 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:34.840 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:34.840 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:27:34.840 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:34.840 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:34.840 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:34.840 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:27:34.840 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:34.840 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:34.840 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:34.840 1+0 records in 00:27:34.840 1+0 records out 00:27:34.840 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000262669 s, 15.6 MB/s 00:27:34.840 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:34.840 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@884 -- # size=4096 00:27:34.840 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:34.840 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:34.840 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:27:34.840 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:34.840 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:34.840 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:27:35.099 /dev/nbd1 00:27:35.099 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:35.099 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:35.099 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:35.099 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:27:35.099 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:35.099 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:35.099 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:35.099 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:27:35.099 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:35.099 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:35.099 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:35.099 1+0 records in 00:27:35.099 1+0 records out 00:27:35.099 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000348402 s, 11.8 MB/s 00:27:35.099 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:35.099 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:27:35.099 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:35.099 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:35.099 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:27:35.099 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:35.099 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:35.099 00:22:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:35.099 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:27:35.099 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:35.099 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:35.099 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:35.099 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:27:35.099 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:35.099 00:22:22 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:35.357 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:35.357 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:35.357 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:35.357 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:35.357 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:35.357 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:35.357 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:35.357 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:27:35.357 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:35.357 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:35.615 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:35.615 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:35.615 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:35.615 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:35.615 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:35.615 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:35.873 00:22:22 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:35.873 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:27:35.873 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:27:35.873 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:35.873 00:22:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:36.131 [2024-07-16 00:22:22.984666] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:36.131 [2024-07-16 00:22:22.984716] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:36.131 [2024-07-16 00:22:22.984737] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc29500 00:27:36.131 [2024-07-16 00:22:22.984750] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:36.131 [2024-07-16 00:22:22.986404] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:36.131 [2024-07-16 00:22:22.986435] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:36.131 [2024-07-16 00:22:22.986521] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:36.131 [2024-07-16 00:22:22.986547] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:36.131 [2024-07-16 00:22:22.986649] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:36.131 spare 00:27:36.131 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:36.131 00:22:23 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:36.131 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:36.131 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:36.131 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:36.131 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:36.131 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:36.131 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:36.131 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:36.131 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:36.131 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:36.131 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:36.389 [2024-07-16 00:22:23.086968] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc28260 00:27:36.389 [2024-07-16 00:22:23.086991] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:36.389 [2024-07-16 00:22:23.087207] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc22f50 00:27:36.389 [2024-07-16 00:22:23.087367] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc28260 00:27:36.389 [2024-07-16 00:22:23.087377] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc28260 00:27:36.389 [2024-07-16 00:22:23.087487] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:36.389 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:36.389 "name": "raid_bdev1", 00:27:36.389 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:36.389 "strip_size_kb": 0, 00:27:36.389 "state": "online", 00:27:36.389 "raid_level": "raid1", 00:27:36.389 "superblock": true, 00:27:36.389 "num_base_bdevs": 2, 00:27:36.389 "num_base_bdevs_discovered": 2, 00:27:36.389 "num_base_bdevs_operational": 2, 00:27:36.389 "base_bdevs_list": [ 00:27:36.389 { 00:27:36.389 "name": "spare", 00:27:36.389 "uuid": "0379fe44-b7ea-5153-aac3-2cee89dca318", 00:27:36.389 "is_configured": true, 00:27:36.389 "data_offset": 256, 00:27:36.389 "data_size": 7936 00:27:36.389 }, 00:27:36.389 { 00:27:36.389 "name": "BaseBdev2", 00:27:36.389 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:36.389 "is_configured": true, 00:27:36.389 "data_offset": 256, 00:27:36.389 "data_size": 7936 00:27:36.389 } 00:27:36.389 ] 00:27:36.389 }' 00:27:36.390 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:36.390 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:36.954 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:36.954 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:36.954 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:36.954 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:36.954 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:36.954 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:27:36.954 00:22:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:37.213 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:37.213 "name": "raid_bdev1", 00:27:37.213 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:37.213 "strip_size_kb": 0, 00:27:37.213 "state": "online", 00:27:37.213 "raid_level": "raid1", 00:27:37.213 "superblock": true, 00:27:37.213 "num_base_bdevs": 2, 00:27:37.213 "num_base_bdevs_discovered": 2, 00:27:37.213 "num_base_bdevs_operational": 2, 00:27:37.213 "base_bdevs_list": [ 00:27:37.213 { 00:27:37.213 "name": "spare", 00:27:37.213 "uuid": "0379fe44-b7ea-5153-aac3-2cee89dca318", 00:27:37.213 "is_configured": true, 00:27:37.213 "data_offset": 256, 00:27:37.213 "data_size": 7936 00:27:37.213 }, 00:27:37.213 { 00:27:37.213 "name": "BaseBdev2", 00:27:37.213 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:37.213 "is_configured": true, 00:27:37.213 "data_offset": 256, 00:27:37.213 "data_size": 7936 00:27:37.213 } 00:27:37.213 ] 00:27:37.213 }' 00:27:37.213 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:37.213 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:37.213 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:37.472 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:37.472 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:37.472 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:27:37.472 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 
00:27:37.472 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:37.731 [2024-07-16 00:22:24.633177] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:37.731 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:37.731 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:37.731 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:37.731 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:37.731 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:37.731 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:37.731 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:37.731 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:37.731 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:37.731 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:37.731 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:37.731 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:37.990 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:37.990 "name": "raid_bdev1", 00:27:37.990 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:37.990 
"strip_size_kb": 0, 00:27:37.990 "state": "online", 00:27:37.990 "raid_level": "raid1", 00:27:37.990 "superblock": true, 00:27:37.990 "num_base_bdevs": 2, 00:27:37.990 "num_base_bdevs_discovered": 1, 00:27:37.990 "num_base_bdevs_operational": 1, 00:27:37.990 "base_bdevs_list": [ 00:27:37.990 { 00:27:37.990 "name": null, 00:27:37.990 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:37.990 "is_configured": false, 00:27:37.990 "data_offset": 256, 00:27:37.990 "data_size": 7936 00:27:37.990 }, 00:27:37.990 { 00:27:37.990 "name": "BaseBdev2", 00:27:37.990 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:37.990 "is_configured": true, 00:27:37.990 "data_offset": 256, 00:27:37.990 "data_size": 7936 00:27:37.990 } 00:27:37.990 ] 00:27:37.990 }' 00:27:37.990 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:37.990 00:22:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:38.559 00:22:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:38.817 [2024-07-16 00:22:25.720075] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:38.817 [2024-07-16 00:22:25.720235] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:38.817 [2024-07-16 00:22:25.720252] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:27:38.817 [2024-07-16 00:22:25.720281] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:38.817 [2024-07-16 00:22:25.725118] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc22f50 00:27:38.817 [2024-07-16 00:22:25.727498] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:38.817 00:22:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:27:40.193 00:22:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:40.193 00:22:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:40.193 00:22:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:40.193 00:22:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:40.193 00:22:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:40.193 00:22:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:40.193 00:22:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:40.193 00:22:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:40.193 "name": "raid_bdev1", 00:27:40.193 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:40.193 "strip_size_kb": 0, 00:27:40.193 "state": "online", 00:27:40.193 "raid_level": "raid1", 00:27:40.193 "superblock": true, 00:27:40.193 "num_base_bdevs": 2, 00:27:40.193 "num_base_bdevs_discovered": 2, 00:27:40.193 "num_base_bdevs_operational": 2, 00:27:40.193 "process": { 00:27:40.193 "type": "rebuild", 00:27:40.193 "target": "spare", 00:27:40.193 "progress": { 00:27:40.193 "blocks": 3072, 
00:27:40.193 "percent": 38 00:27:40.193 } 00:27:40.193 }, 00:27:40.193 "base_bdevs_list": [ 00:27:40.193 { 00:27:40.193 "name": "spare", 00:27:40.193 "uuid": "0379fe44-b7ea-5153-aac3-2cee89dca318", 00:27:40.193 "is_configured": true, 00:27:40.193 "data_offset": 256, 00:27:40.193 "data_size": 7936 00:27:40.193 }, 00:27:40.193 { 00:27:40.193 "name": "BaseBdev2", 00:27:40.193 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:40.193 "is_configured": true, 00:27:40.193 "data_offset": 256, 00:27:40.193 "data_size": 7936 00:27:40.193 } 00:27:40.193 ] 00:27:40.193 }' 00:27:40.193 00:22:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:40.193 00:22:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:40.193 00:22:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:40.193 00:22:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:40.193 00:22:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:40.452 [2024-07-16 00:22:27.305597] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:40.452 [2024-07-16 00:22:27.340196] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:40.452 [2024-07-16 00:22:27.340241] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:40.452 [2024-07-16 00:22:27.340256] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:40.452 [2024-07-16 00:22:27.340265] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:40.452 00:22:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online 
raid1 0 1 00:27:40.452 00:22:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:40.452 00:22:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:40.452 00:22:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:40.452 00:22:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:40.452 00:22:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:40.452 00:22:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:40.452 00:22:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:40.452 00:22:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:40.452 00:22:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:40.452 00:22:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:40.452 00:22:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:40.710 00:22:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:40.710 "name": "raid_bdev1", 00:27:40.710 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:40.710 "strip_size_kb": 0, 00:27:40.710 "state": "online", 00:27:40.710 "raid_level": "raid1", 00:27:40.710 "superblock": true, 00:27:40.710 "num_base_bdevs": 2, 00:27:40.710 "num_base_bdevs_discovered": 1, 00:27:40.710 "num_base_bdevs_operational": 1, 00:27:40.710 "base_bdevs_list": [ 00:27:40.710 { 00:27:40.710 "name": null, 00:27:40.710 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:40.710 "is_configured": false, 00:27:40.710 "data_offset": 
256, 00:27:40.710 "data_size": 7936 00:27:40.710 }, 00:27:40.710 { 00:27:40.710 "name": "BaseBdev2", 00:27:40.710 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:40.710 "is_configured": true, 00:27:40.710 "data_offset": 256, 00:27:40.710 "data_size": 7936 00:27:40.710 } 00:27:40.710 ] 00:27:40.710 }' 00:27:40.710 00:22:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:40.710 00:22:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:41.275 00:22:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:41.533 [2024-07-16 00:22:28.431614] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:41.533 [2024-07-16 00:22:28.431673] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:41.533 [2024-07-16 00:22:28.431698] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc29730 00:27:41.533 [2024-07-16 00:22:28.431711] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:41.533 [2024-07-16 00:22:28.432131] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:41.533 [2024-07-16 00:22:28.432151] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:41.533 [2024-07-16 00:22:28.432238] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:41.533 [2024-07-16 00:22:28.432250] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:41.533 [2024-07-16 00:22:28.432261] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:27:41.533 [2024-07-16 00:22:28.432279] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:41.533 [2024-07-16 00:22:28.437866] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc2aaa0 00:27:41.533 spare 00:27:41.533 [2024-07-16 00:22:28.439389] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:41.533 00:22:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:27:42.909 00:22:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:42.909 00:22:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:42.909 00:22:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:42.909 00:22:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:42.909 00:22:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:42.909 00:22:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:42.909 00:22:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:42.909 00:22:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:42.909 "name": "raid_bdev1", 00:27:42.909 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:42.909 "strip_size_kb": 0, 00:27:42.909 "state": "online", 00:27:42.909 "raid_level": "raid1", 00:27:42.909 "superblock": true, 00:27:42.909 "num_base_bdevs": 2, 00:27:42.909 "num_base_bdevs_discovered": 2, 00:27:42.909 "num_base_bdevs_operational": 2, 00:27:42.909 "process": { 00:27:42.909 "type": "rebuild", 00:27:42.909 "target": "spare", 00:27:42.909 "progress": { 00:27:42.909 
"blocks": 3072, 00:27:42.909 "percent": 38 00:27:42.909 } 00:27:42.909 }, 00:27:42.909 "base_bdevs_list": [ 00:27:42.909 { 00:27:42.909 "name": "spare", 00:27:42.909 "uuid": "0379fe44-b7ea-5153-aac3-2cee89dca318", 00:27:42.909 "is_configured": true, 00:27:42.909 "data_offset": 256, 00:27:42.909 "data_size": 7936 00:27:42.909 }, 00:27:42.909 { 00:27:42.909 "name": "BaseBdev2", 00:27:42.909 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:42.909 "is_configured": true, 00:27:42.909 "data_offset": 256, 00:27:42.909 "data_size": 7936 00:27:42.909 } 00:27:42.909 ] 00:27:42.909 }' 00:27:42.909 00:22:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:42.909 00:22:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:42.909 00:22:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:42.909 00:22:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:42.909 00:22:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:43.168 [2024-07-16 00:22:30.022680] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:43.168 [2024-07-16 00:22:30.052072] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:43.168 [2024-07-16 00:22:30.052121] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:43.168 [2024-07-16 00:22:30.052137] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:43.168 [2024-07-16 00:22:30.052146] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:43.168 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:27:43.168 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:43.168 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:43.168 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:43.168 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:43.168 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:43.168 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:43.168 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:43.168 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:43.168 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:43.168 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:43.168 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:43.426 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:43.426 "name": "raid_bdev1", 00:27:43.426 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:43.426 "strip_size_kb": 0, 00:27:43.426 "state": "online", 00:27:43.426 "raid_level": "raid1", 00:27:43.426 "superblock": true, 00:27:43.426 "num_base_bdevs": 2, 00:27:43.426 "num_base_bdevs_discovered": 1, 00:27:43.426 "num_base_bdevs_operational": 1, 00:27:43.426 "base_bdevs_list": [ 00:27:43.426 { 00:27:43.426 "name": null, 00:27:43.426 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:43.426 "is_configured": false, 00:27:43.426 
"data_offset": 256, 00:27:43.426 "data_size": 7936 00:27:43.426 }, 00:27:43.426 { 00:27:43.426 "name": "BaseBdev2", 00:27:43.426 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:43.426 "is_configured": true, 00:27:43.426 "data_offset": 256, 00:27:43.426 "data_size": 7936 00:27:43.426 } 00:27:43.426 ] 00:27:43.426 }' 00:27:43.426 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:43.426 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:43.992 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:43.992 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:43.992 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:43.992 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:43.993 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:43.993 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:43.993 00:22:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:44.251 00:22:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:44.251 "name": "raid_bdev1", 00:27:44.251 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:44.251 "strip_size_kb": 0, 00:27:44.251 "state": "online", 00:27:44.251 "raid_level": "raid1", 00:27:44.251 "superblock": true, 00:27:44.251 "num_base_bdevs": 2, 00:27:44.251 "num_base_bdevs_discovered": 1, 00:27:44.251 "num_base_bdevs_operational": 1, 00:27:44.251 "base_bdevs_list": [ 00:27:44.251 { 00:27:44.251 "name": null, 00:27:44.251 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:27:44.251 "is_configured": false, 00:27:44.251 "data_offset": 256, 00:27:44.251 "data_size": 7936 00:27:44.251 }, 00:27:44.251 { 00:27:44.251 "name": "BaseBdev2", 00:27:44.251 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:44.251 "is_configured": true, 00:27:44.251 "data_offset": 256, 00:27:44.251 "data_size": 7936 00:27:44.251 } 00:27:44.251 ] 00:27:44.251 }' 00:27:44.252 00:22:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:44.510 00:22:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:44.510 00:22:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:44.510 00:22:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:44.510 00:22:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:44.769 00:22:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:45.027 [2024-07-16 00:22:31.745294] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:45.027 [2024-07-16 00:22:31.745346] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:45.027 [2024-07-16 00:22:31.745368] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc24650 00:27:45.027 [2024-07-16 00:22:31.745381] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:45.027 [2024-07-16 00:22:31.745742] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:45.027 [2024-07-16 00:22:31.745761] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: BaseBdev1 00:27:45.027 [2024-07-16 00:22:31.745828] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:45.027 [2024-07-16 00:22:31.745840] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:45.027 [2024-07-16 00:22:31.745850] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:45.027 BaseBdev1 00:27:45.027 00:22:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:27:46.020 00:22:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:46.020 00:22:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:46.020 00:22:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:46.020 00:22:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:46.020 00:22:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:46.020 00:22:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:46.020 00:22:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:46.020 00:22:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:46.020 00:22:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:46.020 00:22:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:46.020 00:22:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:46.020 00:22:32 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:46.277 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:46.277 "name": "raid_bdev1", 00:27:46.277 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:46.277 "strip_size_kb": 0, 00:27:46.277 "state": "online", 00:27:46.277 "raid_level": "raid1", 00:27:46.277 "superblock": true, 00:27:46.277 "num_base_bdevs": 2, 00:27:46.277 "num_base_bdevs_discovered": 1, 00:27:46.277 "num_base_bdevs_operational": 1, 00:27:46.277 "base_bdevs_list": [ 00:27:46.277 { 00:27:46.277 "name": null, 00:27:46.277 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:46.277 "is_configured": false, 00:27:46.277 "data_offset": 256, 00:27:46.277 "data_size": 7936 00:27:46.277 }, 00:27:46.277 { 00:27:46.277 "name": "BaseBdev2", 00:27:46.277 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:46.277 "is_configured": true, 00:27:46.277 "data_offset": 256, 00:27:46.277 "data_size": 7936 00:27:46.277 } 00:27:46.277 ] 00:27:46.277 }' 00:27:46.277 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:46.277 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:46.841 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:46.841 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:46.841 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:46.841 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:46.841 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:46.841 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:46.841 
00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:47.098 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:47.098 "name": "raid_bdev1", 00:27:47.098 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:47.098 "strip_size_kb": 0, 00:27:47.098 "state": "online", 00:27:47.098 "raid_level": "raid1", 00:27:47.098 "superblock": true, 00:27:47.098 "num_base_bdevs": 2, 00:27:47.098 "num_base_bdevs_discovered": 1, 00:27:47.098 "num_base_bdevs_operational": 1, 00:27:47.098 "base_bdevs_list": [ 00:27:47.098 { 00:27:47.098 "name": null, 00:27:47.098 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:47.098 "is_configured": false, 00:27:47.098 "data_offset": 256, 00:27:47.098 "data_size": 7936 00:27:47.098 }, 00:27:47.098 { 00:27:47.098 "name": "BaseBdev2", 00:27:47.098 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:47.098 "is_configured": true, 00:27:47.098 "data_offset": 256, 00:27:47.098 "data_size": 7936 00:27:47.098 } 00:27:47.098 ] 00:27:47.098 }' 00:27:47.098 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:47.098 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:47.098 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:47.098 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:47.098 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:47.098 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:27:47.098 00:22:33 bdev_raid.raid_rebuild_test_sb_4k 
-- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:47.098 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:47.098 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:47.098 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:47.098 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:47.098 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:47.098 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:47.098 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:47.098 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:47.098 00:22:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:47.356 [2024-07-16 00:22:34.203822] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:47.356 [2024-07-16 00:22:34.203965] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:47.356 [2024-07-16 00:22:34.203982] bdev_raid.c:3581:raid_bdev_examine_sb: 
*DEBUG*: raid superblock does not contain this bdev's uuid 00:27:47.356 request: 00:27:47.356 { 00:27:47.356 "base_bdev": "BaseBdev1", 00:27:47.356 "raid_bdev": "raid_bdev1", 00:27:47.356 "method": "bdev_raid_add_base_bdev", 00:27:47.356 "req_id": 1 00:27:47.356 } 00:27:47.356 Got JSON-RPC error response 00:27:47.356 response: 00:27:47.356 { 00:27:47.356 "code": -22, 00:27:47.356 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:47.356 } 00:27:47.356 00:22:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:27:47.356 00:22:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:47.356 00:22:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:47.356 00:22:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:47.356 00:22:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:27:48.304 00:22:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:48.304 00:22:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:48.304 00:22:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:48.304 00:22:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:48.304 00:22:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:48.304 00:22:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:48.304 00:22:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:48.304 00:22:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:48.304 00:22:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:27:48.304 00:22:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:48.304 00:22:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:48.304 00:22:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:48.562 00:22:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:48.562 "name": "raid_bdev1", 00:27:48.562 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:48.562 "strip_size_kb": 0, 00:27:48.562 "state": "online", 00:27:48.562 "raid_level": "raid1", 00:27:48.562 "superblock": true, 00:27:48.562 "num_base_bdevs": 2, 00:27:48.562 "num_base_bdevs_discovered": 1, 00:27:48.562 "num_base_bdevs_operational": 1, 00:27:48.562 "base_bdevs_list": [ 00:27:48.562 { 00:27:48.562 "name": null, 00:27:48.562 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:48.562 "is_configured": false, 00:27:48.562 "data_offset": 256, 00:27:48.562 "data_size": 7936 00:27:48.562 }, 00:27:48.562 { 00:27:48.562 "name": "BaseBdev2", 00:27:48.562 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:48.562 "is_configured": true, 00:27:48.562 "data_offset": 256, 00:27:48.562 "data_size": 7936 00:27:48.562 } 00:27:48.562 ] 00:27:48.562 }' 00:27:48.562 00:22:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:48.562 00:22:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:49.494 
00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:49.494 "name": "raid_bdev1", 00:27:49.494 "uuid": "aeea315d-efc3-46a1-8871-18fc4ad683ac", 00:27:49.494 "strip_size_kb": 0, 00:27:49.494 "state": "online", 00:27:49.494 "raid_level": "raid1", 00:27:49.494 "superblock": true, 00:27:49.494 "num_base_bdevs": 2, 00:27:49.494 "num_base_bdevs_discovered": 1, 00:27:49.494 "num_base_bdevs_operational": 1, 00:27:49.494 "base_bdevs_list": [ 00:27:49.494 { 00:27:49.494 "name": null, 00:27:49.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:49.494 "is_configured": false, 00:27:49.494 "data_offset": 256, 00:27:49.494 "data_size": 7936 00:27:49.494 }, 00:27:49.494 { 00:27:49.494 "name": "BaseBdev2", 00:27:49.494 "uuid": "069ca0f8-878f-539b-8a17-fd75ebf1e75c", 00:27:49.494 "is_configured": true, 00:27:49.494 "data_offset": 256, 00:27:49.494 "data_size": 7936 00:27:49.494 } 00:27:49.494 ] 00:27:49.494 }' 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:49.494 00:22:36 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 3634778 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 3634778 ']' 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 3634778 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3634778 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3634778' 00:27:49.494 killing process with pid 3634778 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 3634778 00:27:49.494 Received shutdown signal, test time was about 60.000000 seconds 00:27:49.494 00:27:49.494 Latency(us) 00:27:49.494 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:49.494 =================================================================================================================== 00:27:49.494 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:27:49.494 [2024-07-16 00:22:36.419894] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:49.494 [2024-07-16 00:22:36.419995] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:49.494 [2024-07-16 00:22:36.420038] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:49.494 [2024-07-16 00:22:36.420052] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc28260 name raid_bdev1, state offline 00:27:49.494 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 3634778 00:27:49.752 [2024-07-16 00:22:36.447498] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:49.752 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:27:49.752 00:27:49.752 real 0m31.437s 00:27:49.752 user 0m48.934s 00:27:49.752 sys 0m5.166s 00:27:49.753 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:49.753 00:22:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:49.753 ************************************ 00:27:49.753 END TEST raid_rebuild_test_sb_4k 00:27:49.753 ************************************ 00:27:50.010 00:22:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:50.010 00:22:36 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:27:50.010 00:22:36 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:27:50.010 00:22:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:27:50.010 00:22:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:50.010 00:22:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:50.010 ************************************ 00:27:50.010 START TEST raid_state_function_test_sb_md_separate 00:27:50.010 ************************************ 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:27:50.010 
00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:50.010 00:22:36 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=3639268 00:27:50.010 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3639268' 00:27:50.010 Process raid pid: 3639268 00:27:50.011 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:50.011 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 3639268 /var/tmp/spdk-raid.sock 00:27:50.011 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 3639268 ']' 00:27:50.011 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:50.011 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:50.011 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:50.011 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:27:50.011 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:50.011 00:22:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:50.011 [2024-07-16 00:22:36.830494] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:27:50.011 [2024-07-16 00:22:36.830568] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:50.268 [2024-07-16 00:22:36.962703] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:50.268 [2024-07-16 00:22:37.068961] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:50.268 [2024-07-16 00:22:37.131086] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:50.268 [2024-07-16 00:22:37.131114] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:50.834 00:22:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:50.834 00:22:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:27:50.834 00:22:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:51.092 [2024-07-16 00:22:37.993861] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:51.092 [2024-07-16 00:22:37.993905] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:51.092 [2024-07-16 00:22:37.993916] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:51.092 
[2024-07-16 00:22:37.993935] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:51.092 00:22:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:51.092 00:22:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:51.092 00:22:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:51.092 00:22:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:51.092 00:22:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:51.092 00:22:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:51.092 00:22:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:51.092 00:22:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:51.092 00:22:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:51.092 00:22:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:51.092 00:22:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:51.092 00:22:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:51.350 00:22:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:51.350 "name": "Existed_Raid", 00:27:51.350 "uuid": 
"226b45cc-875d-4d07-80b6-cb50a3b0aba8", 00:27:51.350 "strip_size_kb": 0, 00:27:51.350 "state": "configuring", 00:27:51.350 "raid_level": "raid1", 00:27:51.350 "superblock": true, 00:27:51.350 "num_base_bdevs": 2, 00:27:51.350 "num_base_bdevs_discovered": 0, 00:27:51.350 "num_base_bdevs_operational": 2, 00:27:51.351 "base_bdevs_list": [ 00:27:51.351 { 00:27:51.351 "name": "BaseBdev1", 00:27:51.351 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:51.351 "is_configured": false, 00:27:51.351 "data_offset": 0, 00:27:51.351 "data_size": 0 00:27:51.351 }, 00:27:51.351 { 00:27:51.351 "name": "BaseBdev2", 00:27:51.351 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:51.351 "is_configured": false, 00:27:51.351 "data_offset": 0, 00:27:51.351 "data_size": 0 00:27:51.351 } 00:27:51.351 ] 00:27:51.351 }' 00:27:51.351 00:22:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:51.351 00:22:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:52.284 00:22:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:52.284 [2024-07-16 00:22:39.032471] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:52.284 [2024-07-16 00:22:39.032508] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x165fa80 name Existed_Raid, state configuring 00:27:52.284 00:22:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:52.543 [2024-07-16 00:22:39.297183] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:52.543 [2024-07-16 00:22:39.297218] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:52.543 [2024-07-16 00:22:39.297228] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:52.543 [2024-07-16 00:22:39.297240] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:52.543 00:22:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:27:52.801 [2024-07-16 00:22:39.568491] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:52.801 BaseBdev1 00:27:52.801 00:22:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:27:52.801 00:22:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:27:52.801 00:22:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:52.801 00:22:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:27:52.801 00:22:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:52.801 00:22:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:52.801 00:22:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:53.059 00:22:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:53.317 [ 00:27:53.317 { 00:27:53.317 "name": 
"BaseBdev1", 00:27:53.317 "aliases": [ 00:27:53.317 "84a91f16-e32e-4b02-a612-72db562efdbe" 00:27:53.317 ], 00:27:53.317 "product_name": "Malloc disk", 00:27:53.317 "block_size": 4096, 00:27:53.317 "num_blocks": 8192, 00:27:53.317 "uuid": "84a91f16-e32e-4b02-a612-72db562efdbe", 00:27:53.317 "md_size": 32, 00:27:53.317 "md_interleave": false, 00:27:53.317 "dif_type": 0, 00:27:53.317 "assigned_rate_limits": { 00:27:53.317 "rw_ios_per_sec": 0, 00:27:53.317 "rw_mbytes_per_sec": 0, 00:27:53.317 "r_mbytes_per_sec": 0, 00:27:53.317 "w_mbytes_per_sec": 0 00:27:53.317 }, 00:27:53.317 "claimed": true, 00:27:53.317 "claim_type": "exclusive_write", 00:27:53.317 "zoned": false, 00:27:53.317 "supported_io_types": { 00:27:53.317 "read": true, 00:27:53.317 "write": true, 00:27:53.317 "unmap": true, 00:27:53.317 "flush": true, 00:27:53.317 "reset": true, 00:27:53.317 "nvme_admin": false, 00:27:53.317 "nvme_io": false, 00:27:53.317 "nvme_io_md": false, 00:27:53.317 "write_zeroes": true, 00:27:53.317 "zcopy": true, 00:27:53.317 "get_zone_info": false, 00:27:53.317 "zone_management": false, 00:27:53.317 "zone_append": false, 00:27:53.317 "compare": false, 00:27:53.317 "compare_and_write": false, 00:27:53.317 "abort": true, 00:27:53.317 "seek_hole": false, 00:27:53.317 "seek_data": false, 00:27:53.317 "copy": true, 00:27:53.317 "nvme_iov_md": false 00:27:53.317 }, 00:27:53.317 "memory_domains": [ 00:27:53.317 { 00:27:53.317 "dma_device_id": "system", 00:27:53.317 "dma_device_type": 1 00:27:53.317 }, 00:27:53.317 { 00:27:53.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:53.317 "dma_device_type": 2 00:27:53.317 } 00:27:53.317 ], 00:27:53.317 "driver_specific": {} 00:27:53.317 } 00:27:53.317 ] 00:27:53.317 00:22:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:27:53.317 00:22:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:53.317 
00:22:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:53.317 00:22:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:53.317 00:22:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:53.317 00:22:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:53.317 00:22:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:53.317 00:22:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:53.317 00:22:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:53.317 00:22:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:53.317 00:22:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:53.317 00:22:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:53.317 00:22:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:53.575 00:22:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:53.575 "name": "Existed_Raid", 00:27:53.575 "uuid": "93b0f251-9358-4503-bdd4-428c1a4daa02", 00:27:53.575 "strip_size_kb": 0, 00:27:53.575 "state": "configuring", 00:27:53.575 "raid_level": "raid1", 00:27:53.575 "superblock": true, 00:27:53.575 "num_base_bdevs": 2, 00:27:53.575 "num_base_bdevs_discovered": 1, 00:27:53.575 "num_base_bdevs_operational": 2, 00:27:53.575 
"base_bdevs_list": [ 00:27:53.575 { 00:27:53.575 "name": "BaseBdev1", 00:27:53.575 "uuid": "84a91f16-e32e-4b02-a612-72db562efdbe", 00:27:53.575 "is_configured": true, 00:27:53.575 "data_offset": 256, 00:27:53.575 "data_size": 7936 00:27:53.575 }, 00:27:53.575 { 00:27:53.575 "name": "BaseBdev2", 00:27:53.575 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:53.575 "is_configured": false, 00:27:53.575 "data_offset": 0, 00:27:53.575 "data_size": 0 00:27:53.575 } 00:27:53.575 ] 00:27:53.575 }' 00:27:53.575 00:22:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:53.575 00:22:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:54.141 00:22:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:54.398 [2024-07-16 00:22:41.172795] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:54.398 [2024-07-16 00:22:41.172839] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x165f350 name Existed_Raid, state configuring 00:27:54.398 00:22:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:54.655 [2024-07-16 00:22:41.437533] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:54.655 [2024-07-16 00:22:41.438994] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:54.655 [2024-07-16 00:22:41.439028] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:54.655 00:22:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:54.655 
00:22:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:54.655 00:22:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:54.655 00:22:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:54.655 00:22:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:54.655 00:22:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:54.656 00:22:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:54.656 00:22:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:54.656 00:22:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:54.656 00:22:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:54.656 00:22:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:54.656 00:22:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:54.656 00:22:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:54.656 00:22:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:54.913 00:22:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:54.913 "name": "Existed_Raid", 00:27:54.913 "uuid": 
"f612369b-6c4c-49d4-8939-da669526c686", 00:27:54.913 "strip_size_kb": 0, 00:27:54.913 "state": "configuring", 00:27:54.913 "raid_level": "raid1", 00:27:54.913 "superblock": true, 00:27:54.913 "num_base_bdevs": 2, 00:27:54.913 "num_base_bdevs_discovered": 1, 00:27:54.913 "num_base_bdevs_operational": 2, 00:27:54.913 "base_bdevs_list": [ 00:27:54.913 { 00:27:54.913 "name": "BaseBdev1", 00:27:54.913 "uuid": "84a91f16-e32e-4b02-a612-72db562efdbe", 00:27:54.913 "is_configured": true, 00:27:54.913 "data_offset": 256, 00:27:54.913 "data_size": 7936 00:27:54.913 }, 00:27:54.913 { 00:27:54.913 "name": "BaseBdev2", 00:27:54.913 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:54.913 "is_configured": false, 00:27:54.913 "data_offset": 0, 00:27:54.913 "data_size": 0 00:27:54.913 } 00:27:54.913 ] 00:27:54.913 }' 00:27:54.913 00:22:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:54.913 00:22:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:55.479 00:22:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:27:55.738 [2024-07-16 00:22:42.508513] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:55.738 [2024-07-16 00:22:42.508667] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1661210 00:27:55.738 [2024-07-16 00:22:42.508681] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:55.738 [2024-07-16 00:22:42.508746] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1660c50 00:27:55.738 [2024-07-16 00:22:42.508844] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1661210 00:27:55.738 [2024-07-16 00:22:42.508854] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid 
bdev is created with name Existed_Raid, raid_bdev 0x1661210 00:27:55.738 [2024-07-16 00:22:42.508935] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:55.738 BaseBdev2 00:27:55.739 00:22:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:55.739 00:22:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:27:55.739 00:22:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:55.739 00:22:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:27:55.739 00:22:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:55.739 00:22:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:55.739 00:22:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:55.996 00:22:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:56.254 [ 00:27:56.254 { 00:27:56.254 "name": "BaseBdev2", 00:27:56.254 "aliases": [ 00:27:56.254 "d52e0a9d-055e-48ba-8feb-58711a26836d" 00:27:56.254 ], 00:27:56.254 "product_name": "Malloc disk", 00:27:56.254 "block_size": 4096, 00:27:56.254 "num_blocks": 8192, 00:27:56.254 "uuid": "d52e0a9d-055e-48ba-8feb-58711a26836d", 00:27:56.254 "md_size": 32, 00:27:56.254 "md_interleave": false, 00:27:56.254 "dif_type": 0, 00:27:56.254 "assigned_rate_limits": { 00:27:56.254 "rw_ios_per_sec": 0, 00:27:56.254 "rw_mbytes_per_sec": 0, 00:27:56.254 "r_mbytes_per_sec": 0, 00:27:56.254 
"w_mbytes_per_sec": 0 00:27:56.254 }, 00:27:56.254 "claimed": true, 00:27:56.254 "claim_type": "exclusive_write", 00:27:56.254 "zoned": false, 00:27:56.254 "supported_io_types": { 00:27:56.254 "read": true, 00:27:56.254 "write": true, 00:27:56.254 "unmap": true, 00:27:56.254 "flush": true, 00:27:56.254 "reset": true, 00:27:56.254 "nvme_admin": false, 00:27:56.254 "nvme_io": false, 00:27:56.254 "nvme_io_md": false, 00:27:56.254 "write_zeroes": true, 00:27:56.254 "zcopy": true, 00:27:56.254 "get_zone_info": false, 00:27:56.254 "zone_management": false, 00:27:56.254 "zone_append": false, 00:27:56.254 "compare": false, 00:27:56.254 "compare_and_write": false, 00:27:56.254 "abort": true, 00:27:56.254 "seek_hole": false, 00:27:56.254 "seek_data": false, 00:27:56.254 "copy": true, 00:27:56.254 "nvme_iov_md": false 00:27:56.254 }, 00:27:56.254 "memory_domains": [ 00:27:56.254 { 00:27:56.254 "dma_device_id": "system", 00:27:56.254 "dma_device_type": 1 00:27:56.254 }, 00:27:56.254 { 00:27:56.254 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:56.254 "dma_device_type": 2 00:27:56.254 } 00:27:56.254 ], 00:27:56.254 "driver_specific": {} 00:27:56.254 } 00:27:56.254 ] 00:27:56.254 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:27:56.254 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:56.254 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:56.254 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:27:56.254 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:56.254 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:56.254 00:22:43 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:56.254 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:56.254 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:56.254 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:56.254 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:56.254 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:56.254 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:56.254 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:56.254 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:56.521 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:56.521 "name": "Existed_Raid", 00:27:56.521 "uuid": "f612369b-6c4c-49d4-8939-da669526c686", 00:27:56.521 "strip_size_kb": 0, 00:27:56.521 "state": "online", 00:27:56.521 "raid_level": "raid1", 00:27:56.521 "superblock": true, 00:27:56.521 "num_base_bdevs": 2, 00:27:56.521 "num_base_bdevs_discovered": 2, 00:27:56.521 "num_base_bdevs_operational": 2, 00:27:56.521 "base_bdevs_list": [ 00:27:56.521 { 00:27:56.521 "name": "BaseBdev1", 00:27:56.521 "uuid": "84a91f16-e32e-4b02-a612-72db562efdbe", 00:27:56.521 "is_configured": true, 00:27:56.521 "data_offset": 256, 00:27:56.521 "data_size": 7936 00:27:56.521 }, 00:27:56.521 { 00:27:56.521 "name": 
"BaseBdev2", 00:27:56.521 "uuid": "d52e0a9d-055e-48ba-8feb-58711a26836d", 00:27:56.521 "is_configured": true, 00:27:56.521 "data_offset": 256, 00:27:56.521 "data_size": 7936 00:27:56.521 } 00:27:56.521 ] 00:27:56.521 }' 00:27:56.521 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:56.521 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:57.093 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:57.094 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:27:57.094 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:57.094 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:57.094 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:57.094 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:27:57.094 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:57.094 00:22:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:57.352 [2024-07-16 00:22:44.117250] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:57.352 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:57.352 "name": "Existed_Raid", 00:27:57.352 "aliases": [ 00:27:57.352 "f612369b-6c4c-49d4-8939-da669526c686" 00:27:57.352 ], 00:27:57.352 "product_name": "Raid Volume", 00:27:57.352 "block_size": 4096, 
00:27:57.352 "num_blocks": 7936, 00:27:57.352 "uuid": "f612369b-6c4c-49d4-8939-da669526c686", 00:27:57.352 "md_size": 32, 00:27:57.352 "md_interleave": false, 00:27:57.352 "dif_type": 0, 00:27:57.352 "assigned_rate_limits": { 00:27:57.352 "rw_ios_per_sec": 0, 00:27:57.352 "rw_mbytes_per_sec": 0, 00:27:57.352 "r_mbytes_per_sec": 0, 00:27:57.352 "w_mbytes_per_sec": 0 00:27:57.352 }, 00:27:57.352 "claimed": false, 00:27:57.352 "zoned": false, 00:27:57.352 "supported_io_types": { 00:27:57.352 "read": true, 00:27:57.352 "write": true, 00:27:57.352 "unmap": false, 00:27:57.352 "flush": false, 00:27:57.352 "reset": true, 00:27:57.352 "nvme_admin": false, 00:27:57.352 "nvme_io": false, 00:27:57.352 "nvme_io_md": false, 00:27:57.352 "write_zeroes": true, 00:27:57.352 "zcopy": false, 00:27:57.352 "get_zone_info": false, 00:27:57.352 "zone_management": false, 00:27:57.352 "zone_append": false, 00:27:57.352 "compare": false, 00:27:57.352 "compare_and_write": false, 00:27:57.352 "abort": false, 00:27:57.352 "seek_hole": false, 00:27:57.352 "seek_data": false, 00:27:57.352 "copy": false, 00:27:57.352 "nvme_iov_md": false 00:27:57.352 }, 00:27:57.352 "memory_domains": [ 00:27:57.352 { 00:27:57.352 "dma_device_id": "system", 00:27:57.352 "dma_device_type": 1 00:27:57.352 }, 00:27:57.352 { 00:27:57.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:57.352 "dma_device_type": 2 00:27:57.352 }, 00:27:57.352 { 00:27:57.352 "dma_device_id": "system", 00:27:57.352 "dma_device_type": 1 00:27:57.352 }, 00:27:57.352 { 00:27:57.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:57.352 "dma_device_type": 2 00:27:57.352 } 00:27:57.352 ], 00:27:57.352 "driver_specific": { 00:27:57.352 "raid": { 00:27:57.352 "uuid": "f612369b-6c4c-49d4-8939-da669526c686", 00:27:57.352 "strip_size_kb": 0, 00:27:57.352 "state": "online", 00:27:57.352 "raid_level": "raid1", 00:27:57.352 "superblock": true, 00:27:57.352 "num_base_bdevs": 2, 00:27:57.352 "num_base_bdevs_discovered": 2, 00:27:57.352 
"num_base_bdevs_operational": 2, 00:27:57.352 "base_bdevs_list": [ 00:27:57.352 { 00:27:57.352 "name": "BaseBdev1", 00:27:57.352 "uuid": "84a91f16-e32e-4b02-a612-72db562efdbe", 00:27:57.352 "is_configured": true, 00:27:57.352 "data_offset": 256, 00:27:57.352 "data_size": 7936 00:27:57.352 }, 00:27:57.352 { 00:27:57.352 "name": "BaseBdev2", 00:27:57.352 "uuid": "d52e0a9d-055e-48ba-8feb-58711a26836d", 00:27:57.352 "is_configured": true, 00:27:57.352 "data_offset": 256, 00:27:57.352 "data_size": 7936 00:27:57.352 } 00:27:57.352 ] 00:27:57.352 } 00:27:57.352 } 00:27:57.352 }' 00:27:57.352 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:57.352 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:57.352 BaseBdev2' 00:27:57.352 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:57.352 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:57.352 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:57.611 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:57.611 "name": "BaseBdev1", 00:27:57.611 "aliases": [ 00:27:57.611 "84a91f16-e32e-4b02-a612-72db562efdbe" 00:27:57.611 ], 00:27:57.611 "product_name": "Malloc disk", 00:27:57.611 "block_size": 4096, 00:27:57.611 "num_blocks": 8192, 00:27:57.611 "uuid": "84a91f16-e32e-4b02-a612-72db562efdbe", 00:27:57.611 "md_size": 32, 00:27:57.611 "md_interleave": false, 00:27:57.611 "dif_type": 0, 00:27:57.611 "assigned_rate_limits": { 00:27:57.611 "rw_ios_per_sec": 0, 00:27:57.611 
"rw_mbytes_per_sec": 0, 00:27:57.611 "r_mbytes_per_sec": 0, 00:27:57.611 "w_mbytes_per_sec": 0 00:27:57.611 }, 00:27:57.611 "claimed": true, 00:27:57.611 "claim_type": "exclusive_write", 00:27:57.611 "zoned": false, 00:27:57.611 "supported_io_types": { 00:27:57.611 "read": true, 00:27:57.611 "write": true, 00:27:57.611 "unmap": true, 00:27:57.611 "flush": true, 00:27:57.611 "reset": true, 00:27:57.611 "nvme_admin": false, 00:27:57.611 "nvme_io": false, 00:27:57.611 "nvme_io_md": false, 00:27:57.611 "write_zeroes": true, 00:27:57.611 "zcopy": true, 00:27:57.611 "get_zone_info": false, 00:27:57.611 "zone_management": false, 00:27:57.611 "zone_append": false, 00:27:57.611 "compare": false, 00:27:57.611 "compare_and_write": false, 00:27:57.611 "abort": true, 00:27:57.611 "seek_hole": false, 00:27:57.611 "seek_data": false, 00:27:57.611 "copy": true, 00:27:57.611 "nvme_iov_md": false 00:27:57.611 }, 00:27:57.611 "memory_domains": [ 00:27:57.611 { 00:27:57.611 "dma_device_id": "system", 00:27:57.611 "dma_device_type": 1 00:27:57.611 }, 00:27:57.611 { 00:27:57.611 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:57.611 "dma_device_type": 2 00:27:57.611 } 00:27:57.611 ], 00:27:57.611 "driver_specific": {} 00:27:57.611 }' 00:27:57.611 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:57.611 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:57.611 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:57.611 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:57.869 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:57.869 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:57.869 00:22:44 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:57.869 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:57.869 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:57.869 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:57.869 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:57.869 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:57.869 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:57.869 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:57.869 00:22:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:58.128 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:58.128 "name": "BaseBdev2", 00:27:58.128 "aliases": [ 00:27:58.128 "d52e0a9d-055e-48ba-8feb-58711a26836d" 00:27:58.128 ], 00:27:58.128 "product_name": "Malloc disk", 00:27:58.128 "block_size": 4096, 00:27:58.128 "num_blocks": 8192, 00:27:58.128 "uuid": "d52e0a9d-055e-48ba-8feb-58711a26836d", 00:27:58.128 "md_size": 32, 00:27:58.128 "md_interleave": false, 00:27:58.128 "dif_type": 0, 00:27:58.128 "assigned_rate_limits": { 00:27:58.128 "rw_ios_per_sec": 0, 00:27:58.128 "rw_mbytes_per_sec": 0, 00:27:58.128 "r_mbytes_per_sec": 0, 00:27:58.128 "w_mbytes_per_sec": 0 00:27:58.128 }, 00:27:58.128 "claimed": true, 00:27:58.128 "claim_type": "exclusive_write", 00:27:58.128 "zoned": false, 00:27:58.128 "supported_io_types": { 
00:27:58.128 "read": true, 00:27:58.128 "write": true, 00:27:58.128 "unmap": true, 00:27:58.128 "flush": true, 00:27:58.128 "reset": true, 00:27:58.128 "nvme_admin": false, 00:27:58.128 "nvme_io": false, 00:27:58.128 "nvme_io_md": false, 00:27:58.128 "write_zeroes": true, 00:27:58.128 "zcopy": true, 00:27:58.128 "get_zone_info": false, 00:27:58.128 "zone_management": false, 00:27:58.128 "zone_append": false, 00:27:58.128 "compare": false, 00:27:58.128 "compare_and_write": false, 00:27:58.128 "abort": true, 00:27:58.128 "seek_hole": false, 00:27:58.128 "seek_data": false, 00:27:58.128 "copy": true, 00:27:58.128 "nvme_iov_md": false 00:27:58.128 }, 00:27:58.128 "memory_domains": [ 00:27:58.128 { 00:27:58.128 "dma_device_id": "system", 00:27:58.128 "dma_device_type": 1 00:27:58.128 }, 00:27:58.128 { 00:27:58.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:58.128 "dma_device_type": 2 00:27:58.128 } 00:27:58.128 ], 00:27:58.128 "driver_specific": {} 00:27:58.128 }' 00:27:58.128 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:58.387 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:58.387 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:58.387 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:58.387 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:58.387 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:58.387 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:58.387 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:58.387 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:58.387 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:58.646 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:58.646 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:58.646 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:58.905 [2024-07-16 00:22:45.649086] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:58.905 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:58.905 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:27:58.905 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:58.905 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:27:58.905 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:27:58.905 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:27:58.905 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:58.905 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:58.905 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:58.905 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:58.905 
00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:58.905 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:58.905 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:58.905 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:58.905 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:58.905 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:58.905 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:59.163 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:59.163 "name": "Existed_Raid", 00:27:59.163 "uuid": "f612369b-6c4c-49d4-8939-da669526c686", 00:27:59.163 "strip_size_kb": 0, 00:27:59.163 "state": "online", 00:27:59.163 "raid_level": "raid1", 00:27:59.163 "superblock": true, 00:27:59.163 "num_base_bdevs": 2, 00:27:59.163 "num_base_bdevs_discovered": 1, 00:27:59.163 "num_base_bdevs_operational": 1, 00:27:59.163 "base_bdevs_list": [ 00:27:59.163 { 00:27:59.163 "name": null, 00:27:59.163 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:59.163 "is_configured": false, 00:27:59.163 "data_offset": 256, 00:27:59.163 "data_size": 7936 00:27:59.163 }, 00:27:59.163 { 00:27:59.163 "name": "BaseBdev2", 00:27:59.163 "uuid": "d52e0a9d-055e-48ba-8feb-58711a26836d", 00:27:59.163 "is_configured": true, 00:27:59.163 "data_offset": 256, 00:27:59.163 "data_size": 7936 00:27:59.163 } 00:27:59.163 ] 00:27:59.163 }' 00:27:59.163 00:22:45 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:59.163 00:22:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:59.731 00:22:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:59.731 00:22:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:59.731 00:22:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:59.731 00:22:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:59.990 00:22:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:59.990 00:22:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:59.990 00:22:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:28:00.248 [2024-07-16 00:22:46.993301] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:28:00.248 [2024-07-16 00:22:46.993392] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:00.248 [2024-07-16 00:22:47.006914] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:00.248 [2024-07-16 00:22:47.006961] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:00.248 [2024-07-16 00:22:47.006973] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1661210 name Existed_Raid, state offline 00:28:00.248 00:22:47 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:28:00.248 00:22:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:00.248 00:22:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.248 00:22:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:28:00.506 00:22:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:28:00.506 00:22:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:28:00.506 00:22:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:28:00.506 00:22:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 3639268 00:28:00.506 00:22:47 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 3639268 ']' 00:28:00.506 00:22:47 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 3639268 00:28:00.506 00:22:47 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:28:00.507 00:22:47 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:00.507 00:22:47 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3639268 00:28:00.507 00:22:47 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:00.507 00:22:47 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:00.507 00:22:47 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3639268' 00:28:00.507 killing process with pid 3639268 00:28:00.507 00:22:47 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 3639268 00:28:00.507 [2024-07-16 00:22:47.323836] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:00.507 00:22:47 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 3639268 00:28:00.507 [2024-07-16 00:22:47.324837] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:00.765 00:22:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:28:00.765 00:28:00.765 real 0m10.794s 00:28:00.765 user 0m19.093s 00:28:00.765 sys 0m2.076s 00:28:00.765 00:22:47 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:00.765 00:22:47 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:00.765 ************************************ 00:28:00.765 END TEST raid_state_function_test_sb_md_separate 00:28:00.765 ************************************ 00:28:00.765 00:22:47 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:00.765 00:22:47 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:28:00.765 00:22:47 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:28:00.765 00:22:47 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:00.765 00:22:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:00.766 ************************************ 00:28:00.766 START TEST raid_superblock_test_md_separate 00:28:00.766 ************************************ 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 
00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=3640890 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 3640890 
/var/tmp/spdk-raid.sock 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 3640890 ']' 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:00.766 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:00.766 00:22:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:00.766 [2024-07-16 00:22:47.703933] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:28:00.766 [2024-07-16 00:22:47.704001] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3640890 ] 00:28:01.025 [2024-07-16 00:22:47.832150] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:01.025 [2024-07-16 00:22:47.937388] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:01.284 [2024-07-16 00:22:48.002034] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:01.284 [2024-07-16 00:22:48.002077] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:01.850 00:22:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:01.850 00:22:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:28:01.850 00:22:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:28:01.850 00:22:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:01.850 00:22:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:28:01.850 00:22:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:28:01.850 00:22:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:28:01.850 00:22:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:01.850 00:22:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:01.850 00:22:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:01.850 00:22:48 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:28:02.108 malloc1 00:28:02.108 00:22:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:02.366 [2024-07-16 00:22:49.123364] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:02.366 [2024-07-16 00:22:49.123412] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:02.366 [2024-07-16 00:22:49.123439] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf5d830 00:28:02.366 [2024-07-16 00:22:49.123453] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:02.366 [2024-07-16 00:22:49.125041] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:02.366 [2024-07-16 00:22:49.125068] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:02.366 pt1 00:28:02.366 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:02.366 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:02.366 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:28:02.366 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:28:02.366 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:28:02.366 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:02.366 00:22:49 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:02.366 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:02.366 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:28:02.636 malloc2 00:28:02.636 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:02.926 [2024-07-16 00:22:49.619598] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:02.926 [2024-07-16 00:22:49.619649] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:02.926 [2024-07-16 00:22:49.619669] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf4f250 00:28:02.926 [2024-07-16 00:22:49.619682] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:02.926 [2024-07-16 00:22:49.621206] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:02.926 [2024-07-16 00:22:49.621235] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:02.926 pt2 00:28:02.926 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:02.926 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:02.926 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:28:02.926 [2024-07-16 00:22:49.852224] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:02.926 [2024-07-16 00:22:49.853612] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:02.926 [2024-07-16 00:22:49.853760] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf4fd20 00:28:02.926 [2024-07-16 00:22:49.853773] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:02.926 [2024-07-16 00:22:49.853848] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf43a60 00:28:02.926 [2024-07-16 00:22:49.853978] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf4fd20 00:28:02.926 [2024-07-16 00:22:49.853989] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf4fd20 00:28:02.926 [2024-07-16 00:22:49.854062] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:02.926 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:02.926 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:02.926 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:02.926 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:02.926 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:02.926 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:02.926 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:02.926 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:02.926 00:22:49 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:02.926 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:02.926 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:02.926 00:22:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:03.185 00:22:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:03.185 "name": "raid_bdev1", 00:28:03.185 "uuid": "ac1574a2-98d5-460b-8a08-7ef71450329c", 00:28:03.185 "strip_size_kb": 0, 00:28:03.185 "state": "online", 00:28:03.185 "raid_level": "raid1", 00:28:03.185 "superblock": true, 00:28:03.185 "num_base_bdevs": 2, 00:28:03.185 "num_base_bdevs_discovered": 2, 00:28:03.185 "num_base_bdevs_operational": 2, 00:28:03.185 "base_bdevs_list": [ 00:28:03.185 { 00:28:03.185 "name": "pt1", 00:28:03.185 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:03.185 "is_configured": true, 00:28:03.185 "data_offset": 256, 00:28:03.185 "data_size": 7936 00:28:03.185 }, 00:28:03.185 { 00:28:03.185 "name": "pt2", 00:28:03.185 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:03.185 "is_configured": true, 00:28:03.185 "data_offset": 256, 00:28:03.185 "data_size": 7936 00:28:03.185 } 00:28:03.185 ] 00:28:03.185 }' 00:28:03.444 00:22:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:03.444 00:22:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:04.010 00:22:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:28:04.010 00:22:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=raid_bdev1 00:28:04.010 00:22:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:04.010 00:22:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:04.010 00:22:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:04.010 00:22:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:04.010 00:22:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:04.010 00:22:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:04.269 [2024-07-16 00:22:50.979467] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:04.269 00:22:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:04.269 "name": "raid_bdev1", 00:28:04.269 "aliases": [ 00:28:04.269 "ac1574a2-98d5-460b-8a08-7ef71450329c" 00:28:04.269 ], 00:28:04.269 "product_name": "Raid Volume", 00:28:04.269 "block_size": 4096, 00:28:04.269 "num_blocks": 7936, 00:28:04.269 "uuid": "ac1574a2-98d5-460b-8a08-7ef71450329c", 00:28:04.269 "md_size": 32, 00:28:04.269 "md_interleave": false, 00:28:04.269 "dif_type": 0, 00:28:04.269 "assigned_rate_limits": { 00:28:04.269 "rw_ios_per_sec": 0, 00:28:04.269 "rw_mbytes_per_sec": 0, 00:28:04.269 "r_mbytes_per_sec": 0, 00:28:04.269 "w_mbytes_per_sec": 0 00:28:04.269 }, 00:28:04.269 "claimed": false, 00:28:04.269 "zoned": false, 00:28:04.269 "supported_io_types": { 00:28:04.269 "read": true, 00:28:04.269 "write": true, 00:28:04.269 "unmap": false, 00:28:04.269 "flush": false, 00:28:04.269 "reset": true, 00:28:04.269 "nvme_admin": false, 00:28:04.269 "nvme_io": false, 00:28:04.269 "nvme_io_md": false, 00:28:04.269 "write_zeroes": true, 
00:28:04.269 "zcopy": false, 00:28:04.269 "get_zone_info": false, 00:28:04.269 "zone_management": false, 00:28:04.269 "zone_append": false, 00:28:04.269 "compare": false, 00:28:04.269 "compare_and_write": false, 00:28:04.269 "abort": false, 00:28:04.269 "seek_hole": false, 00:28:04.269 "seek_data": false, 00:28:04.269 "copy": false, 00:28:04.269 "nvme_iov_md": false 00:28:04.269 }, 00:28:04.269 "memory_domains": [ 00:28:04.269 { 00:28:04.269 "dma_device_id": "system", 00:28:04.269 "dma_device_type": 1 00:28:04.269 }, 00:28:04.269 { 00:28:04.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:04.269 "dma_device_type": 2 00:28:04.269 }, 00:28:04.269 { 00:28:04.269 "dma_device_id": "system", 00:28:04.269 "dma_device_type": 1 00:28:04.269 }, 00:28:04.269 { 00:28:04.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:04.269 "dma_device_type": 2 00:28:04.269 } 00:28:04.269 ], 00:28:04.269 "driver_specific": { 00:28:04.269 "raid": { 00:28:04.269 "uuid": "ac1574a2-98d5-460b-8a08-7ef71450329c", 00:28:04.269 "strip_size_kb": 0, 00:28:04.269 "state": "online", 00:28:04.269 "raid_level": "raid1", 00:28:04.269 "superblock": true, 00:28:04.269 "num_base_bdevs": 2, 00:28:04.269 "num_base_bdevs_discovered": 2, 00:28:04.269 "num_base_bdevs_operational": 2, 00:28:04.269 "base_bdevs_list": [ 00:28:04.269 { 00:28:04.269 "name": "pt1", 00:28:04.269 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:04.269 "is_configured": true, 00:28:04.269 "data_offset": 256, 00:28:04.269 "data_size": 7936 00:28:04.269 }, 00:28:04.269 { 00:28:04.269 "name": "pt2", 00:28:04.269 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:04.269 "is_configured": true, 00:28:04.269 "data_offset": 256, 00:28:04.269 "data_size": 7936 00:28:04.269 } 00:28:04.269 ] 00:28:04.269 } 00:28:04.269 } 00:28:04.269 }' 00:28:04.269 00:22:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:04.269 00:22:51 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:04.269 pt2' 00:28:04.269 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:04.269 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:04.269 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:04.528 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:04.528 "name": "pt1", 00:28:04.528 "aliases": [ 00:28:04.528 "00000000-0000-0000-0000-000000000001" 00:28:04.528 ], 00:28:04.528 "product_name": "passthru", 00:28:04.528 "block_size": 4096, 00:28:04.528 "num_blocks": 8192, 00:28:04.528 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:04.528 "md_size": 32, 00:28:04.528 "md_interleave": false, 00:28:04.528 "dif_type": 0, 00:28:04.528 "assigned_rate_limits": { 00:28:04.528 "rw_ios_per_sec": 0, 00:28:04.528 "rw_mbytes_per_sec": 0, 00:28:04.528 "r_mbytes_per_sec": 0, 00:28:04.528 "w_mbytes_per_sec": 0 00:28:04.528 }, 00:28:04.528 "claimed": true, 00:28:04.528 "claim_type": "exclusive_write", 00:28:04.528 "zoned": false, 00:28:04.528 "supported_io_types": { 00:28:04.528 "read": true, 00:28:04.528 "write": true, 00:28:04.528 "unmap": true, 00:28:04.528 "flush": true, 00:28:04.528 "reset": true, 00:28:04.528 "nvme_admin": false, 00:28:04.528 "nvme_io": false, 00:28:04.528 "nvme_io_md": false, 00:28:04.528 "write_zeroes": true, 00:28:04.528 "zcopy": true, 00:28:04.528 "get_zone_info": false, 00:28:04.528 "zone_management": false, 00:28:04.528 "zone_append": false, 00:28:04.528 "compare": false, 00:28:04.528 "compare_and_write": false, 00:28:04.528 "abort": true, 00:28:04.528 "seek_hole": false, 00:28:04.528 "seek_data": false, 00:28:04.528 "copy": true, 00:28:04.528 
"nvme_iov_md": false 00:28:04.528 }, 00:28:04.528 "memory_domains": [ 00:28:04.528 { 00:28:04.528 "dma_device_id": "system", 00:28:04.528 "dma_device_type": 1 00:28:04.528 }, 00:28:04.528 { 00:28:04.528 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:04.528 "dma_device_type": 2 00:28:04.528 } 00:28:04.528 ], 00:28:04.528 "driver_specific": { 00:28:04.528 "passthru": { 00:28:04.528 "name": "pt1", 00:28:04.528 "base_bdev_name": "malloc1" 00:28:04.528 } 00:28:04.528 } 00:28:04.528 }' 00:28:04.528 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:04.528 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:04.528 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:04.528 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:04.528 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:04.528 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:04.528 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:04.786 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:04.786 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:04.786 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:04.786 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:04.786 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:04.786 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:04.786 00:22:51 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:04.786 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:05.044 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:05.044 "name": "pt2", 00:28:05.044 "aliases": [ 00:28:05.045 "00000000-0000-0000-0000-000000000002" 00:28:05.045 ], 00:28:05.045 "product_name": "passthru", 00:28:05.045 "block_size": 4096, 00:28:05.045 "num_blocks": 8192, 00:28:05.045 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:05.045 "md_size": 32, 00:28:05.045 "md_interleave": false, 00:28:05.045 "dif_type": 0, 00:28:05.045 "assigned_rate_limits": { 00:28:05.045 "rw_ios_per_sec": 0, 00:28:05.045 "rw_mbytes_per_sec": 0, 00:28:05.045 "r_mbytes_per_sec": 0, 00:28:05.045 "w_mbytes_per_sec": 0 00:28:05.045 }, 00:28:05.045 "claimed": true, 00:28:05.045 "claim_type": "exclusive_write", 00:28:05.045 "zoned": false, 00:28:05.045 "supported_io_types": { 00:28:05.045 "read": true, 00:28:05.045 "write": true, 00:28:05.045 "unmap": true, 00:28:05.045 "flush": true, 00:28:05.045 "reset": true, 00:28:05.045 "nvme_admin": false, 00:28:05.045 "nvme_io": false, 00:28:05.045 "nvme_io_md": false, 00:28:05.045 "write_zeroes": true, 00:28:05.045 "zcopy": true, 00:28:05.045 "get_zone_info": false, 00:28:05.045 "zone_management": false, 00:28:05.045 "zone_append": false, 00:28:05.045 "compare": false, 00:28:05.045 "compare_and_write": false, 00:28:05.045 "abort": true, 00:28:05.045 "seek_hole": false, 00:28:05.045 "seek_data": false, 00:28:05.045 "copy": true, 00:28:05.045 "nvme_iov_md": false 00:28:05.045 }, 00:28:05.045 "memory_domains": [ 00:28:05.045 { 00:28:05.045 "dma_device_id": "system", 00:28:05.045 "dma_device_type": 1 00:28:05.045 }, 00:28:05.045 { 00:28:05.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:05.045 "dma_device_type": 2 00:28:05.045 } 
00:28:05.045 ], 00:28:05.045 "driver_specific": { 00:28:05.045 "passthru": { 00:28:05.045 "name": "pt2", 00:28:05.045 "base_bdev_name": "malloc2" 00:28:05.045 } 00:28:05.045 } 00:28:05.045 }' 00:28:05.045 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:05.045 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:05.045 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:05.045 00:22:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:05.303 00:22:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:05.303 00:22:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:05.303 00:22:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:05.303 00:22:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:05.303 00:22:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:05.303 00:22:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:05.303 00:22:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:05.562 00:22:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:05.562 00:22:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:05.562 00:22:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:28:05.562 [2024-07-16 00:22:52.487453] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:05.562 00:22:52 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=ac1574a2-98d5-460b-8a08-7ef71450329c 00:28:05.562 00:22:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z ac1574a2-98d5-460b-8a08-7ef71450329c ']' 00:28:05.562 00:22:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:05.820 [2024-07-16 00:22:52.723824] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:05.820 [2024-07-16 00:22:52.723848] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:05.820 [2024-07-16 00:22:52.723902] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:05.820 [2024-07-16 00:22:52.723959] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:05.820 [2024-07-16 00:22:52.723971] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf4fd20 name raid_bdev1, state offline 00:28:05.820 00:22:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:28:05.820 00:22:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:06.078 00:22:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:28:06.078 00:22:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:28:06.078 00:22:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:06.078 00:22:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:28:06.336 00:22:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:06.336 00:22:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:06.593 00:22:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:28:06.593 00:22:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:28:06.849 00:22:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:28:06.849 00:22:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:06.849 00:22:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:28:06.849 00:22:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:06.849 00:22:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:06.849 00:22:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:06.849 00:22:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:06.849 00:22:53 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:06.849 00:22:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:06.849 00:22:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:06.849 00:22:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:06.849 00:22:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:06.849 00:22:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:07.106 [2024-07-16 00:22:53.951020] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:28:07.106 [2024-07-16 00:22:53.952402] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:28:07.106 [2024-07-16 00:22:53.952458] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:28:07.106 [2024-07-16 00:22:53.952498] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:28:07.106 [2024-07-16 00:22:53.952517] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:07.106 [2024-07-16 00:22:53.952526] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdbfed0 name raid_bdev1, state configuring 00:28:07.106 request: 00:28:07.106 { 00:28:07.106 "name": "raid_bdev1", 00:28:07.106 "raid_level": "raid1", 00:28:07.106 "base_bdevs": [ 
00:28:07.106 "malloc1", 00:28:07.106 "malloc2" 00:28:07.106 ], 00:28:07.106 "superblock": false, 00:28:07.106 "method": "bdev_raid_create", 00:28:07.106 "req_id": 1 00:28:07.106 } 00:28:07.106 Got JSON-RPC error response 00:28:07.106 response: 00:28:07.106 { 00:28:07.106 "code": -17, 00:28:07.106 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:28:07.106 } 00:28:07.106 00:22:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:28:07.106 00:22:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:07.106 00:22:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:07.106 00:22:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:07.106 00:22:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:07.106 00:22:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:28:07.368 00:22:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:28:07.368 00:22:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:28:07.368 00:22:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:07.625 [2024-07-16 00:22:54.428231] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:07.625 [2024-07-16 00:22:54.428275] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:07.625 [2024-07-16 00:22:54.428293] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf5dee0 
00:28:07.625 [2024-07-16 00:22:54.428306] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:07.625 [2024-07-16 00:22:54.429772] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:07.625 [2024-07-16 00:22:54.429798] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:07.625 [2024-07-16 00:22:54.429844] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:07.625 [2024-07-16 00:22:54.429868] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:07.625 pt1 00:28:07.625 00:22:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:28:07.625 00:22:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:07.625 00:22:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:07.625 00:22:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:07.625 00:22:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:07.625 00:22:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:07.625 00:22:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:07.625 00:22:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:07.625 00:22:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:07.625 00:22:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:07.625 00:22:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:07.625 00:22:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:07.882 00:22:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:07.882 "name": "raid_bdev1", 00:28:07.882 "uuid": "ac1574a2-98d5-460b-8a08-7ef71450329c", 00:28:07.882 "strip_size_kb": 0, 00:28:07.882 "state": "configuring", 00:28:07.882 "raid_level": "raid1", 00:28:07.882 "superblock": true, 00:28:07.882 "num_base_bdevs": 2, 00:28:07.882 "num_base_bdevs_discovered": 1, 00:28:07.882 "num_base_bdevs_operational": 2, 00:28:07.882 "base_bdevs_list": [ 00:28:07.882 { 00:28:07.882 "name": "pt1", 00:28:07.882 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:07.882 "is_configured": true, 00:28:07.882 "data_offset": 256, 00:28:07.882 "data_size": 7936 00:28:07.882 }, 00:28:07.882 { 00:28:07.882 "name": null, 00:28:07.882 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:07.882 "is_configured": false, 00:28:07.882 "data_offset": 256, 00:28:07.882 "data_size": 7936 00:28:07.882 } 00:28:07.882 ] 00:28:07.882 }' 00:28:07.882 00:22:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:07.882 00:22:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:08.502 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:28:08.502 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:28:08.502 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:08.502 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:08.760 [2024-07-16 00:22:55.575294] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:08.760 [2024-07-16 00:22:55.575343] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:08.760 [2024-07-16 00:22:55.575363] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdc0490 00:28:08.760 [2024-07-16 00:22:55.575376] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:08.760 [2024-07-16 00:22:55.575566] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:08.760 [2024-07-16 00:22:55.575582] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:08.760 [2024-07-16 00:22:55.575628] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:08.760 [2024-07-16 00:22:55.575646] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:08.760 [2024-07-16 00:22:55.575733] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf445d0 00:28:08.760 [2024-07-16 00:22:55.575744] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:08.760 [2024-07-16 00:22:55.575798] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf45800 00:28:08.760 [2024-07-16 00:22:55.575897] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf445d0 00:28:08.760 [2024-07-16 00:22:55.575907] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf445d0 00:28:08.760 [2024-07-16 00:22:55.575984] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:08.760 pt2 00:28:08.760 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:28:08.760 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < 
num_base_bdevs )) 00:28:08.760 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:08.760 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:08.760 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:08.760 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:08.760 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:08.760 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:08.760 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:08.760 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:08.760 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:08.760 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:08.760 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:08.760 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:09.018 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:09.018 "name": "raid_bdev1", 00:28:09.018 "uuid": "ac1574a2-98d5-460b-8a08-7ef71450329c", 00:28:09.018 "strip_size_kb": 0, 00:28:09.018 "state": "online", 00:28:09.018 "raid_level": "raid1", 00:28:09.018 "superblock": true, 00:28:09.018 "num_base_bdevs": 2, 00:28:09.018 
"num_base_bdevs_discovered": 2, 00:28:09.018 "num_base_bdevs_operational": 2, 00:28:09.018 "base_bdevs_list": [ 00:28:09.018 { 00:28:09.018 "name": "pt1", 00:28:09.018 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:09.018 "is_configured": true, 00:28:09.018 "data_offset": 256, 00:28:09.018 "data_size": 7936 00:28:09.018 }, 00:28:09.018 { 00:28:09.018 "name": "pt2", 00:28:09.018 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:09.018 "is_configured": true, 00:28:09.018 "data_offset": 256, 00:28:09.018 "data_size": 7936 00:28:09.018 } 00:28:09.018 ] 00:28:09.018 }' 00:28:09.018 00:22:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:09.018 00:22:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:09.582 00:22:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:28:09.583 00:22:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:09.583 00:22:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:09.583 00:22:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:09.583 00:22:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:09.583 00:22:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:09.583 00:22:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:09.583 00:22:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:09.841 [2024-07-16 00:22:56.674459] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:09.841 00:22:56 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:09.841 "name": "raid_bdev1", 00:28:09.841 "aliases": [ 00:28:09.841 "ac1574a2-98d5-460b-8a08-7ef71450329c" 00:28:09.841 ], 00:28:09.841 "product_name": "Raid Volume", 00:28:09.841 "block_size": 4096, 00:28:09.841 "num_blocks": 7936, 00:28:09.841 "uuid": "ac1574a2-98d5-460b-8a08-7ef71450329c", 00:28:09.841 "md_size": 32, 00:28:09.841 "md_interleave": false, 00:28:09.841 "dif_type": 0, 00:28:09.841 "assigned_rate_limits": { 00:28:09.841 "rw_ios_per_sec": 0, 00:28:09.841 "rw_mbytes_per_sec": 0, 00:28:09.841 "r_mbytes_per_sec": 0, 00:28:09.841 "w_mbytes_per_sec": 0 00:28:09.841 }, 00:28:09.841 "claimed": false, 00:28:09.841 "zoned": false, 00:28:09.841 "supported_io_types": { 00:28:09.841 "read": true, 00:28:09.841 "write": true, 00:28:09.841 "unmap": false, 00:28:09.841 "flush": false, 00:28:09.841 "reset": true, 00:28:09.841 "nvme_admin": false, 00:28:09.841 "nvme_io": false, 00:28:09.841 "nvme_io_md": false, 00:28:09.841 "write_zeroes": true, 00:28:09.841 "zcopy": false, 00:28:09.841 "get_zone_info": false, 00:28:09.841 "zone_management": false, 00:28:09.841 "zone_append": false, 00:28:09.841 "compare": false, 00:28:09.841 "compare_and_write": false, 00:28:09.841 "abort": false, 00:28:09.841 "seek_hole": false, 00:28:09.841 "seek_data": false, 00:28:09.841 "copy": false, 00:28:09.841 "nvme_iov_md": false 00:28:09.841 }, 00:28:09.841 "memory_domains": [ 00:28:09.841 { 00:28:09.841 "dma_device_id": "system", 00:28:09.841 "dma_device_type": 1 00:28:09.841 }, 00:28:09.841 { 00:28:09.841 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:09.841 "dma_device_type": 2 00:28:09.841 }, 00:28:09.841 { 00:28:09.841 "dma_device_id": "system", 00:28:09.841 "dma_device_type": 1 00:28:09.841 }, 00:28:09.841 { 00:28:09.841 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:09.841 "dma_device_type": 2 00:28:09.841 } 00:28:09.841 ], 00:28:09.841 "driver_specific": { 00:28:09.841 "raid": { 
00:28:09.841 "uuid": "ac1574a2-98d5-460b-8a08-7ef71450329c", 00:28:09.841 "strip_size_kb": 0, 00:28:09.841 "state": "online", 00:28:09.841 "raid_level": "raid1", 00:28:09.841 "superblock": true, 00:28:09.841 "num_base_bdevs": 2, 00:28:09.841 "num_base_bdevs_discovered": 2, 00:28:09.841 "num_base_bdevs_operational": 2, 00:28:09.841 "base_bdevs_list": [ 00:28:09.841 { 00:28:09.841 "name": "pt1", 00:28:09.841 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:09.841 "is_configured": true, 00:28:09.841 "data_offset": 256, 00:28:09.841 "data_size": 7936 00:28:09.841 }, 00:28:09.841 { 00:28:09.841 "name": "pt2", 00:28:09.841 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:09.841 "is_configured": true, 00:28:09.841 "data_offset": 256, 00:28:09.841 "data_size": 7936 00:28:09.841 } 00:28:09.841 ] 00:28:09.841 } 00:28:09.841 } 00:28:09.841 }' 00:28:09.841 00:22:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:09.841 00:22:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:09.841 pt2' 00:28:09.841 00:22:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:09.841 00:22:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:09.841 00:22:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:10.099 00:22:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:10.099 "name": "pt1", 00:28:10.099 "aliases": [ 00:28:10.099 "00000000-0000-0000-0000-000000000001" 00:28:10.099 ], 00:28:10.099 "product_name": "passthru", 00:28:10.099 "block_size": 4096, 00:28:10.099 "num_blocks": 8192, 00:28:10.099 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:28:10.099 "md_size": 32, 00:28:10.099 "md_interleave": false, 00:28:10.099 "dif_type": 0, 00:28:10.099 "assigned_rate_limits": { 00:28:10.099 "rw_ios_per_sec": 0, 00:28:10.099 "rw_mbytes_per_sec": 0, 00:28:10.099 "r_mbytes_per_sec": 0, 00:28:10.099 "w_mbytes_per_sec": 0 00:28:10.099 }, 00:28:10.099 "claimed": true, 00:28:10.099 "claim_type": "exclusive_write", 00:28:10.099 "zoned": false, 00:28:10.099 "supported_io_types": { 00:28:10.099 "read": true, 00:28:10.099 "write": true, 00:28:10.100 "unmap": true, 00:28:10.100 "flush": true, 00:28:10.100 "reset": true, 00:28:10.100 "nvme_admin": false, 00:28:10.100 "nvme_io": false, 00:28:10.100 "nvme_io_md": false, 00:28:10.100 "write_zeroes": true, 00:28:10.100 "zcopy": true, 00:28:10.100 "get_zone_info": false, 00:28:10.100 "zone_management": false, 00:28:10.100 "zone_append": false, 00:28:10.100 "compare": false, 00:28:10.100 "compare_and_write": false, 00:28:10.100 "abort": true, 00:28:10.100 "seek_hole": false, 00:28:10.100 "seek_data": false, 00:28:10.100 "copy": true, 00:28:10.100 "nvme_iov_md": false 00:28:10.100 }, 00:28:10.100 "memory_domains": [ 00:28:10.100 { 00:28:10.100 "dma_device_id": "system", 00:28:10.100 "dma_device_type": 1 00:28:10.100 }, 00:28:10.100 { 00:28:10.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:10.100 "dma_device_type": 2 00:28:10.100 } 00:28:10.100 ], 00:28:10.100 "driver_specific": { 00:28:10.100 "passthru": { 00:28:10.100 "name": "pt1", 00:28:10.100 "base_bdev_name": "malloc1" 00:28:10.100 } 00:28:10.100 } 00:28:10.100 }' 00:28:10.100 00:22:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:10.100 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:10.358 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:10.358 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:28:10.358 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:10.358 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:10.358 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:10.358 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:10.358 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:10.358 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:10.358 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:10.616 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:10.616 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:10.616 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:10.616 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:10.874 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:10.874 "name": "pt2", 00:28:10.874 "aliases": [ 00:28:10.874 "00000000-0000-0000-0000-000000000002" 00:28:10.874 ], 00:28:10.874 "product_name": "passthru", 00:28:10.874 "block_size": 4096, 00:28:10.874 "num_blocks": 8192, 00:28:10.874 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:10.874 "md_size": 32, 00:28:10.874 "md_interleave": false, 00:28:10.874 "dif_type": 0, 00:28:10.874 "assigned_rate_limits": { 00:28:10.874 "rw_ios_per_sec": 0, 00:28:10.874 "rw_mbytes_per_sec": 0, 00:28:10.874 "r_mbytes_per_sec": 0, 00:28:10.874 
"w_mbytes_per_sec": 0 00:28:10.874 }, 00:28:10.874 "claimed": true, 00:28:10.874 "claim_type": "exclusive_write", 00:28:10.874 "zoned": false, 00:28:10.874 "supported_io_types": { 00:28:10.874 "read": true, 00:28:10.874 "write": true, 00:28:10.874 "unmap": true, 00:28:10.874 "flush": true, 00:28:10.874 "reset": true, 00:28:10.874 "nvme_admin": false, 00:28:10.874 "nvme_io": false, 00:28:10.874 "nvme_io_md": false, 00:28:10.874 "write_zeroes": true, 00:28:10.874 "zcopy": true, 00:28:10.874 "get_zone_info": false, 00:28:10.874 "zone_management": false, 00:28:10.874 "zone_append": false, 00:28:10.874 "compare": false, 00:28:10.874 "compare_and_write": false, 00:28:10.874 "abort": true, 00:28:10.874 "seek_hole": false, 00:28:10.874 "seek_data": false, 00:28:10.874 "copy": true, 00:28:10.874 "nvme_iov_md": false 00:28:10.874 }, 00:28:10.874 "memory_domains": [ 00:28:10.874 { 00:28:10.874 "dma_device_id": "system", 00:28:10.874 "dma_device_type": 1 00:28:10.874 }, 00:28:10.874 { 00:28:10.874 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:10.874 "dma_device_type": 2 00:28:10.874 } 00:28:10.874 ], 00:28:10.874 "driver_specific": { 00:28:10.874 "passthru": { 00:28:10.874 "name": "pt2", 00:28:10.874 "base_bdev_name": "malloc2" 00:28:10.874 } 00:28:10.874 } 00:28:10.874 }' 00:28:10.874 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:10.874 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:10.874 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:10.874 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:10.874 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:10.874 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:10.874 00:22:57 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:10.874 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:11.132 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:11.132 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:11.132 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:11.132 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:11.132 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:11.132 00:22:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:28:11.390 [2024-07-16 00:22:58.174456] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:11.390 00:22:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' ac1574a2-98d5-460b-8a08-7ef71450329c '!=' ac1574a2-98d5-460b-8a08-7ef71450329c ']' 00:28:11.390 00:22:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:28:11.390 00:22:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:11.390 00:22:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:28:11.390 00:22:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:11.648 [2024-07-16 00:22:58.422862] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:28:11.648 00:22:58 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:11.649 00:22:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:11.649 00:22:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:11.649 00:22:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:11.649 00:22:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:11.649 00:22:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:11.649 00:22:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:11.649 00:22:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:11.649 00:22:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:11.649 00:22:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:11.649 00:22:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:11.649 00:22:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:11.907 00:22:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:11.907 "name": "raid_bdev1", 00:28:11.907 "uuid": "ac1574a2-98d5-460b-8a08-7ef71450329c", 00:28:11.907 "strip_size_kb": 0, 00:28:11.907 "state": "online", 00:28:11.907 "raid_level": "raid1", 00:28:11.907 "superblock": true, 00:28:11.907 "num_base_bdevs": 2, 00:28:11.907 "num_base_bdevs_discovered": 1, 00:28:11.907 "num_base_bdevs_operational": 1, 00:28:11.907 
"base_bdevs_list": [ 00:28:11.907 { 00:28:11.907 "name": null, 00:28:11.907 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:11.907 "is_configured": false, 00:28:11.907 "data_offset": 256, 00:28:11.907 "data_size": 7936 00:28:11.907 }, 00:28:11.907 { 00:28:11.907 "name": "pt2", 00:28:11.907 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:11.907 "is_configured": true, 00:28:11.907 "data_offset": 256, 00:28:11.907 "data_size": 7936 00:28:11.907 } 00:28:11.907 ] 00:28:11.907 }' 00:28:11.907 00:22:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:11.907 00:22:58 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:12.473 00:22:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:12.731 [2024-07-16 00:22:59.509716] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:12.731 [2024-07-16 00:22:59.509742] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:12.731 [2024-07-16 00:22:59.509792] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:12.731 [2024-07-16 00:22:59.509834] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:12.731 [2024-07-16 00:22:59.509846] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf445d0 name raid_bdev1, state offline 00:28:12.731 00:22:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:12.731 00:22:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:28:12.989 00:22:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- 
# raid_bdev= 00:28:12.989 00:22:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:28:12.989 00:22:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:28:12.989 00:22:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:12.989 00:22:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:13.248 00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:28:13.248 00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:13.248 00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:28:13.248 00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:28:13.248 00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@518 -- # i=1 00:28:13.248 00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:13.814 [2024-07-16 00:23:00.520350] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:13.814 [2024-07-16 00:23:00.520400] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:13.814 [2024-07-16 00:23:00.520419] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf42660 00:28:13.814 [2024-07-16 00:23:00.520433] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:13.814 [2024-07-16 00:23:00.521917] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:13.814 [2024-07-16 
00:23:00.521954] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:13.814 [2024-07-16 00:23:00.522003] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:13.814 [2024-07-16 00:23:00.522028] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:13.814 [2024-07-16 00:23:00.522105] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf44d10 00:28:13.814 [2024-07-16 00:23:00.522116] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:13.815 [2024-07-16 00:23:00.522174] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf45560 00:28:13.815 [2024-07-16 00:23:00.522270] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf44d10 00:28:13.815 [2024-07-16 00:23:00.522280] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf44d10 00:28:13.815 [2024-07-16 00:23:00.522345] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:13.815 pt2 00:28:13.815 00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:13.815 00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:13.815 00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:13.815 00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:13.815 00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:13.815 00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:13.815 00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:13.815 
00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:13.815 00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:13.815 00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:13.815 00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:13.815 00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:14.073 00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:14.073 "name": "raid_bdev1", 00:28:14.073 "uuid": "ac1574a2-98d5-460b-8a08-7ef71450329c", 00:28:14.073 "strip_size_kb": 0, 00:28:14.073 "state": "online", 00:28:14.073 "raid_level": "raid1", 00:28:14.073 "superblock": true, 00:28:14.073 "num_base_bdevs": 2, 00:28:14.073 "num_base_bdevs_discovered": 1, 00:28:14.073 "num_base_bdevs_operational": 1, 00:28:14.073 "base_bdevs_list": [ 00:28:14.073 { 00:28:14.073 "name": null, 00:28:14.073 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:14.073 "is_configured": false, 00:28:14.073 "data_offset": 256, 00:28:14.073 "data_size": 7936 00:28:14.073 }, 00:28:14.073 { 00:28:14.073 "name": "pt2", 00:28:14.073 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:14.073 "is_configured": true, 00:28:14.073 "data_offset": 256, 00:28:14.073 "data_size": 7936 00:28:14.073 } 00:28:14.073 ] 00:28:14.073 }' 00:28:14.073 00:23:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:14.073 00:23:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:14.639 00:23:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:14.897 [2024-07-16 00:23:01.643311] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:14.897 [2024-07-16 00:23:01.643339] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:14.897 [2024-07-16 00:23:01.643387] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:14.897 [2024-07-16 00:23:01.643429] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:14.897 [2024-07-16 00:23:01.643440] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf44d10 name raid_bdev1, state offline 00:28:14.897 00:23:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:14.897 00:23:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:28:15.155 00:23:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:28:15.155 00:23:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:28:15.155 00:23:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:28:15.155 00:23:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:15.413 [2024-07-16 00:23:02.144624] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:15.413 [2024-07-16 00:23:02.144672] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:15.413 [2024-07-16 00:23:02.144690] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf43760 00:28:15.413 [2024-07-16 00:23:02.144703] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:15.413 [2024-07-16 00:23:02.146129] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:15.413 [2024-07-16 00:23:02.146155] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:15.413 [2024-07-16 00:23:02.146200] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:15.413 [2024-07-16 00:23:02.146223] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:15.413 [2024-07-16 00:23:02.146312] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:28:15.413 [2024-07-16 00:23:02.146325] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:15.413 [2024-07-16 00:23:02.146340] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf45850 name raid_bdev1, state configuring 00:28:15.413 [2024-07-16 00:23:02.146362] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:15.413 [2024-07-16 00:23:02.146411] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf44850 00:28:15.413 [2024-07-16 00:23:02.146421] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:15.413 [2024-07-16 00:23:02.146474] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf453b0 00:28:15.413 [2024-07-16 00:23:02.146572] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf44850 00:28:15.413 [2024-07-16 00:23:02.146593] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf44850 00:28:15.413 [2024-07-16 00:23:02.146666] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:15.413 
pt1 00:28:15.413 00:23:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:28:15.413 00:23:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:15.413 00:23:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:15.413 00:23:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:15.414 00:23:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:15.414 00:23:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:15.414 00:23:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:15.414 00:23:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:15.414 00:23:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:15.414 00:23:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:15.414 00:23:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:15.414 00:23:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:15.414 00:23:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:15.672 00:23:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:15.672 "name": "raid_bdev1", 00:28:15.672 "uuid": "ac1574a2-98d5-460b-8a08-7ef71450329c", 00:28:15.672 "strip_size_kb": 0, 00:28:15.672 "state": "online", 00:28:15.672 "raid_level": "raid1", 
00:28:15.672 "superblock": true, 00:28:15.672 "num_base_bdevs": 2, 00:28:15.672 "num_base_bdevs_discovered": 1, 00:28:15.672 "num_base_bdevs_operational": 1, 00:28:15.672 "base_bdevs_list": [ 00:28:15.672 { 00:28:15.672 "name": null, 00:28:15.672 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:15.672 "is_configured": false, 00:28:15.672 "data_offset": 256, 00:28:15.672 "data_size": 7936 00:28:15.672 }, 00:28:15.672 { 00:28:15.672 "name": "pt2", 00:28:15.672 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:15.672 "is_configured": true, 00:28:15.672 "data_offset": 256, 00:28:15.672 "data_size": 7936 00:28:15.672 } 00:28:15.672 ] 00:28:15.672 }' 00:28:15.672 00:23:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:15.672 00:23:02 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:16.239 00:23:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:28:16.239 00:23:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:28:16.496 00:23:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:28:16.496 00:23:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:16.496 00:23:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:28:16.754 [2024-07-16 00:23:03.504482] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:16.754 00:23:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' ac1574a2-98d5-460b-8a08-7ef71450329c '!=' ac1574a2-98d5-460b-8a08-7ef71450329c ']' 00:28:16.754 
00:23:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 3640890 00:28:16.754 00:23:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 3640890 ']' 00:28:16.754 00:23:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 3640890 00:28:16.754 00:23:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:28:16.754 00:23:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:16.754 00:23:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3640890 00:28:16.754 00:23:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:16.754 00:23:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:16.754 00:23:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3640890' 00:28:16.754 killing process with pid 3640890 00:28:16.754 00:23:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 3640890 00:28:16.754 [2024-07-16 00:23:03.573878] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:16.754 [2024-07-16 00:23:03.573931] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:16.754 [2024-07-16 00:23:03.573973] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:16.754 [2024-07-16 00:23:03.573984] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf44850 name raid_bdev1, state offline 00:28:16.754 00:23:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- # wait 3640890 00:28:16.754 [2024-07-16 00:23:03.600132] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: 
raid_bdev_exit 00:28:17.012 00:23:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:28:17.012 00:28:17.012 real 0m16.189s 00:28:17.012 user 0m29.417s 00:28:17.012 sys 0m2.933s 00:28:17.012 00:23:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:17.012 00:23:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:17.012 ************************************ 00:28:17.012 END TEST raid_superblock_test_md_separate 00:28:17.012 ************************************ 00:28:17.012 00:23:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:17.012 00:23:03 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:28:17.012 00:23:03 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:28:17.012 00:23:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:28:17.012 00:23:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:17.012 00:23:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:17.012 ************************************ 00:28:17.012 START TEST raid_rebuild_test_sb_md_separate 00:28:17.012 ************************************ 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@572 -- # local verify=true 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 
00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=3643304 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 3643304 /var/tmp/spdk-raid.sock 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 3643304 ']' 00:28:17.012 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:17.013 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:17.013 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:17.013 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:17.013 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:17.013 00:23:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:17.271 [2024-07-16 00:23:03.984206] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:28:17.271 [2024-07-16 00:23:03.984294] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3643304 ] 00:28:17.271 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:17.271 Zero copy mechanism will not be used. 00:28:17.271 [2024-07-16 00:23:04.103457] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:17.271 [2024-07-16 00:23:04.209366] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:17.530 [2024-07-16 00:23:04.277322] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:17.530 [2024-07-16 00:23:04.277362] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:18.096 00:23:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:18.096 00:23:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:28:18.096 00:23:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:18.096 00:23:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:28:18.663 BaseBdev1_malloc 00:28:18.663 00:23:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:18.922 [2024-07-16 00:23:05.657282] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:18.922 [2024-07-16 00:23:05.657337] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:18.922 [2024-07-16 
00:23:05.657363] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10966d0 00:28:18.922 [2024-07-16 00:23:05.657376] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:18.922 [2024-07-16 00:23:05.658894] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:18.922 [2024-07-16 00:23:05.658922] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:18.922 BaseBdev1 00:28:18.922 00:23:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:18.922 00:23:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:28:19.492 BaseBdev2_malloc 00:28:19.492 00:23:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:28:19.492 [2024-07-16 00:23:06.422254] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:28:19.492 [2024-07-16 00:23:06.422303] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:19.492 [2024-07-16 00:23:06.422328] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11ee1f0 00:28:19.492 [2024-07-16 00:23:06.422341] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:19.492 [2024-07-16 00:23:06.423755] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:19.492 [2024-07-16 00:23:06.423782] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:19.492 BaseBdev2 00:28:19.492 00:23:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:28:20.061 spare_malloc 00:28:20.061 00:23:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:28:20.319 spare_delay 00:28:20.319 00:23:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:20.886 [2024-07-16 00:23:07.678862] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:20.886 [2024-07-16 00:23:07.678912] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:20.886 [2024-07-16 00:23:07.678944] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11ea7a0 00:28:20.886 [2024-07-16 00:23:07.678957] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:20.886 [2024-07-16 00:23:07.680419] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:20.886 [2024-07-16 00:23:07.680447] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:20.886 spare 00:28:20.886 00:23:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:28:21.145 [2024-07-16 00:23:07.935557] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:21.145 [2024-07-16 00:23:07.936905] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:21.145 [2024-07-16 00:23:07.937084] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11eb1c0 00:28:21.145 [2024-07-16 00:23:07.937098] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:21.145 [2024-07-16 00:23:07.937175] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10fc360 00:28:21.145 [2024-07-16 00:23:07.937290] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11eb1c0 00:28:21.145 [2024-07-16 00:23:07.937305] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11eb1c0 00:28:21.145 [2024-07-16 00:23:07.937380] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:21.145 00:23:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:21.145 00:23:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:21.145 00:23:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:21.145 00:23:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:21.145 00:23:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:21.145 00:23:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:21.145 00:23:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:21.145 00:23:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:21.145 00:23:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:21.145 00:23:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:21.145 00:23:07 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:21.145 00:23:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:21.404 00:23:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:21.404 "name": "raid_bdev1", 00:28:21.404 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:21.404 "strip_size_kb": 0, 00:28:21.404 "state": "online", 00:28:21.404 "raid_level": "raid1", 00:28:21.404 "superblock": true, 00:28:21.404 "num_base_bdevs": 2, 00:28:21.404 "num_base_bdevs_discovered": 2, 00:28:21.404 "num_base_bdevs_operational": 2, 00:28:21.404 "base_bdevs_list": [ 00:28:21.404 { 00:28:21.404 "name": "BaseBdev1", 00:28:21.404 "uuid": "fa860e2f-4f05-5067-8517-0f4c93003de6", 00:28:21.404 "is_configured": true, 00:28:21.404 "data_offset": 256, 00:28:21.404 "data_size": 7936 00:28:21.404 }, 00:28:21.404 { 00:28:21.404 "name": "BaseBdev2", 00:28:21.404 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:21.404 "is_configured": true, 00:28:21.404 "data_offset": 256, 00:28:21.404 "data_size": 7936 00:28:21.404 } 00:28:21.404 ] 00:28:21.404 }' 00:28:21.404 00:23:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:21.404 00:23:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:21.969 00:23:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:21.969 00:23:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:28:22.225 [2024-07-16 00:23:09.022667] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:22.225 
00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:28:22.226 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:22.226 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:28:22.483 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:28:22.483 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:28:22.483 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:28:22.483 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:28:22.483 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:28:22.483 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:22.483 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:28:22.483 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:22.483 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:28:22.483 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:22.483 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:28:22.483 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:22.483 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:22.483 00:23:09 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:28:22.740 [2024-07-16 00:23:09.527791] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10fc360 00:28:22.740 /dev/nbd0 00:28:22.740 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:22.740 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:22.740 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:22.740 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:28:22.740 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:22.740 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:22.740 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:22.740 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:28:22.740 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:22.740 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:22.741 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:22.741 1+0 records in 00:28:22.741 1+0 records out 00:28:22.741 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024954 s, 16.4 MB/s 00:28:22.741 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:22.741 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:28:22.741 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:22.741 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:22.741 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:28:22.741 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:22.741 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:22.741 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:28:22.741 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:28:22.741 00:23:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:28:23.673 7936+0 records in 00:28:23.673 7936+0 records out 00:28:23.673 32505856 bytes (33 MB, 31 MiB) copied, 0.769177 s, 42.3 MB/s 00:28:23.673 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:28:23.673 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:23.673 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:23.673 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:23.673 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:28:23.673 00:23:10 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:23.673 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:23.931 [2024-07-16 00:23:10.642295] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:23.931 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:23.931 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:23.931 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:23.931 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:23.931 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:23.931 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:23.931 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:23.931 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:23.931 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:28:24.190 [2024-07-16 00:23:10.882996] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:24.190 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:24.190 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:24.190 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:24.190 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:24.190 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:24.190 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:24.190 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:24.190 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:24.190 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:24.190 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:24.190 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:24.190 00:23:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:24.756 00:23:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:24.756 "name": "raid_bdev1", 00:28:24.756 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:24.756 "strip_size_kb": 0, 00:28:24.756 "state": "online", 00:28:24.756 "raid_level": "raid1", 00:28:24.756 "superblock": true, 00:28:24.756 "num_base_bdevs": 2, 00:28:24.756 "num_base_bdevs_discovered": 1, 00:28:24.756 "num_base_bdevs_operational": 1, 00:28:24.756 "base_bdevs_list": [ 00:28:24.756 { 00:28:24.756 "name": null, 00:28:24.756 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:24.756 "is_configured": false, 00:28:24.756 "data_offset": 256, 00:28:24.756 "data_size": 7936 00:28:24.756 }, 00:28:24.756 { 00:28:24.756 "name": "BaseBdev2", 
00:28:24.756 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:24.756 "is_configured": true, 00:28:24.756 "data_offset": 256, 00:28:24.756 "data_size": 7936 00:28:24.756 } 00:28:24.756 ] 00:28:24.756 }' 00:28:24.756 00:23:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:24.756 00:23:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:25.322 00:23:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:25.322 [2024-07-16 00:23:12.250629] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:25.322 [2024-07-16 00:23:12.252915] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1095350 00:28:25.322 [2024-07-16 00:23:12.255199] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:25.578 00:23:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:28:26.511 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:26.511 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:26.511 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:26.511 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:26.511 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:26.511 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:26.511 00:23:13 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:26.768 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:26.768 "name": "raid_bdev1", 00:28:26.768 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:26.768 "strip_size_kb": 0, 00:28:26.768 "state": "online", 00:28:26.768 "raid_level": "raid1", 00:28:26.768 "superblock": true, 00:28:26.768 "num_base_bdevs": 2, 00:28:26.768 "num_base_bdevs_discovered": 2, 00:28:26.768 "num_base_bdevs_operational": 2, 00:28:26.768 "process": { 00:28:26.768 "type": "rebuild", 00:28:26.768 "target": "spare", 00:28:26.768 "progress": { 00:28:26.768 "blocks": 3072, 00:28:26.768 "percent": 38 00:28:26.768 } 00:28:26.768 }, 00:28:26.768 "base_bdevs_list": [ 00:28:26.768 { 00:28:26.768 "name": "spare", 00:28:26.768 "uuid": "edad62f5-6e9a-5a56-a45e-61d77f9bdc58", 00:28:26.768 "is_configured": true, 00:28:26.768 "data_offset": 256, 00:28:26.768 "data_size": 7936 00:28:26.768 }, 00:28:26.768 { 00:28:26.768 "name": "BaseBdev2", 00:28:26.768 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:26.768 "is_configured": true, 00:28:26.768 "data_offset": 256, 00:28:26.768 "data_size": 7936 00:28:26.768 } 00:28:26.768 ] 00:28:26.768 }' 00:28:26.768 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:26.768 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:26.768 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:26.768 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:26.768 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev spare 00:28:27.026 [2024-07-16 00:23:13.844401] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:27.026 [2024-07-16 00:23:13.867709] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:27.026 [2024-07-16 00:23:13.867756] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:27.026 [2024-07-16 00:23:13.867771] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:27.026 [2024-07-16 00:23:13.867780] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:27.026 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:27.026 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:27.026 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:27.026 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:27.026 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:27.026 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:27.026 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:27.026 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:27.026 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:27.026 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:27.026 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:27.026 00:23:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:27.334 00:23:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:27.334 "name": "raid_bdev1", 00:28:27.334 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:27.334 "strip_size_kb": 0, 00:28:27.334 "state": "online", 00:28:27.334 "raid_level": "raid1", 00:28:27.334 "superblock": true, 00:28:27.334 "num_base_bdevs": 2, 00:28:27.334 "num_base_bdevs_discovered": 1, 00:28:27.334 "num_base_bdevs_operational": 1, 00:28:27.334 "base_bdevs_list": [ 00:28:27.334 { 00:28:27.334 "name": null, 00:28:27.334 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:27.334 "is_configured": false, 00:28:27.334 "data_offset": 256, 00:28:27.334 "data_size": 7936 00:28:27.334 }, 00:28:27.334 { 00:28:27.334 "name": "BaseBdev2", 00:28:27.334 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:27.334 "is_configured": true, 00:28:27.334 "data_offset": 256, 00:28:27.334 "data_size": 7936 00:28:27.334 } 00:28:27.334 ] 00:28:27.334 }' 00:28:27.334 00:23:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:27.334 00:23:14 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:27.899 00:23:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:27.899 00:23:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:27.899 00:23:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:27.899 00:23:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:27.899 00:23:14 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:27.899 00:23:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:27.899 00:23:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:28.157 00:23:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:28.157 "name": "raid_bdev1", 00:28:28.157 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:28.157 "strip_size_kb": 0, 00:28:28.157 "state": "online", 00:28:28.157 "raid_level": "raid1", 00:28:28.157 "superblock": true, 00:28:28.157 "num_base_bdevs": 2, 00:28:28.157 "num_base_bdevs_discovered": 1, 00:28:28.157 "num_base_bdevs_operational": 1, 00:28:28.157 "base_bdevs_list": [ 00:28:28.157 { 00:28:28.157 "name": null, 00:28:28.157 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:28.157 "is_configured": false, 00:28:28.157 "data_offset": 256, 00:28:28.157 "data_size": 7936 00:28:28.157 }, 00:28:28.157 { 00:28:28.157 "name": "BaseBdev2", 00:28:28.157 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:28.157 "is_configured": true, 00:28:28.157 "data_offset": 256, 00:28:28.157 "data_size": 7936 00:28:28.157 } 00:28:28.157 ] 00:28:28.157 }' 00:28:28.157 00:23:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:28.157 00:23:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:28.157 00:23:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:28.157 00:23:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:28.157 00:23:15 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:28.414 [2024-07-16 00:23:15.290565] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:28.414 [2024-07-16 00:23:15.293160] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1096280 00:28:28.414 [2024-07-16 00:23:15.294728] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:28.414 00:23:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:29.789 "name": "raid_bdev1", 00:28:29.789 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:29.789 "strip_size_kb": 0, 00:28:29.789 "state": "online", 00:28:29.789 "raid_level": "raid1", 00:28:29.789 "superblock": true, 00:28:29.789 "num_base_bdevs": 2, 
00:28:29.789 "num_base_bdevs_discovered": 2, 00:28:29.789 "num_base_bdevs_operational": 2, 00:28:29.789 "process": { 00:28:29.789 "type": "rebuild", 00:28:29.789 "target": "spare", 00:28:29.789 "progress": { 00:28:29.789 "blocks": 3072, 00:28:29.789 "percent": 38 00:28:29.789 } 00:28:29.789 }, 00:28:29.789 "base_bdevs_list": [ 00:28:29.789 { 00:28:29.789 "name": "spare", 00:28:29.789 "uuid": "edad62f5-6e9a-5a56-a45e-61d77f9bdc58", 00:28:29.789 "is_configured": true, 00:28:29.789 "data_offset": 256, 00:28:29.789 "data_size": 7936 00:28:29.789 }, 00:28:29.789 { 00:28:29.789 "name": "BaseBdev2", 00:28:29.789 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:29.789 "is_configured": true, 00:28:29.789 "data_offset": 256, 00:28:29.789 "data_size": 7936 00:28:29.789 } 00:28:29.789 ] 00:28:29.789 }' 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:28:29.789 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 
-- # '[' 2 -gt 2 ']' 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=1106 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:29.789 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:30.048 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:30.048 "name": "raid_bdev1", 00:28:30.048 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:30.048 "strip_size_kb": 0, 00:28:30.048 "state": "online", 00:28:30.048 "raid_level": "raid1", 00:28:30.048 "superblock": true, 00:28:30.048 "num_base_bdevs": 2, 00:28:30.048 "num_base_bdevs_discovered": 2, 00:28:30.048 "num_base_bdevs_operational": 2, 00:28:30.048 "process": { 00:28:30.048 "type": "rebuild", 00:28:30.048 "target": "spare", 00:28:30.048 "progress": { 00:28:30.048 "blocks": 3840, 00:28:30.048 "percent": 48 00:28:30.048 } 00:28:30.048 }, 00:28:30.048 "base_bdevs_list": [ 00:28:30.048 { 00:28:30.048 "name": "spare", 00:28:30.048 "uuid": 
"edad62f5-6e9a-5a56-a45e-61d77f9bdc58", 00:28:30.048 "is_configured": true, 00:28:30.048 "data_offset": 256, 00:28:30.048 "data_size": 7936 00:28:30.048 }, 00:28:30.048 { 00:28:30.048 "name": "BaseBdev2", 00:28:30.048 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:30.048 "is_configured": true, 00:28:30.048 "data_offset": 256, 00:28:30.048 "data_size": 7936 00:28:30.048 } 00:28:30.048 ] 00:28:30.048 }' 00:28:30.048 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:30.048 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:30.048 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:30.048 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:30.048 00:23:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:31.421 00:23:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:31.421 00:23:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:31.421 00:23:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:31.421 00:23:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:31.421 00:23:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:31.421 00:23:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:31.421 00:23:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:31.421 00:23:17 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:31.421 00:23:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:31.421 "name": "raid_bdev1", 00:28:31.421 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:31.421 "strip_size_kb": 0, 00:28:31.421 "state": "online", 00:28:31.421 "raid_level": "raid1", 00:28:31.421 "superblock": true, 00:28:31.421 "num_base_bdevs": 2, 00:28:31.421 "num_base_bdevs_discovered": 2, 00:28:31.421 "num_base_bdevs_operational": 2, 00:28:31.421 "process": { 00:28:31.421 "type": "rebuild", 00:28:31.421 "target": "spare", 00:28:31.421 "progress": { 00:28:31.421 "blocks": 7168, 00:28:31.421 "percent": 90 00:28:31.421 } 00:28:31.421 }, 00:28:31.421 "base_bdevs_list": [ 00:28:31.421 { 00:28:31.421 "name": "spare", 00:28:31.421 "uuid": "edad62f5-6e9a-5a56-a45e-61d77f9bdc58", 00:28:31.421 "is_configured": true, 00:28:31.421 "data_offset": 256, 00:28:31.421 "data_size": 7936 00:28:31.421 }, 00:28:31.421 { 00:28:31.421 "name": "BaseBdev2", 00:28:31.421 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:31.421 "is_configured": true, 00:28:31.421 "data_offset": 256, 00:28:31.421 "data_size": 7936 00:28:31.421 } 00:28:31.421 ] 00:28:31.421 }' 00:28:31.421 00:23:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:31.421 00:23:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:31.421 00:23:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:31.421 00:23:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:31.421 00:23:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:31.678 [2024-07-16 00:23:18.419303] bdev_raid.c:2789:raid_bdev_process_thread_run: 
*DEBUG*: process completed on raid_bdev1 00:28:31.678 [2024-07-16 00:23:18.419372] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:28:31.679 [2024-07-16 00:23:18.419452] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:32.610 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:32.610 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:32.610 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:32.610 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:32.610 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:32.610 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:32.610 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:32.610 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:32.867 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:32.867 "name": "raid_bdev1", 00:28:32.867 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:32.867 "strip_size_kb": 0, 00:28:32.867 "state": "online", 00:28:32.867 "raid_level": "raid1", 00:28:32.867 "superblock": true, 00:28:32.867 "num_base_bdevs": 2, 00:28:32.867 "num_base_bdevs_discovered": 2, 00:28:32.867 "num_base_bdevs_operational": 2, 00:28:32.867 "base_bdevs_list": [ 00:28:32.867 { 00:28:32.867 "name": "spare", 00:28:32.867 "uuid": "edad62f5-6e9a-5a56-a45e-61d77f9bdc58", 
00:28:32.867 "is_configured": true, 00:28:32.867 "data_offset": 256, 00:28:32.867 "data_size": 7936 00:28:32.867 }, 00:28:32.867 { 00:28:32.867 "name": "BaseBdev2", 00:28:32.867 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:32.867 "is_configured": true, 00:28:32.867 "data_offset": 256, 00:28:32.867 "data_size": 7936 00:28:32.867 } 00:28:32.867 ] 00:28:32.867 }' 00:28:32.867 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:32.867 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:28:32.867 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:32.867 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:32.867 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:28:32.867 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:32.867 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:32.867 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:32.867 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:32.867 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:32.867 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:32.867 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:33.126 00:23:19 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:33.126 "name": "raid_bdev1", 00:28:33.126 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:33.126 "strip_size_kb": 0, 00:28:33.126 "state": "online", 00:28:33.126 "raid_level": "raid1", 00:28:33.126 "superblock": true, 00:28:33.126 "num_base_bdevs": 2, 00:28:33.126 "num_base_bdevs_discovered": 2, 00:28:33.126 "num_base_bdevs_operational": 2, 00:28:33.126 "base_bdevs_list": [ 00:28:33.126 { 00:28:33.126 "name": "spare", 00:28:33.126 "uuid": "edad62f5-6e9a-5a56-a45e-61d77f9bdc58", 00:28:33.126 "is_configured": true, 00:28:33.126 "data_offset": 256, 00:28:33.126 "data_size": 7936 00:28:33.126 }, 00:28:33.126 { 00:28:33.126 "name": "BaseBdev2", 00:28:33.126 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:33.126 "is_configured": true, 00:28:33.126 "data_offset": 256, 00:28:33.126 "data_size": 7936 00:28:33.126 } 00:28:33.126 ] 00:28:33.126 }' 00:28:33.126 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:33.126 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:33.126 00:23:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:33.126 00:23:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:33.126 00:23:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:33.126 00:23:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:33.126 00:23:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:33.126 00:23:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:33.126 00:23:20 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:33.126 00:23:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:33.126 00:23:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:33.126 00:23:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:33.126 00:23:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:33.126 00:23:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:33.126 00:23:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:33.126 00:23:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:33.384 00:23:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:33.384 "name": "raid_bdev1", 00:28:33.384 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:33.384 "strip_size_kb": 0, 00:28:33.384 "state": "online", 00:28:33.384 "raid_level": "raid1", 00:28:33.384 "superblock": true, 00:28:33.384 "num_base_bdevs": 2, 00:28:33.384 "num_base_bdevs_discovered": 2, 00:28:33.384 "num_base_bdevs_operational": 2, 00:28:33.384 "base_bdevs_list": [ 00:28:33.384 { 00:28:33.384 "name": "spare", 00:28:33.384 "uuid": "edad62f5-6e9a-5a56-a45e-61d77f9bdc58", 00:28:33.384 "is_configured": true, 00:28:33.384 "data_offset": 256, 00:28:33.384 "data_size": 7936 00:28:33.384 }, 00:28:33.384 { 00:28:33.384 "name": "BaseBdev2", 00:28:33.384 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:33.384 "is_configured": true, 00:28:33.384 "data_offset": 256, 00:28:33.384 "data_size": 7936 00:28:33.384 } 00:28:33.384 ] 
00:28:33.384 }' 00:28:33.384 00:23:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:33.384 00:23:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:33.950 00:23:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:34.517 [2024-07-16 00:23:21.346615] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:34.517 [2024-07-16 00:23:21.346641] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:34.517 [2024-07-16 00:23:21.346696] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:34.517 [2024-07-16 00:23:21.346749] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:34.517 [2024-07-16 00:23:21.346761] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11eb1c0 name raid_bdev1, state offline 00:28:34.517 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:34.517 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:28:34.775 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:28:34.775 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:28:34.776 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:28:34.776 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:28:34.776 00:23:21 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:34.776 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:28:34.776 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:34.776 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:34.776 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:34.776 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:28:34.776 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:34.776 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:34.776 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:28:35.034 /dev/nbd0 00:28:35.034 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:35.034 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:35.034 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:35.034 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:28:35.034 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:35.034 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:35.034 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:35.034 00:23:21 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:28:35.034 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:35.034 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:35.034 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:35.034 1+0 records in 00:28:35.034 1+0 records out 00:28:35.034 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232251 s, 17.6 MB/s 00:28:35.034 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:35.034 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:28:35.034 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:35.034 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:35.034 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:28:35.034 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:35.034 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:35.034 00:23:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:28:35.292 /dev/nbd1 00:28:35.292 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:35.292 00:23:22 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:35.292 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:28:35.292 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:28:35.292 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:35.292 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:35.292 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:28:35.292 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:28:35.292 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:35.292 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:35.292 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:35.292 1+0 records in 00:28:35.292 1+0 records out 00:28:35.292 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000317491 s, 12.9 MB/s 00:28:35.292 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:35.292 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:28:35.292 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:35.292 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:35.292 00:23:22 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:28:35.292 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:35.292 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:35.292 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:28:35.550 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:28:35.550 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:35.550 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:35.550 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:35.550 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:28:35.550 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:35.550 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:35.809 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:35.809 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:35.809 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:35.809 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:35.809 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:35.809 00:23:22 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:35.809 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:35.809 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:35.809 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:35.809 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:28:36.067 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:36.067 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:36.067 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:36.067 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:36.067 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:36.067 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:36.067 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:36.067 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:36.067 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:28:36.067 00:23:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:36.324 00:23:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:36.582 [2024-07-16 00:23:23.320101] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:36.582 [2024-07-16 00:23:23.320149] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:36.582 [2024-07-16 00:23:23.320171] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1230110 00:28:36.582 [2024-07-16 00:23:23.320184] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:36.582 [2024-07-16 00:23:23.321630] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:36.582 [2024-07-16 00:23:23.321658] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:36.582 [2024-07-16 00:23:23.321719] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:36.582 [2024-07-16 00:23:23.321744] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:36.582 [2024-07-16 00:23:23.321841] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:36.582 spare 00:28:36.582 00:23:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:36.582 00:23:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:36.582 00:23:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:36.582 00:23:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:36.582 00:23:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:36.582 00:23:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:28:36.582 00:23:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:36.582 00:23:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:36.582 00:23:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:36.582 00:23:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:36.582 00:23:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:36.582 00:23:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:36.582 [2024-07-16 00:23:23.422149] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10fc620 00:28:36.582 [2024-07-16 00:23:23.422164] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:36.582 [2024-07-16 00:23:23.422233] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11e9660 00:28:36.582 [2024-07-16 00:23:23.422352] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10fc620 00:28:36.582 [2024-07-16 00:23:23.422362] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x10fc620 00:28:36.582 [2024-07-16 00:23:23.422442] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:36.840 00:23:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:36.840 "name": "raid_bdev1", 00:28:36.840 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:36.840 "strip_size_kb": 0, 00:28:36.840 "state": "online", 00:28:36.840 "raid_level": "raid1", 00:28:36.840 "superblock": true, 00:28:36.840 "num_base_bdevs": 2, 00:28:36.840 
"num_base_bdevs_discovered": 2, 00:28:36.840 "num_base_bdevs_operational": 2, 00:28:36.840 "base_bdevs_list": [ 00:28:36.840 { 00:28:36.840 "name": "spare", 00:28:36.840 "uuid": "edad62f5-6e9a-5a56-a45e-61d77f9bdc58", 00:28:36.840 "is_configured": true, 00:28:36.840 "data_offset": 256, 00:28:36.840 "data_size": 7936 00:28:36.840 }, 00:28:36.840 { 00:28:36.840 "name": "BaseBdev2", 00:28:36.840 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:36.840 "is_configured": true, 00:28:36.840 "data_offset": 256, 00:28:36.840 "data_size": 7936 00:28:36.840 } 00:28:36.840 ] 00:28:36.840 }' 00:28:36.840 00:23:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:36.840 00:23:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:37.442 00:23:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:37.442 00:23:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:37.442 00:23:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:37.442 00:23:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:37.442 00:23:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:37.442 00:23:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:37.442 00:23:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:37.700 00:23:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:37.700 "name": "raid_bdev1", 00:28:37.700 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:37.700 
"strip_size_kb": 0, 00:28:37.700 "state": "online", 00:28:37.700 "raid_level": "raid1", 00:28:37.700 "superblock": true, 00:28:37.700 "num_base_bdevs": 2, 00:28:37.700 "num_base_bdevs_discovered": 2, 00:28:37.700 "num_base_bdevs_operational": 2, 00:28:37.700 "base_bdevs_list": [ 00:28:37.700 { 00:28:37.700 "name": "spare", 00:28:37.700 "uuid": "edad62f5-6e9a-5a56-a45e-61d77f9bdc58", 00:28:37.700 "is_configured": true, 00:28:37.700 "data_offset": 256, 00:28:37.700 "data_size": 7936 00:28:37.700 }, 00:28:37.700 { 00:28:37.700 "name": "BaseBdev2", 00:28:37.700 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:37.700 "is_configured": true, 00:28:37.700 "data_offset": 256, 00:28:37.700 "data_size": 7936 00:28:37.700 } 00:28:37.700 ] 00:28:37.700 }' 00:28:37.700 00:23:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:37.700 00:23:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:37.700 00:23:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:37.700 00:23:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:37.700 00:23:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:37.700 00:23:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:28:37.957 00:23:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:28:37.957 00:23:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:38.214 [2024-07-16 00:23:25.020730] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:38.214 00:23:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:38.214 00:23:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:38.214 00:23:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:38.214 00:23:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:38.214 00:23:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:38.214 00:23:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:38.214 00:23:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:38.214 00:23:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:38.214 00:23:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:38.214 00:23:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:38.214 00:23:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:38.214 00:23:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:38.471 00:23:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:38.471 "name": "raid_bdev1", 00:28:38.471 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:38.471 "strip_size_kb": 0, 00:28:38.471 "state": "online", 00:28:38.471 "raid_level": "raid1", 00:28:38.471 "superblock": true, 00:28:38.471 
"num_base_bdevs": 2, 00:28:38.471 "num_base_bdevs_discovered": 1, 00:28:38.471 "num_base_bdevs_operational": 1, 00:28:38.471 "base_bdevs_list": [ 00:28:38.471 { 00:28:38.471 "name": null, 00:28:38.471 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:38.471 "is_configured": false, 00:28:38.471 "data_offset": 256, 00:28:38.471 "data_size": 7936 00:28:38.471 }, 00:28:38.471 { 00:28:38.471 "name": "BaseBdev2", 00:28:38.471 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:38.471 "is_configured": true, 00:28:38.471 "data_offset": 256, 00:28:38.471 "data_size": 7936 00:28:38.471 } 00:28:38.471 ] 00:28:38.471 }' 00:28:38.471 00:23:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:38.471 00:23:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:39.035 00:23:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:39.292 [2024-07-16 00:23:26.111650] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:39.292 [2024-07-16 00:23:26.111803] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:39.292 [2024-07-16 00:23:26.111821] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:28:39.292 [2024-07-16 00:23:26.111849] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:39.292 [2024-07-16 00:23:26.114028] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1230ab0 00:28:39.292 [2024-07-16 00:23:26.115332] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:39.292 00:23:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:28:40.223 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:40.223 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:40.223 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:40.223 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:40.223 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:40.223 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:40.223 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:40.481 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:40.481 "name": "raid_bdev1", 00:28:40.482 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:40.482 "strip_size_kb": 0, 00:28:40.482 "state": "online", 00:28:40.482 "raid_level": "raid1", 00:28:40.482 "superblock": true, 00:28:40.482 "num_base_bdevs": 2, 00:28:40.482 "num_base_bdevs_discovered": 2, 00:28:40.482 "num_base_bdevs_operational": 2, 00:28:40.482 "process": { 00:28:40.482 "type": "rebuild", 00:28:40.482 
"target": "spare", 00:28:40.482 "progress": { 00:28:40.482 "blocks": 3072, 00:28:40.482 "percent": 38 00:28:40.482 } 00:28:40.482 }, 00:28:40.482 "base_bdevs_list": [ 00:28:40.482 { 00:28:40.482 "name": "spare", 00:28:40.482 "uuid": "edad62f5-6e9a-5a56-a45e-61d77f9bdc58", 00:28:40.482 "is_configured": true, 00:28:40.482 "data_offset": 256, 00:28:40.482 "data_size": 7936 00:28:40.482 }, 00:28:40.482 { 00:28:40.482 "name": "BaseBdev2", 00:28:40.482 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:40.482 "is_configured": true, 00:28:40.482 "data_offset": 256, 00:28:40.482 "data_size": 7936 00:28:40.482 } 00:28:40.482 ] 00:28:40.482 }' 00:28:40.482 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:40.740 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:40.740 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:40.740 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:40.740 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:40.997 [2024-07-16 00:23:27.708956] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:40.997 [2024-07-16 00:23:27.727949] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:40.997 [2024-07-16 00:23:27.727991] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:40.997 [2024-07-16 00:23:27.728006] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:40.997 [2024-07-16 00:23:27.728015] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 
00:28:40.997 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:40.997 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:40.997 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:40.997 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:40.997 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:40.997 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:40.997 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:40.997 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:40.998 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:40.998 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:40.998 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:40.998 00:23:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:41.255 00:23:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:41.255 "name": "raid_bdev1", 00:28:41.255 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:41.255 "strip_size_kb": 0, 00:28:41.255 "state": "online", 00:28:41.255 "raid_level": "raid1", 00:28:41.255 "superblock": true, 00:28:41.255 "num_base_bdevs": 2, 00:28:41.255 "num_base_bdevs_discovered": 1, 
00:28:41.255 "num_base_bdevs_operational": 1, 00:28:41.255 "base_bdevs_list": [ 00:28:41.255 { 00:28:41.255 "name": null, 00:28:41.255 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:41.255 "is_configured": false, 00:28:41.255 "data_offset": 256, 00:28:41.255 "data_size": 7936 00:28:41.255 }, 00:28:41.255 { 00:28:41.255 "name": "BaseBdev2", 00:28:41.255 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:41.255 "is_configured": true, 00:28:41.255 "data_offset": 256, 00:28:41.255 "data_size": 7936 00:28:41.255 } 00:28:41.255 ] 00:28:41.255 }' 00:28:41.255 00:23:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:41.255 00:23:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:41.850 00:23:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:42.108 [2024-07-16 00:23:28.834047] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:42.108 [2024-07-16 00:23:28.834096] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:42.108 [2024-07-16 00:23:28.834122] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10fe570 00:28:42.108 [2024-07-16 00:23:28.834134] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:42.108 [2024-07-16 00:23:28.834348] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:42.108 [2024-07-16 00:23:28.834365] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:42.108 [2024-07-16 00:23:28.834423] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:42.108 [2024-07-16 00:23:28.834435] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) 
smaller than existing raid bdev raid_bdev1 (5) 00:28:42.108 [2024-07-16 00:23:28.834446] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:28:42.108 [2024-07-16 00:23:28.834463] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:42.108 [2024-07-16 00:23:28.836646] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10fe800 00:28:42.108 [2024-07-16 00:23:28.837958] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:42.108 spare 00:28:42.108 00:23:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:28:43.041 00:23:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:43.041 00:23:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:43.041 00:23:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:43.041 00:23:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:43.041 00:23:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:43.041 00:23:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:43.041 00:23:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:43.299 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:43.299 "name": "raid_bdev1", 00:28:43.299 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:43.299 "strip_size_kb": 0, 00:28:43.299 "state": "online", 00:28:43.299 "raid_level": "raid1", 00:28:43.299 "superblock": 
true, 00:28:43.299 "num_base_bdevs": 2, 00:28:43.299 "num_base_bdevs_discovered": 2, 00:28:43.299 "num_base_bdevs_operational": 2, 00:28:43.299 "process": { 00:28:43.299 "type": "rebuild", 00:28:43.299 "target": "spare", 00:28:43.299 "progress": { 00:28:43.299 "blocks": 3072, 00:28:43.299 "percent": 38 00:28:43.299 } 00:28:43.299 }, 00:28:43.299 "base_bdevs_list": [ 00:28:43.299 { 00:28:43.299 "name": "spare", 00:28:43.299 "uuid": "edad62f5-6e9a-5a56-a45e-61d77f9bdc58", 00:28:43.299 "is_configured": true, 00:28:43.299 "data_offset": 256, 00:28:43.299 "data_size": 7936 00:28:43.299 }, 00:28:43.299 { 00:28:43.299 "name": "BaseBdev2", 00:28:43.299 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:43.299 "is_configured": true, 00:28:43.299 "data_offset": 256, 00:28:43.299 "data_size": 7936 00:28:43.299 } 00:28:43.299 ] 00:28:43.299 }' 00:28:43.299 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:43.299 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:43.299 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:43.299 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:43.299 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:43.557 [2024-07-16 00:23:30.439472] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:43.557 [2024-07-16 00:23:30.450280] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:43.557 [2024-07-16 00:23:30.450321] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:43.557 [2024-07-16 00:23:30.450336] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:43.557 [2024-07-16 00:23:30.450345] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:43.557 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:43.557 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:43.557 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:43.557 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:43.557 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:43.557 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:43.557 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:43.557 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:43.557 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:43.557 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:43.557 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:43.557 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:43.815 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:43.815 "name": "raid_bdev1", 00:28:43.815 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 
00:28:43.815 "strip_size_kb": 0, 00:28:43.815 "state": "online", 00:28:43.815 "raid_level": "raid1", 00:28:43.815 "superblock": true, 00:28:43.815 "num_base_bdevs": 2, 00:28:43.815 "num_base_bdevs_discovered": 1, 00:28:43.815 "num_base_bdevs_operational": 1, 00:28:43.815 "base_bdevs_list": [ 00:28:43.815 { 00:28:43.815 "name": null, 00:28:43.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:43.815 "is_configured": false, 00:28:43.815 "data_offset": 256, 00:28:43.815 "data_size": 7936 00:28:43.815 }, 00:28:43.815 { 00:28:43.815 "name": "BaseBdev2", 00:28:43.815 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:43.815 "is_configured": true, 00:28:43.815 "data_offset": 256, 00:28:43.815 "data_size": 7936 00:28:43.815 } 00:28:43.815 ] 00:28:43.815 }' 00:28:43.815 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:43.815 00:23:30 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:44.381 00:23:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:44.381 00:23:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:44.381 00:23:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:44.381 00:23:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:44.381 00:23:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:44.381 00:23:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:44.381 00:23:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:44.659 00:23:31 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:44.659 "name": "raid_bdev1", 00:28:44.659 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:44.659 "strip_size_kb": 0, 00:28:44.659 "state": "online", 00:28:44.659 "raid_level": "raid1", 00:28:44.659 "superblock": true, 00:28:44.659 "num_base_bdevs": 2, 00:28:44.659 "num_base_bdevs_discovered": 1, 00:28:44.659 "num_base_bdevs_operational": 1, 00:28:44.659 "base_bdevs_list": [ 00:28:44.659 { 00:28:44.659 "name": null, 00:28:44.659 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:44.659 "is_configured": false, 00:28:44.659 "data_offset": 256, 00:28:44.659 "data_size": 7936 00:28:44.659 }, 00:28:44.659 { 00:28:44.659 "name": "BaseBdev2", 00:28:44.659 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:44.659 "is_configured": true, 00:28:44.659 "data_offset": 256, 00:28:44.659 "data_size": 7936 00:28:44.659 } 00:28:44.659 ] 00:28:44.659 }' 00:28:44.659 00:23:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:44.659 00:23:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:44.659 00:23:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:44.659 00:23:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:44.659 00:23:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:28:44.917 00:23:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:45.175 [2024-07-16 00:23:31.978039] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on BaseBdev1_malloc 00:28:45.175 [2024-07-16 00:23:31.978086] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:45.175 [2024-07-16 00:23:31.978106] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1096900 00:28:45.175 [2024-07-16 00:23:31.978118] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:45.175 [2024-07-16 00:23:31.978302] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:45.175 [2024-07-16 00:23:31.978318] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:45.175 [2024-07-16 00:23:31.978363] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:28:45.175 [2024-07-16 00:23:31.978373] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:45.175 [2024-07-16 00:23:31.978384] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:45.175 BaseBdev1 00:28:45.175 00:23:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:28:46.106 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:46.106 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:46.106 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:46.106 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:46.106 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:46.106 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:46.106 
00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:46.106 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:46.106 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:46.106 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:46.106 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:46.106 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:46.364 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:46.364 "name": "raid_bdev1", 00:28:46.364 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:46.364 "strip_size_kb": 0, 00:28:46.364 "state": "online", 00:28:46.364 "raid_level": "raid1", 00:28:46.364 "superblock": true, 00:28:46.364 "num_base_bdevs": 2, 00:28:46.364 "num_base_bdevs_discovered": 1, 00:28:46.364 "num_base_bdevs_operational": 1, 00:28:46.364 "base_bdevs_list": [ 00:28:46.364 { 00:28:46.364 "name": null, 00:28:46.364 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:46.364 "is_configured": false, 00:28:46.364 "data_offset": 256, 00:28:46.364 "data_size": 7936 00:28:46.364 }, 00:28:46.364 { 00:28:46.364 "name": "BaseBdev2", 00:28:46.364 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:46.365 "is_configured": true, 00:28:46.365 "data_offset": 256, 00:28:46.365 "data_size": 7936 00:28:46.365 } 00:28:46.365 ] 00:28:46.365 }' 00:28:46.365 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:46.365 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 
00:28:46.930 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:46.931 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:46.931 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:46.931 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:46.931 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:46.931 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:46.931 00:23:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:47.188 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:47.188 "name": "raid_bdev1", 00:28:47.188 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:47.188 "strip_size_kb": 0, 00:28:47.188 "state": "online", 00:28:47.188 "raid_level": "raid1", 00:28:47.188 "superblock": true, 00:28:47.188 "num_base_bdevs": 2, 00:28:47.188 "num_base_bdevs_discovered": 1, 00:28:47.188 "num_base_bdevs_operational": 1, 00:28:47.188 "base_bdevs_list": [ 00:28:47.188 { 00:28:47.188 "name": null, 00:28:47.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:47.188 "is_configured": false, 00:28:47.188 "data_offset": 256, 00:28:47.188 "data_size": 7936 00:28:47.188 }, 00:28:47.188 { 00:28:47.188 "name": "BaseBdev2", 00:28:47.188 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:47.188 "is_configured": true, 00:28:47.188 "data_offset": 256, 00:28:47.188 "data_size": 7936 00:28:47.188 } 00:28:47.188 ] 00:28:47.188 }' 00:28:47.188 00:23:34 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:47.446 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:47.446 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:47.446 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:47.446 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:47.446 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:28:47.446 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:47.446 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:47.446 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:47.446 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:47.446 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:47.446 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:47.446 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:28:47.446 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:47.446 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:47.446 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:47.703 [2024-07-16 00:23:34.432567] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:47.703 [2024-07-16 00:23:34.432691] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:47.703 [2024-07-16 00:23:34.432707] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:47.703 request: 00:28:47.703 { 00:28:47.703 "base_bdev": "BaseBdev1", 00:28:47.703 "raid_bdev": "raid_bdev1", 00:28:47.703 "method": "bdev_raid_add_base_bdev", 00:28:47.703 "req_id": 1 00:28:47.703 } 00:28:47.703 Got JSON-RPC error response 00:28:47.703 response: 00:28:47.703 { 00:28:47.703 "code": -22, 00:28:47.703 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:28:47.703 } 00:28:47.703 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:28:47.703 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:47.703 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:47.703 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:47.703 00:23:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # 
sleep 1 00:28:48.640 00:23:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:48.640 00:23:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:48.640 00:23:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:48.640 00:23:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:48.640 00:23:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:48.640 00:23:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:48.640 00:23:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:48.640 00:23:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:48.640 00:23:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:48.640 00:23:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:48.640 00:23:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:48.640 00:23:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:48.898 00:23:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:48.898 "name": "raid_bdev1", 00:28:48.898 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:48.898 "strip_size_kb": 0, 00:28:48.898 "state": "online", 00:28:48.898 "raid_level": "raid1", 00:28:48.898 "superblock": true, 00:28:48.898 "num_base_bdevs": 2, 00:28:48.898 "num_base_bdevs_discovered": 1, 
00:28:48.898 "num_base_bdevs_operational": 1, 00:28:48.898 "base_bdevs_list": [ 00:28:48.898 { 00:28:48.898 "name": null, 00:28:48.898 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:48.898 "is_configured": false, 00:28:48.898 "data_offset": 256, 00:28:48.898 "data_size": 7936 00:28:48.898 }, 00:28:48.898 { 00:28:48.898 "name": "BaseBdev2", 00:28:48.898 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:48.898 "is_configured": true, 00:28:48.898 "data_offset": 256, 00:28:48.898 "data_size": 7936 00:28:48.898 } 00:28:48.898 ] 00:28:48.898 }' 00:28:48.898 00:23:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:48.898 00:23:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:49.461 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:49.461 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:49.461 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:49.461 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:49.461 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:49.462 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:49.462 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:49.718 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:49.718 "name": "raid_bdev1", 00:28:49.718 "uuid": "1bb33cb6-447e-4849-9347-dc6712bfe8bf", 00:28:49.718 "strip_size_kb": 0, 00:28:49.718 
"state": "online", 00:28:49.718 "raid_level": "raid1", 00:28:49.718 "superblock": true, 00:28:49.718 "num_base_bdevs": 2, 00:28:49.718 "num_base_bdevs_discovered": 1, 00:28:49.718 "num_base_bdevs_operational": 1, 00:28:49.718 "base_bdevs_list": [ 00:28:49.718 { 00:28:49.718 "name": null, 00:28:49.718 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:49.718 "is_configured": false, 00:28:49.718 "data_offset": 256, 00:28:49.718 "data_size": 7936 00:28:49.718 }, 00:28:49.718 { 00:28:49.718 "name": "BaseBdev2", 00:28:49.718 "uuid": "b834f001-bfdc-5c74-a6a3-312839e896b8", 00:28:49.718 "is_configured": true, 00:28:49.718 "data_offset": 256, 00:28:49.718 "data_size": 7936 00:28:49.718 } 00:28:49.718 ] 00:28:49.718 }' 00:28:49.718 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:49.718 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:49.718 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:49.719 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:49.719 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 3643304 00:28:49.719 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 3643304 ']' 00:28:49.719 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 3643304 00:28:49.719 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:28:49.719 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:49.719 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3643304 00:28:49.977 00:23:36 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:49.977 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:49.977 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3643304' 00:28:49.977 killing process with pid 3643304 00:28:49.977 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 3643304 00:28:49.977 Received shutdown signal, test time was about 60.000000 seconds 00:28:49.977 00:28:49.977 Latency(us) 00:28:49.977 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:49.977 =================================================================================================================== 00:28:49.977 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:28:49.977 [2024-07-16 00:23:36.679336] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:49.977 [2024-07-16 00:23:36.679423] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:49.977 [2024-07-16 00:23:36.679465] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:49.977 [2024-07-16 00:23:36.679478] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10fc620 name raid_bdev1, state offline 00:28:49.977 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 3643304 00:28:49.977 [2024-07-16 00:23:36.715957] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:50.235 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:28:50.235 00:28:50.235 real 0m33.025s 00:28:50.235 user 0m51.909s 00:28:50.235 sys 0m5.360s 00:28:50.235 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:28:50.235 00:23:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:50.235 ************************************ 00:28:50.235 END TEST raid_rebuild_test_sb_md_separate 00:28:50.236 ************************************ 00:28:50.236 00:23:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:50.236 00:23:36 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:28:50.236 00:23:36 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:28:50.236 00:23:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:28:50.236 00:23:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:50.236 00:23:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:50.236 ************************************ 00:28:50.236 START TEST raid_state_function_test_sb_md_interleaved 00:28:50.236 ************************************ 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:28:50.236 00:23:37 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=3647968 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3647968' 00:28:50.236 Process raid pid: 3647968 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 3647968 /var/tmp/spdk-raid.sock 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 3647968 ']' 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:50.236 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:50.236 00:23:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:50.236 [2024-07-16 00:23:37.099611] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:28:50.236 [2024-07-16 00:23:37.099682] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:50.545 [2024-07-16 00:23:37.227458] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:50.545 [2024-07-16 00:23:37.330875] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:50.545 [2024-07-16 00:23:37.383875] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:50.545 [2024-07-16 00:23:37.383905] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:51.125 00:23:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:51.125 00:23:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:28:51.125 00:23:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:51.381 [2024-07-16 00:23:38.303193] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:51.381 [2024-07-16 00:23:38.303241] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:51.381 [2024-07-16 00:23:38.303252] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:51.381 [2024-07-16 00:23:38.303265] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:51.381 00:23:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:51.381 00:23:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:51.381 00:23:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:51.381 00:23:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:51.381 00:23:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:51.381 00:23:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:51.381 00:23:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:51.381 00:23:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:51.381 00:23:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:51.381 00:23:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:51.638 00:23:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:51.638 00:23:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:51.638 00:23:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:51.638 "name": "Existed_Raid", 00:28:51.638 "uuid": "f5332297-5066-498b-bf03-fa15b60fc50e", 00:28:51.638 "strip_size_kb": 0, 00:28:51.638 "state": "configuring", 00:28:51.638 "raid_level": "raid1", 00:28:51.638 "superblock": true, 00:28:51.638 "num_base_bdevs": 2, 00:28:51.638 "num_base_bdevs_discovered": 0, 00:28:51.638 "num_base_bdevs_operational": 2, 00:28:51.638 "base_bdevs_list": [ 00:28:51.638 { 
00:28:51.638 "name": "BaseBdev1", 00:28:51.638 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:51.638 "is_configured": false, 00:28:51.638 "data_offset": 0, 00:28:51.638 "data_size": 0 00:28:51.638 }, 00:28:51.638 { 00:28:51.638 "name": "BaseBdev2", 00:28:51.638 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:51.638 "is_configured": false, 00:28:51.638 "data_offset": 0, 00:28:51.638 "data_size": 0 00:28:51.638 } 00:28:51.638 ] 00:28:51.638 }' 00:28:51.638 00:23:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:51.638 00:23:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:52.571 00:23:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:52.571 [2024-07-16 00:23:39.409980] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:52.571 [2024-07-16 00:23:39.410016] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2315a80 name Existed_Raid, state configuring 00:28:52.571 00:23:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:52.829 [2024-07-16 00:23:39.650633] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:52.829 [2024-07-16 00:23:39.650670] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:52.829 [2024-07-16 00:23:39.650680] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:52.829 [2024-07-16 00:23:39.650692] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:52.829 
00:23:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:28:53.099 [2024-07-16 00:23:39.901329] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:53.099 BaseBdev1 00:28:53.099 00:23:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:28:53.099 00:23:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:28:53.099 00:23:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:53.099 00:23:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:28:53.099 00:23:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:53.099 00:23:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:53.099 00:23:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:53.356 00:23:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:28:53.614 [ 00:28:53.614 { 00:28:53.614 "name": "BaseBdev1", 00:28:53.614 "aliases": [ 00:28:53.614 "503e334d-96df-4322-be90-c495b87dec80" 00:28:53.614 ], 00:28:53.614 "product_name": "Malloc disk", 00:28:53.614 "block_size": 4128, 00:28:53.614 "num_blocks": 8192, 00:28:53.614 "uuid": "503e334d-96df-4322-be90-c495b87dec80", 00:28:53.614 "md_size": 32, 00:28:53.614 
"md_interleave": true, 00:28:53.614 "dif_type": 0, 00:28:53.614 "assigned_rate_limits": { 00:28:53.614 "rw_ios_per_sec": 0, 00:28:53.614 "rw_mbytes_per_sec": 0, 00:28:53.614 "r_mbytes_per_sec": 0, 00:28:53.614 "w_mbytes_per_sec": 0 00:28:53.614 }, 00:28:53.614 "claimed": true, 00:28:53.614 "claim_type": "exclusive_write", 00:28:53.614 "zoned": false, 00:28:53.614 "supported_io_types": { 00:28:53.614 "read": true, 00:28:53.614 "write": true, 00:28:53.614 "unmap": true, 00:28:53.614 "flush": true, 00:28:53.614 "reset": true, 00:28:53.614 "nvme_admin": false, 00:28:53.614 "nvme_io": false, 00:28:53.614 "nvme_io_md": false, 00:28:53.614 "write_zeroes": true, 00:28:53.614 "zcopy": true, 00:28:53.614 "get_zone_info": false, 00:28:53.614 "zone_management": false, 00:28:53.614 "zone_append": false, 00:28:53.614 "compare": false, 00:28:53.614 "compare_and_write": false, 00:28:53.614 "abort": true, 00:28:53.614 "seek_hole": false, 00:28:53.614 "seek_data": false, 00:28:53.614 "copy": true, 00:28:53.614 "nvme_iov_md": false 00:28:53.614 }, 00:28:53.614 "memory_domains": [ 00:28:53.614 { 00:28:53.614 "dma_device_id": "system", 00:28:53.614 "dma_device_type": 1 00:28:53.614 }, 00:28:53.614 { 00:28:53.614 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:53.614 "dma_device_type": 2 00:28:53.614 } 00:28:53.614 ], 00:28:53.614 "driver_specific": {} 00:28:53.614 } 00:28:53.614 ] 00:28:53.614 00:23:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:28:53.614 00:23:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:53.614 00:23:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:53.614 00:23:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:53.614 00:23:40 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:53.614 00:23:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:53.614 00:23:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:53.614 00:23:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:53.614 00:23:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:53.614 00:23:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:53.614 00:23:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:53.614 00:23:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:53.614 00:23:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:53.873 00:23:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:53.873 "name": "Existed_Raid", 00:28:53.873 "uuid": "d91dd9cb-88ae-4923-a80f-24bcbe4cab73", 00:28:53.873 "strip_size_kb": 0, 00:28:53.873 "state": "configuring", 00:28:53.873 "raid_level": "raid1", 00:28:53.873 "superblock": true, 00:28:53.873 "num_base_bdevs": 2, 00:28:53.873 "num_base_bdevs_discovered": 1, 00:28:53.873 "num_base_bdevs_operational": 2, 00:28:53.873 "base_bdevs_list": [ 00:28:53.873 { 00:28:53.873 "name": "BaseBdev1", 00:28:53.873 "uuid": "503e334d-96df-4322-be90-c495b87dec80", 00:28:53.873 "is_configured": true, 00:28:53.873 "data_offset": 256, 00:28:53.873 "data_size": 7936 00:28:53.873 }, 
00:28:53.873 { 00:28:53.873 "name": "BaseBdev2", 00:28:53.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:53.873 "is_configured": false, 00:28:53.873 "data_offset": 0, 00:28:53.873 "data_size": 0 00:28:53.873 } 00:28:53.873 ] 00:28:53.873 }' 00:28:53.873 00:23:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:53.873 00:23:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:54.439 00:23:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:54.697 [2024-07-16 00:23:41.489571] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:54.697 [2024-07-16 00:23:41.489613] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2315350 name Existed_Raid, state configuring 00:28:54.697 00:23:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:54.955 [2024-07-16 00:23:41.730243] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:54.955 [2024-07-16 00:23:41.731719] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:54.955 [2024-07-16 00:23:41.731754] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:54.955 00:23:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:28:54.955 00:23:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:54.955 00:23:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:54.955 00:23:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:54.955 00:23:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:54.955 00:23:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:54.955 00:23:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:54.955 00:23:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:54.955 00:23:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:54.955 00:23:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:54.955 00:23:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:54.955 00:23:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:54.955 00:23:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:54.956 00:23:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:55.213 00:23:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:55.213 "name": "Existed_Raid", 00:28:55.213 "uuid": "187781d8-5984-4ef8-8ee0-71a41cdeedb6", 00:28:55.213 "strip_size_kb": 0, 00:28:55.213 "state": "configuring", 00:28:55.213 "raid_level": "raid1", 00:28:55.213 "superblock": true, 00:28:55.213 "num_base_bdevs": 2, 
00:28:55.213 "num_base_bdevs_discovered": 1, 00:28:55.213 "num_base_bdevs_operational": 2, 00:28:55.213 "base_bdevs_list": [ 00:28:55.213 { 00:28:55.213 "name": "BaseBdev1", 00:28:55.213 "uuid": "503e334d-96df-4322-be90-c495b87dec80", 00:28:55.213 "is_configured": true, 00:28:55.213 "data_offset": 256, 00:28:55.213 "data_size": 7936 00:28:55.213 }, 00:28:55.213 { 00:28:55.213 "name": "BaseBdev2", 00:28:55.213 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:55.213 "is_configured": false, 00:28:55.213 "data_offset": 0, 00:28:55.213 "data_size": 0 00:28:55.213 } 00:28:55.213 ] 00:28:55.213 }' 00:28:55.213 00:23:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:55.213 00:23:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:55.777 00:23:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:28:56.034 [2024-07-16 00:23:42.780654] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:56.034 [2024-07-16 00:23:42.780792] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2317180 00:28:56.034 [2024-07-16 00:23:42.780805] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:56.034 [2024-07-16 00:23:42.780866] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2317150 00:28:56.034 [2024-07-16 00:23:42.780962] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2317180 00:28:56.034 [2024-07-16 00:23:42.780973] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2317180 00:28:56.034 [2024-07-16 00:23:42.781028] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:56.034 BaseBdev2 
00:28:56.034 00:23:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:28:56.034 00:23:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:28:56.034 00:23:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:56.034 00:23:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:28:56.034 00:23:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:56.034 00:23:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:56.034 00:23:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:56.292 00:23:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:28:56.559 [ 00:28:56.559 { 00:28:56.559 "name": "BaseBdev2", 00:28:56.559 "aliases": [ 00:28:56.559 "9c386ce5-21f4-4f60-a371-17928c54e346" 00:28:56.559 ], 00:28:56.559 "product_name": "Malloc disk", 00:28:56.559 "block_size": 4128, 00:28:56.559 "num_blocks": 8192, 00:28:56.559 "uuid": "9c386ce5-21f4-4f60-a371-17928c54e346", 00:28:56.559 "md_size": 32, 00:28:56.559 "md_interleave": true, 00:28:56.559 "dif_type": 0, 00:28:56.559 "assigned_rate_limits": { 00:28:56.559 "rw_ios_per_sec": 0, 00:28:56.559 "rw_mbytes_per_sec": 0, 00:28:56.559 "r_mbytes_per_sec": 0, 00:28:56.559 "w_mbytes_per_sec": 0 00:28:56.559 }, 00:28:56.559 "claimed": true, 00:28:56.559 "claim_type": "exclusive_write", 00:28:56.559 "zoned": false, 00:28:56.559 "supported_io_types": { 
00:28:56.559 "read": true, 00:28:56.559 "write": true, 00:28:56.559 "unmap": true, 00:28:56.559 "flush": true, 00:28:56.559 "reset": true, 00:28:56.559 "nvme_admin": false, 00:28:56.559 "nvme_io": false, 00:28:56.559 "nvme_io_md": false, 00:28:56.559 "write_zeroes": true, 00:28:56.559 "zcopy": true, 00:28:56.559 "get_zone_info": false, 00:28:56.559 "zone_management": false, 00:28:56.559 "zone_append": false, 00:28:56.559 "compare": false, 00:28:56.559 "compare_and_write": false, 00:28:56.559 "abort": true, 00:28:56.559 "seek_hole": false, 00:28:56.559 "seek_data": false, 00:28:56.559 "copy": true, 00:28:56.559 "nvme_iov_md": false 00:28:56.559 }, 00:28:56.559 "memory_domains": [ 00:28:56.559 { 00:28:56.559 "dma_device_id": "system", 00:28:56.559 "dma_device_type": 1 00:28:56.559 }, 00:28:56.559 { 00:28:56.559 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:56.559 "dma_device_type": 2 00:28:56.559 } 00:28:56.559 ], 00:28:56.559 "driver_specific": {} 00:28:56.559 } 00:28:56.559 ] 00:28:56.559 00:23:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:28:56.559 00:23:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:56.559 00:23:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:56.559 00:23:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:28:56.559 00:23:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:56.559 00:23:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:56.559 00:23:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:56.559 00:23:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:56.559 00:23:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:56.559 00:23:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:56.559 00:23:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:56.559 00:23:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:56.559 00:23:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:56.559 00:23:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:56.559 00:23:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:56.817 00:23:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:56.817 "name": "Existed_Raid", 00:28:56.817 "uuid": "187781d8-5984-4ef8-8ee0-71a41cdeedb6", 00:28:56.817 "strip_size_kb": 0, 00:28:56.817 "state": "online", 00:28:56.817 "raid_level": "raid1", 00:28:56.817 "superblock": true, 00:28:56.817 "num_base_bdevs": 2, 00:28:56.817 "num_base_bdevs_discovered": 2, 00:28:56.817 "num_base_bdevs_operational": 2, 00:28:56.817 "base_bdevs_list": [ 00:28:56.817 { 00:28:56.817 "name": "BaseBdev1", 00:28:56.818 "uuid": "503e334d-96df-4322-be90-c495b87dec80", 00:28:56.818 "is_configured": true, 00:28:56.818 "data_offset": 256, 00:28:56.818 "data_size": 7936 00:28:56.818 }, 00:28:56.818 { 00:28:56.818 "name": "BaseBdev2", 00:28:56.818 "uuid": "9c386ce5-21f4-4f60-a371-17928c54e346", 00:28:56.818 "is_configured": true, 00:28:56.818 "data_offset": 256, 00:28:56.818 
"data_size": 7936 00:28:56.818 } 00:28:56.818 ] 00:28:56.818 }' 00:28:56.818 00:23:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:56.818 00:23:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:57.384 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:28:57.384 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:28:57.384 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:57.384 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:57.384 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:57.384 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:28:57.384 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:57.384 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:28:57.652 [2024-07-16 00:23:44.385230] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:57.652 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:57.652 "name": "Existed_Raid", 00:28:57.652 "aliases": [ 00:28:57.652 "187781d8-5984-4ef8-8ee0-71a41cdeedb6" 00:28:57.652 ], 00:28:57.652 "product_name": "Raid Volume", 00:28:57.652 "block_size": 4128, 00:28:57.652 "num_blocks": 7936, 00:28:57.652 "uuid": "187781d8-5984-4ef8-8ee0-71a41cdeedb6", 00:28:57.652 "md_size": 32, 
00:28:57.652 "md_interleave": true, 00:28:57.652 "dif_type": 0, 00:28:57.652 "assigned_rate_limits": { 00:28:57.652 "rw_ios_per_sec": 0, 00:28:57.652 "rw_mbytes_per_sec": 0, 00:28:57.652 "r_mbytes_per_sec": 0, 00:28:57.652 "w_mbytes_per_sec": 0 00:28:57.652 }, 00:28:57.652 "claimed": false, 00:28:57.652 "zoned": false, 00:28:57.652 "supported_io_types": { 00:28:57.652 "read": true, 00:28:57.652 "write": true, 00:28:57.652 "unmap": false, 00:28:57.652 "flush": false, 00:28:57.652 "reset": true, 00:28:57.652 "nvme_admin": false, 00:28:57.652 "nvme_io": false, 00:28:57.652 "nvme_io_md": false, 00:28:57.652 "write_zeroes": true, 00:28:57.652 "zcopy": false, 00:28:57.652 "get_zone_info": false, 00:28:57.652 "zone_management": false, 00:28:57.652 "zone_append": false, 00:28:57.652 "compare": false, 00:28:57.652 "compare_and_write": false, 00:28:57.652 "abort": false, 00:28:57.652 "seek_hole": false, 00:28:57.652 "seek_data": false, 00:28:57.652 "copy": false, 00:28:57.652 "nvme_iov_md": false 00:28:57.652 }, 00:28:57.652 "memory_domains": [ 00:28:57.652 { 00:28:57.652 "dma_device_id": "system", 00:28:57.652 "dma_device_type": 1 00:28:57.652 }, 00:28:57.652 { 00:28:57.652 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:57.652 "dma_device_type": 2 00:28:57.652 }, 00:28:57.652 { 00:28:57.652 "dma_device_id": "system", 00:28:57.652 "dma_device_type": 1 00:28:57.652 }, 00:28:57.652 { 00:28:57.652 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:57.652 "dma_device_type": 2 00:28:57.652 } 00:28:57.652 ], 00:28:57.652 "driver_specific": { 00:28:57.652 "raid": { 00:28:57.652 "uuid": "187781d8-5984-4ef8-8ee0-71a41cdeedb6", 00:28:57.652 "strip_size_kb": 0, 00:28:57.652 "state": "online", 00:28:57.652 "raid_level": "raid1", 00:28:57.652 "superblock": true, 00:28:57.652 "num_base_bdevs": 2, 00:28:57.652 "num_base_bdevs_discovered": 2, 00:28:57.652 "num_base_bdevs_operational": 2, 00:28:57.652 "base_bdevs_list": [ 00:28:57.652 { 00:28:57.652 "name": "BaseBdev1", 00:28:57.652 "uuid": 
"503e334d-96df-4322-be90-c495b87dec80", 00:28:57.652 "is_configured": true, 00:28:57.652 "data_offset": 256, 00:28:57.652 "data_size": 7936 00:28:57.652 }, 00:28:57.652 { 00:28:57.652 "name": "BaseBdev2", 00:28:57.652 "uuid": "9c386ce5-21f4-4f60-a371-17928c54e346", 00:28:57.652 "is_configured": true, 00:28:57.652 "data_offset": 256, 00:28:57.652 "data_size": 7936 00:28:57.652 } 00:28:57.652 ] 00:28:57.652 } 00:28:57.652 } 00:28:57.652 }' 00:28:57.652 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:57.652 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:28:57.652 BaseBdev2' 00:28:57.652 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:57.652 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:28:57.652 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:57.910 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:57.910 "name": "BaseBdev1", 00:28:57.910 "aliases": [ 00:28:57.910 "503e334d-96df-4322-be90-c495b87dec80" 00:28:57.910 ], 00:28:57.910 "product_name": "Malloc disk", 00:28:57.910 "block_size": 4128, 00:28:57.910 "num_blocks": 8192, 00:28:57.910 "uuid": "503e334d-96df-4322-be90-c495b87dec80", 00:28:57.910 "md_size": 32, 00:28:57.910 "md_interleave": true, 00:28:57.910 "dif_type": 0, 00:28:57.910 "assigned_rate_limits": { 00:28:57.910 "rw_ios_per_sec": 0, 00:28:57.910 "rw_mbytes_per_sec": 0, 00:28:57.910 "r_mbytes_per_sec": 0, 00:28:57.910 "w_mbytes_per_sec": 0 00:28:57.910 }, 00:28:57.910 "claimed": 
true, 00:28:57.910 "claim_type": "exclusive_write", 00:28:57.910 "zoned": false, 00:28:57.910 "supported_io_types": { 00:28:57.910 "read": true, 00:28:57.910 "write": true, 00:28:57.910 "unmap": true, 00:28:57.910 "flush": true, 00:28:57.910 "reset": true, 00:28:57.910 "nvme_admin": false, 00:28:57.910 "nvme_io": false, 00:28:57.910 "nvme_io_md": false, 00:28:57.910 "write_zeroes": true, 00:28:57.910 "zcopy": true, 00:28:57.910 "get_zone_info": false, 00:28:57.910 "zone_management": false, 00:28:57.910 "zone_append": false, 00:28:57.910 "compare": false, 00:28:57.910 "compare_and_write": false, 00:28:57.910 "abort": true, 00:28:57.910 "seek_hole": false, 00:28:57.910 "seek_data": false, 00:28:57.910 "copy": true, 00:28:57.910 "nvme_iov_md": false 00:28:57.910 }, 00:28:57.910 "memory_domains": [ 00:28:57.910 { 00:28:57.910 "dma_device_id": "system", 00:28:57.910 "dma_device_type": 1 00:28:57.910 }, 00:28:57.910 { 00:28:57.910 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:57.910 "dma_device_type": 2 00:28:57.910 } 00:28:57.910 ], 00:28:57.910 "driver_specific": {} 00:28:57.910 }' 00:28:57.910 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:57.910 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:57.910 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:57.910 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:57.910 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:58.167 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:58.167 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:58.167 00:23:44 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:58.167 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:58.167 00:23:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:58.167 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:58.167 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:58.167 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:58.168 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:28:58.168 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:58.425 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:58.425 "name": "BaseBdev2", 00:28:58.425 "aliases": [ 00:28:58.425 "9c386ce5-21f4-4f60-a371-17928c54e346" 00:28:58.425 ], 00:28:58.425 "product_name": "Malloc disk", 00:28:58.425 "block_size": 4128, 00:28:58.425 "num_blocks": 8192, 00:28:58.425 "uuid": "9c386ce5-21f4-4f60-a371-17928c54e346", 00:28:58.425 "md_size": 32, 00:28:58.425 "md_interleave": true, 00:28:58.425 "dif_type": 0, 00:28:58.425 "assigned_rate_limits": { 00:28:58.425 "rw_ios_per_sec": 0, 00:28:58.425 "rw_mbytes_per_sec": 0, 00:28:58.425 "r_mbytes_per_sec": 0, 00:28:58.425 "w_mbytes_per_sec": 0 00:28:58.425 }, 00:28:58.425 "claimed": true, 00:28:58.425 "claim_type": "exclusive_write", 00:28:58.425 "zoned": false, 00:28:58.425 "supported_io_types": { 00:28:58.425 "read": true, 00:28:58.425 "write": true, 00:28:58.425 "unmap": true, 00:28:58.425 
"flush": true, 00:28:58.425 "reset": true, 00:28:58.425 "nvme_admin": false, 00:28:58.425 "nvme_io": false, 00:28:58.425 "nvme_io_md": false, 00:28:58.425 "write_zeroes": true, 00:28:58.425 "zcopy": true, 00:28:58.425 "get_zone_info": false, 00:28:58.425 "zone_management": false, 00:28:58.425 "zone_append": false, 00:28:58.425 "compare": false, 00:28:58.425 "compare_and_write": false, 00:28:58.425 "abort": true, 00:28:58.425 "seek_hole": false, 00:28:58.425 "seek_data": false, 00:28:58.425 "copy": true, 00:28:58.425 "nvme_iov_md": false 00:28:58.425 }, 00:28:58.425 "memory_domains": [ 00:28:58.425 { 00:28:58.425 "dma_device_id": "system", 00:28:58.425 "dma_device_type": 1 00:28:58.425 }, 00:28:58.425 { 00:28:58.425 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:58.425 "dma_device_type": 2 00:28:58.425 } 00:28:58.425 ], 00:28:58.425 "driver_specific": {} 00:28:58.425 }' 00:28:58.425 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:58.683 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:58.683 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:58.683 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:58.683 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:58.683 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:58.683 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:58.683 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:58.683 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:58.683 00:23:45 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:58.940 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:58.940 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:58.940 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:28:59.198 [2024-07-16 00:23:45.933101] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:59.198 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:28:59.198 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:28:59.198 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:59.198 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:28:59.198 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:28:59.198 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:28:59.198 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:59.198 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:59.198 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:59.198 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:59.198 00:23:45 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:59.198 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:59.198 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:59.198 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:59.198 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:59.198 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:59.198 00:23:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:59.455 00:23:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:59.455 "name": "Existed_Raid", 00:28:59.455 "uuid": "187781d8-5984-4ef8-8ee0-71a41cdeedb6", 00:28:59.455 "strip_size_kb": 0, 00:28:59.455 "state": "online", 00:28:59.455 "raid_level": "raid1", 00:28:59.455 "superblock": true, 00:28:59.455 "num_base_bdevs": 2, 00:28:59.455 "num_base_bdevs_discovered": 1, 00:28:59.455 "num_base_bdevs_operational": 1, 00:28:59.455 "base_bdevs_list": [ 00:28:59.455 { 00:28:59.455 "name": null, 00:28:59.455 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:59.455 "is_configured": false, 00:28:59.455 "data_offset": 256, 00:28:59.455 "data_size": 7936 00:28:59.455 }, 00:28:59.455 { 00:28:59.455 "name": "BaseBdev2", 00:28:59.455 "uuid": "9c386ce5-21f4-4f60-a371-17928c54e346", 00:28:59.455 "is_configured": true, 00:28:59.455 "data_offset": 256, 00:28:59.455 "data_size": 7936 00:28:59.455 } 00:28:59.455 ] 00:28:59.455 }' 00:28:59.455 
00:23:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:59.455 00:23:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:00.389 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:29:00.389 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:00.389 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:00.389 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:00.389 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:00.389 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:29:00.389 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:29:00.646 [2024-07-16 00:23:47.562516] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:29:00.646 [2024-07-16 00:23:47.562607] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:00.646 [2024-07-16 00:23:47.575515] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:00.646 [2024-07-16 00:23:47.575556] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:00.646 [2024-07-16 00:23:47.575568] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2317180 name Existed_Raid, state offline 00:29:00.646 00:23:47 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:29:00.903 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:00.903 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:00.903 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:29:01.161 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:29:01.161 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:29:01.161 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:29:01.161 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 3647968 00:29:01.161 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 3647968 ']' 00:29:01.161 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 3647968 00:29:01.161 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:29:01.161 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:01.161 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3647968 00:29:01.161 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:01.161 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
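Aside for readers following the trace: after `bdev_malloc_delete BaseBdev1` above, the array stays `online` because raid1 has redundancy, and the removed slot is reported with a null name and an all-zero UUID. The property checks earlier in the log extract the surviving members with `jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'`. A small Python sketch of that filter, using JSON transcribed from this trace (illustrative only; the actual logic is in `bdev/bdev_raid.sh`):

```python
import json

# base_bdevs_list as reported after `bdev_malloc_delete BaseBdev1`:
# the removed slot keeps a null name and a zero UUID.
base_bdevs = json.loads("""
[
  {"name": null, "uuid": "00000000-0000-0000-0000-000000000000",
   "is_configured": false, "data_offset": 256, "data_size": 7936},
  {"name": "BaseBdev2", "uuid": "9c386ce5-21f4-4f60-a371-17928c54e346",
   "is_configured": true, "data_offset": 256, "data_size": 7936}
]
""")

# Equivalent of:
#   jq -r '.base_bdevs_list[] | select(.is_configured == true).name'
configured = [b["name"] for b in base_bdevs if b["is_configured"]]
assert configured == ["BaseBdev2"]

# One configured member left, matching the trace's
# `verify_raid_bdev_state Existed_Raid online raid1 0 1` check.
assert len(configured) == 1
```

This mirrors why the state check after removal expects `num_base_bdevs_discovered` and `num_base_bdevs_operational` to both drop to 1 while `state` remains `online`.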
00:29:01.161 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3647968' 00:29:01.161 killing process with pid 3647968 00:29:01.161 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 3647968 00:29:01.161 [2024-07-16 00:23:47.910417] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:01.161 00:23:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 3647968 00:29:01.161 [2024-07-16 00:23:47.911326] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:01.419 00:23:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:29:01.419 00:29:01.419 real 0m11.104s 00:29:01.419 user 0m19.778s 00:29:01.419 sys 0m2.087s 00:29:01.419 00:23:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:01.419 00:23:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:01.419 ************************************ 00:29:01.419 END TEST raid_state_function_test_sb_md_interleaved 00:29:01.419 ************************************ 00:29:01.419 00:23:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:01.419 00:23:48 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:29:01.419 00:23:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:29:01.419 00:23:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:01.419 00:23:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:01.419 ************************************ 00:29:01.419 START TEST raid_superblock_test_md_interleaved 00:29:01.419 ************************************ 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- 
common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=3649596 00:29:01.419 00:23:48 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 3649596 /var/tmp/spdk-raid.sock 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 3649596 ']' 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:01.419 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:01.419 00:23:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:01.419 [2024-07-16 00:23:48.286590] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:29:01.419 [2024-07-16 00:23:48.286655] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3649596 ] 00:29:01.678 [2024-07-16 00:23:48.406945] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:01.678 [2024-07-16 00:23:48.511182] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:01.678 [2024-07-16 00:23:48.572938] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:01.678 [2024-07-16 00:23:48.572975] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:02.614 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:02.614 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:29:02.614 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:29:02.614 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:02.614 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:29:02.614 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:29:02.614 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:29:02.614 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:29:02.614 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:29:02.614 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 
00:29:02.614 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:29:02.614 malloc1 00:29:02.614 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:02.872 [2024-07-16 00:23:49.736718] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:02.872 [2024-07-16 00:23:49.736767] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:02.872 [2024-07-16 00:23:49.736787] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe8a4e0 00:29:02.872 [2024-07-16 00:23:49.736800] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:02.872 [2024-07-16 00:23:49.738191] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:02.872 [2024-07-16 00:23:49.738218] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:02.872 pt1 00:29:02.872 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:29:02.872 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:02.872 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:29:02.872 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:29:02.872 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:29:02.872 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:29:02.872 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:29:02.872 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:29:02.872 00:23:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:29:03.131 malloc2 00:29:03.131 00:23:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:03.389 [2024-07-16 00:23:50.239060] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:03.389 [2024-07-16 00:23:50.239113] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:03.389 [2024-07-16 00:23:50.239135] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe6f570 00:29:03.389 [2024-07-16 00:23:50.239147] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:03.389 [2024-07-16 00:23:50.240679] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:03.389 [2024-07-16 00:23:50.240707] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:03.389 pt2 00:29:03.389 00:23:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:29:03.389 00:23:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:03.389 00:23:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 
'pt1 pt2' -n raid_bdev1 -s 00:29:03.647 [2024-07-16 00:23:50.483715] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:03.647 [2024-07-16 00:23:50.485135] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:03.647 [2024-07-16 00:23:50.485295] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe70f20 00:29:03.647 [2024-07-16 00:23:50.485308] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:03.647 [2024-07-16 00:23:50.485383] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xced050 00:29:03.647 [2024-07-16 00:23:50.485469] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe70f20 00:29:03.647 [2024-07-16 00:23:50.485479] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe70f20 00:29:03.647 [2024-07-16 00:23:50.485538] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:03.647 00:23:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:03.647 00:23:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:03.647 00:23:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:03.647 00:23:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:03.647 00:23:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:03.647 00:23:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:03.647 00:23:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:03.647 00:23:50 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:03.647 00:23:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:03.647 00:23:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:03.647 00:23:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:03.647 00:23:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:03.905 00:23:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:03.905 "name": "raid_bdev1", 00:29:03.905 "uuid": "f6cfd5a1-88bc-41df-bbf9-d15a54a4e37b", 00:29:03.905 "strip_size_kb": 0, 00:29:03.905 "state": "online", 00:29:03.905 "raid_level": "raid1", 00:29:03.905 "superblock": true, 00:29:03.905 "num_base_bdevs": 2, 00:29:03.905 "num_base_bdevs_discovered": 2, 00:29:03.905 "num_base_bdevs_operational": 2, 00:29:03.905 "base_bdevs_list": [ 00:29:03.905 { 00:29:03.905 "name": "pt1", 00:29:03.905 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:03.905 "is_configured": true, 00:29:03.905 "data_offset": 256, 00:29:03.905 "data_size": 7936 00:29:03.905 }, 00:29:03.905 { 00:29:03.905 "name": "pt2", 00:29:03.905 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:03.905 "is_configured": true, 00:29:03.905 "data_offset": 256, 00:29:03.905 "data_size": 7936 00:29:03.905 } 00:29:03.905 ] 00:29:03.905 }' 00:29:03.905 00:23:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:03.905 00:23:50 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:04.469 00:23:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:29:04.469 00:23:51 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:04.469 00:23:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:04.469 00:23:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:04.469 00:23:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:04.469 00:23:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:04.469 00:23:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:04.469 00:23:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:04.726 [2024-07-16 00:23:51.639013] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:04.726 00:23:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:04.726 "name": "raid_bdev1", 00:29:04.726 "aliases": [ 00:29:04.726 "f6cfd5a1-88bc-41df-bbf9-d15a54a4e37b" 00:29:04.726 ], 00:29:04.726 "product_name": "Raid Volume", 00:29:04.726 "block_size": 4128, 00:29:04.726 "num_blocks": 7936, 00:29:04.726 "uuid": "f6cfd5a1-88bc-41df-bbf9-d15a54a4e37b", 00:29:04.726 "md_size": 32, 00:29:04.726 "md_interleave": true, 00:29:04.726 "dif_type": 0, 00:29:04.726 "assigned_rate_limits": { 00:29:04.726 "rw_ios_per_sec": 0, 00:29:04.726 "rw_mbytes_per_sec": 0, 00:29:04.726 "r_mbytes_per_sec": 0, 00:29:04.726 "w_mbytes_per_sec": 0 00:29:04.726 }, 00:29:04.726 "claimed": false, 00:29:04.726 "zoned": false, 00:29:04.726 "supported_io_types": { 00:29:04.726 "read": true, 00:29:04.726 "write": true, 00:29:04.726 "unmap": false, 00:29:04.726 "flush": false, 00:29:04.726 "reset": true, 00:29:04.726 "nvme_admin": false, 
00:29:04.726 "nvme_io": false, 00:29:04.726 "nvme_io_md": false, 00:29:04.726 "write_zeroes": true, 00:29:04.726 "zcopy": false, 00:29:04.726 "get_zone_info": false, 00:29:04.726 "zone_management": false, 00:29:04.726 "zone_append": false, 00:29:04.726 "compare": false, 00:29:04.726 "compare_and_write": false, 00:29:04.726 "abort": false, 00:29:04.726 "seek_hole": false, 00:29:04.726 "seek_data": false, 00:29:04.726 "copy": false, 00:29:04.726 "nvme_iov_md": false 00:29:04.726 }, 00:29:04.726 "memory_domains": [ 00:29:04.726 { 00:29:04.726 "dma_device_id": "system", 00:29:04.726 "dma_device_type": 1 00:29:04.726 }, 00:29:04.726 { 00:29:04.726 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:04.726 "dma_device_type": 2 00:29:04.726 }, 00:29:04.726 { 00:29:04.726 "dma_device_id": "system", 00:29:04.726 "dma_device_type": 1 00:29:04.726 }, 00:29:04.726 { 00:29:04.726 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:04.726 "dma_device_type": 2 00:29:04.726 } 00:29:04.726 ], 00:29:04.726 "driver_specific": { 00:29:04.726 "raid": { 00:29:04.726 "uuid": "f6cfd5a1-88bc-41df-bbf9-d15a54a4e37b", 00:29:04.726 "strip_size_kb": 0, 00:29:04.726 "state": "online", 00:29:04.726 "raid_level": "raid1", 00:29:04.726 "superblock": true, 00:29:04.726 "num_base_bdevs": 2, 00:29:04.726 "num_base_bdevs_discovered": 2, 00:29:04.726 "num_base_bdevs_operational": 2, 00:29:04.726 "base_bdevs_list": [ 00:29:04.726 { 00:29:04.726 "name": "pt1", 00:29:04.726 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:04.726 "is_configured": true, 00:29:04.726 "data_offset": 256, 00:29:04.726 "data_size": 7936 00:29:04.726 }, 00:29:04.726 { 00:29:04.726 "name": "pt2", 00:29:04.726 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:04.726 "is_configured": true, 00:29:04.726 "data_offset": 256, 00:29:04.726 "data_size": 7936 00:29:04.726 } 00:29:04.726 ] 00:29:04.726 } 00:29:04.726 } 00:29:04.726 }' 00:29:04.726 00:23:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:04.984 00:23:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:04.984 pt2' 00:29:04.984 00:23:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:04.984 00:23:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:04.984 00:23:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:05.242 00:23:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:05.242 "name": "pt1", 00:29:05.242 "aliases": [ 00:29:05.242 "00000000-0000-0000-0000-000000000001" 00:29:05.242 ], 00:29:05.242 "product_name": "passthru", 00:29:05.242 "block_size": 4128, 00:29:05.242 "num_blocks": 8192, 00:29:05.242 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:05.242 "md_size": 32, 00:29:05.242 "md_interleave": true, 00:29:05.242 "dif_type": 0, 00:29:05.242 "assigned_rate_limits": { 00:29:05.242 "rw_ios_per_sec": 0, 00:29:05.242 "rw_mbytes_per_sec": 0, 00:29:05.242 "r_mbytes_per_sec": 0, 00:29:05.242 "w_mbytes_per_sec": 0 00:29:05.242 }, 00:29:05.242 "claimed": true, 00:29:05.242 "claim_type": "exclusive_write", 00:29:05.242 "zoned": false, 00:29:05.242 "supported_io_types": { 00:29:05.242 "read": true, 00:29:05.242 "write": true, 00:29:05.242 "unmap": true, 00:29:05.242 "flush": true, 00:29:05.242 "reset": true, 00:29:05.242 "nvme_admin": false, 00:29:05.242 "nvme_io": false, 00:29:05.242 "nvme_io_md": false, 00:29:05.242 "write_zeroes": true, 00:29:05.242 "zcopy": true, 00:29:05.242 "get_zone_info": false, 00:29:05.242 "zone_management": false, 00:29:05.242 "zone_append": false, 00:29:05.242 "compare": false, 00:29:05.242 "compare_and_write": false, 00:29:05.242 
"abort": true, 00:29:05.242 "seek_hole": false, 00:29:05.242 "seek_data": false, 00:29:05.242 "copy": true, 00:29:05.242 "nvme_iov_md": false 00:29:05.242 }, 00:29:05.242 "memory_domains": [ 00:29:05.242 { 00:29:05.242 "dma_device_id": "system", 00:29:05.242 "dma_device_type": 1 00:29:05.242 }, 00:29:05.242 { 00:29:05.242 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:05.242 "dma_device_type": 2 00:29:05.242 } 00:29:05.242 ], 00:29:05.242 "driver_specific": { 00:29:05.242 "passthru": { 00:29:05.242 "name": "pt1", 00:29:05.242 "base_bdev_name": "malloc1" 00:29:05.242 } 00:29:05.242 } 00:29:05.242 }' 00:29:05.242 00:23:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:05.242 00:23:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:05.242 00:23:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:05.242 00:23:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:05.242 00:23:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:05.532 00:23:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:05.532 00:23:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:05.532 00:23:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:05.532 00:23:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:05.532 00:23:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:05.532 00:23:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:05.532 00:23:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:05.532 00:23:52 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:05.532 00:23:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:05.532 00:23:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:06.097 00:23:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:06.097 "name": "pt2", 00:29:06.097 "aliases": [ 00:29:06.097 "00000000-0000-0000-0000-000000000002" 00:29:06.097 ], 00:29:06.097 "product_name": "passthru", 00:29:06.097 "block_size": 4128, 00:29:06.097 "num_blocks": 8192, 00:29:06.097 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:06.097 "md_size": 32, 00:29:06.097 "md_interleave": true, 00:29:06.097 "dif_type": 0, 00:29:06.097 "assigned_rate_limits": { 00:29:06.097 "rw_ios_per_sec": 0, 00:29:06.097 "rw_mbytes_per_sec": 0, 00:29:06.097 "r_mbytes_per_sec": 0, 00:29:06.097 "w_mbytes_per_sec": 0 00:29:06.097 }, 00:29:06.097 "claimed": true, 00:29:06.097 "claim_type": "exclusive_write", 00:29:06.097 "zoned": false, 00:29:06.097 "supported_io_types": { 00:29:06.097 "read": true, 00:29:06.097 "write": true, 00:29:06.097 "unmap": true, 00:29:06.097 "flush": true, 00:29:06.097 "reset": true, 00:29:06.097 "nvme_admin": false, 00:29:06.097 "nvme_io": false, 00:29:06.097 "nvme_io_md": false, 00:29:06.097 "write_zeroes": true, 00:29:06.097 "zcopy": true, 00:29:06.097 "get_zone_info": false, 00:29:06.097 "zone_management": false, 00:29:06.097 "zone_append": false, 00:29:06.097 "compare": false, 00:29:06.097 "compare_and_write": false, 00:29:06.097 "abort": true, 00:29:06.097 "seek_hole": false, 00:29:06.097 "seek_data": false, 00:29:06.097 "copy": true, 00:29:06.097 "nvme_iov_md": false 00:29:06.097 }, 00:29:06.097 "memory_domains": [ 00:29:06.097 { 00:29:06.097 "dma_device_id": 
"system", 00:29:06.097 "dma_device_type": 1 00:29:06.097 }, 00:29:06.097 { 00:29:06.097 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:06.097 "dma_device_type": 2 00:29:06.097 } 00:29:06.097 ], 00:29:06.097 "driver_specific": { 00:29:06.097 "passthru": { 00:29:06.097 "name": "pt2", 00:29:06.097 "base_bdev_name": "malloc2" 00:29:06.097 } 00:29:06.097 } 00:29:06.097 }' 00:29:06.097 00:23:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:06.097 00:23:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:06.356 00:23:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:06.356 00:23:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:06.356 00:23:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:06.356 00:23:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:06.356 00:23:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:06.356 00:23:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:06.356 00:23:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:06.356 00:23:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:06.356 00:23:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:06.612 00:23:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:06.612 00:23:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:06.612 00:23:53 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:29:06.612 [2024-07-16 00:23:53.483867] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:06.612 00:23:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=f6cfd5a1-88bc-41df-bbf9-d15a54a4e37b 00:29:06.612 00:23:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z f6cfd5a1-88bc-41df-bbf9-d15a54a4e37b ']' 00:29:06.612 00:23:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:07.176 [2024-07-16 00:23:53.992976] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:07.177 [2024-07-16 00:23:53.992997] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:07.177 [2024-07-16 00:23:53.993050] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:07.177 [2024-07-16 00:23:53.993103] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:07.177 [2024-07-16 00:23:53.993114] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe70f20 name raid_bdev1, state offline 00:29:07.177 00:23:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:07.177 00:23:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:29:07.754 00:23:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:29:07.754 00:23:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:29:07.754 00:23:54 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:29:07.754 00:23:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:29:08.319 00:23:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:29:08.319 00:23:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:08.883 00:23:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:29:08.883 00:23:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:29:09.140 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:29:09.140 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:09.140 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:29:09.140 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:09.140 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:09.140 00:23:56 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:09.140 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:09.140 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:09.140 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:09.140 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:09.140 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:09.140 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:09.140 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:09.396 [2024-07-16 00:23:56.314999] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:29:09.396 [2024-07-16 00:23:56.316414] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:29:09.396 [2024-07-16 00:23:56.316474] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:29:09.396 [2024-07-16 00:23:56.316515] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:29:09.396 [2024-07-16 00:23:56.316533] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:09.396 [2024-07-16 00:23:56.316543] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe7b260 name raid_bdev1, state configuring 00:29:09.396 request: 00:29:09.396 { 00:29:09.396 "name": "raid_bdev1", 00:29:09.396 "raid_level": "raid1", 00:29:09.396 "base_bdevs": [ 00:29:09.396 "malloc1", 00:29:09.396 "malloc2" 00:29:09.396 ], 00:29:09.396 "superblock": false, 00:29:09.396 "method": "bdev_raid_create", 00:29:09.396 "req_id": 1 00:29:09.396 } 00:29:09.396 Got JSON-RPC error response 00:29:09.396 response: 00:29:09.396 { 00:29:09.396 "code": -17, 00:29:09.396 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:29:09.396 } 00:29:09.396 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:29:09.396 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:29:09.396 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:09.396 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:09.396 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:09.396 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:29:09.962 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:29:09.962 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:29:09.962 00:23:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 
00000000-0000-0000-0000-000000000001 00:29:10.219 [2024-07-16 00:23:57.125061] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:10.219 [2024-07-16 00:23:57.125104] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:10.219 [2024-07-16 00:23:57.125121] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe72000 00:29:10.219 [2024-07-16 00:23:57.125134] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:10.219 [2024-07-16 00:23:57.126542] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:10.219 [2024-07-16 00:23:57.126569] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:10.219 [2024-07-16 00:23:57.126616] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:10.219 [2024-07-16 00:23:57.126641] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:10.219 pt1 00:29:10.219 00:23:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:29:10.219 00:23:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:10.219 00:23:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:10.219 00:23:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:10.220 00:23:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:10.220 00:23:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:10.220 00:23:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:10.220 00:23:57 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:10.220 00:23:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:10.220 00:23:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:10.220 00:23:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:10.220 00:23:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:10.785 00:23:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:10.785 "name": "raid_bdev1", 00:29:10.785 "uuid": "f6cfd5a1-88bc-41df-bbf9-d15a54a4e37b", 00:29:10.785 "strip_size_kb": 0, 00:29:10.785 "state": "configuring", 00:29:10.785 "raid_level": "raid1", 00:29:10.785 "superblock": true, 00:29:10.785 "num_base_bdevs": 2, 00:29:10.785 "num_base_bdevs_discovered": 1, 00:29:10.785 "num_base_bdevs_operational": 2, 00:29:10.785 "base_bdevs_list": [ 00:29:10.785 { 00:29:10.785 "name": "pt1", 00:29:10.785 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:10.785 "is_configured": true, 00:29:10.785 "data_offset": 256, 00:29:10.785 "data_size": 7936 00:29:10.785 }, 00:29:10.785 { 00:29:10.785 "name": null, 00:29:10.785 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:10.785 "is_configured": false, 00:29:10.785 "data_offset": 256, 00:29:10.785 "data_size": 7936 00:29:10.785 } 00:29:10.785 ] 00:29:10.785 }' 00:29:10.785 00:23:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:10.785 00:23:57 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:11.718 00:23:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:29:11.718 00:23:58 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:29:11.718 00:23:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:29:11.718 00:23:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:11.976 [2024-07-16 00:23:58.733352] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:11.976 [2024-07-16 00:23:58.733402] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:11.976 [2024-07-16 00:23:58.733428] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe74270 00:29:11.976 [2024-07-16 00:23:58.733442] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:11.976 [2024-07-16 00:23:58.733610] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:11.976 [2024-07-16 00:23:58.733626] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:11.976 [2024-07-16 00:23:58.733668] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:11.976 [2024-07-16 00:23:58.733685] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:11.976 [2024-07-16 00:23:58.733766] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcedc10 00:29:11.976 [2024-07-16 00:23:58.733776] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:11.976 [2024-07-16 00:23:58.733832] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe6fd40 00:29:11.976 [2024-07-16 00:23:58.733905] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcedc10 00:29:11.976 [2024-07-16 00:23:58.733914] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcedc10 00:29:11.976 [2024-07-16 00:23:58.733980] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:11.976 pt2 00:29:11.976 00:23:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:29:11.976 00:23:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:29:11.976 00:23:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:11.976 00:23:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:11.976 00:23:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:11.976 00:23:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:11.976 00:23:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:11.976 00:23:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:11.976 00:23:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:11.976 00:23:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:11.976 00:23:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:11.976 00:23:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:11.976 00:23:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:11.976 00:23:58 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:12.234 00:23:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:12.234 "name": "raid_bdev1", 00:29:12.234 "uuid": "f6cfd5a1-88bc-41df-bbf9-d15a54a4e37b", 00:29:12.234 "strip_size_kb": 0, 00:29:12.234 "state": "online", 00:29:12.234 "raid_level": "raid1", 00:29:12.234 "superblock": true, 00:29:12.234 "num_base_bdevs": 2, 00:29:12.234 "num_base_bdevs_discovered": 2, 00:29:12.234 "num_base_bdevs_operational": 2, 00:29:12.234 "base_bdevs_list": [ 00:29:12.234 { 00:29:12.234 "name": "pt1", 00:29:12.234 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:12.234 "is_configured": true, 00:29:12.234 "data_offset": 256, 00:29:12.234 "data_size": 7936 00:29:12.234 }, 00:29:12.234 { 00:29:12.234 "name": "pt2", 00:29:12.234 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:12.234 "is_configured": true, 00:29:12.234 "data_offset": 256, 00:29:12.234 "data_size": 7936 00:29:12.234 } 00:29:12.234 ] 00:29:12.234 }' 00:29:12.234 00:23:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:12.234 00:23:58 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:12.799 00:23:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:29:12.799 00:23:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:12.799 00:23:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:12.799 00:23:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:12.799 00:23:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:12.799 00:23:59 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:12.799 00:23:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:12.799 00:23:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:12.799 [2024-07-16 00:23:59.700182] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:12.799 00:23:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:12.799 "name": "raid_bdev1", 00:29:12.799 "aliases": [ 00:29:12.799 "f6cfd5a1-88bc-41df-bbf9-d15a54a4e37b" 00:29:12.799 ], 00:29:12.799 "product_name": "Raid Volume", 00:29:12.799 "block_size": 4128, 00:29:12.799 "num_blocks": 7936, 00:29:12.799 "uuid": "f6cfd5a1-88bc-41df-bbf9-d15a54a4e37b", 00:29:12.799 "md_size": 32, 00:29:12.799 "md_interleave": true, 00:29:12.799 "dif_type": 0, 00:29:12.799 "assigned_rate_limits": { 00:29:12.799 "rw_ios_per_sec": 0, 00:29:12.799 "rw_mbytes_per_sec": 0, 00:29:12.799 "r_mbytes_per_sec": 0, 00:29:12.799 "w_mbytes_per_sec": 0 00:29:12.799 }, 00:29:12.800 "claimed": false, 00:29:12.800 "zoned": false, 00:29:12.800 "supported_io_types": { 00:29:12.800 "read": true, 00:29:12.800 "write": true, 00:29:12.800 "unmap": false, 00:29:12.800 "flush": false, 00:29:12.800 "reset": true, 00:29:12.800 "nvme_admin": false, 00:29:12.800 "nvme_io": false, 00:29:12.800 "nvme_io_md": false, 00:29:12.800 "write_zeroes": true, 00:29:12.800 "zcopy": false, 00:29:12.800 "get_zone_info": false, 00:29:12.800 "zone_management": false, 00:29:12.800 "zone_append": false, 00:29:12.800 "compare": false, 00:29:12.800 "compare_and_write": false, 00:29:12.800 "abort": false, 00:29:12.800 "seek_hole": false, 00:29:12.800 "seek_data": false, 00:29:12.800 "copy": false, 00:29:12.800 "nvme_iov_md": false 00:29:12.800 }, 
00:29:12.800 "memory_domains": [ 00:29:12.800 { 00:29:12.800 "dma_device_id": "system", 00:29:12.800 "dma_device_type": 1 00:29:12.800 }, 00:29:12.800 { 00:29:12.800 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:12.800 "dma_device_type": 2 00:29:12.800 }, 00:29:12.800 { 00:29:12.800 "dma_device_id": "system", 00:29:12.800 "dma_device_type": 1 00:29:12.800 }, 00:29:12.800 { 00:29:12.800 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:12.800 "dma_device_type": 2 00:29:12.800 } 00:29:12.800 ], 00:29:12.800 "driver_specific": { 00:29:12.800 "raid": { 00:29:12.800 "uuid": "f6cfd5a1-88bc-41df-bbf9-d15a54a4e37b", 00:29:12.800 "strip_size_kb": 0, 00:29:12.800 "state": "online", 00:29:12.800 "raid_level": "raid1", 00:29:12.800 "superblock": true, 00:29:12.800 "num_base_bdevs": 2, 00:29:12.800 "num_base_bdevs_discovered": 2, 00:29:12.800 "num_base_bdevs_operational": 2, 00:29:12.800 "base_bdevs_list": [ 00:29:12.800 { 00:29:12.800 "name": "pt1", 00:29:12.800 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:12.800 "is_configured": true, 00:29:12.800 "data_offset": 256, 00:29:12.800 "data_size": 7936 00:29:12.800 }, 00:29:12.800 { 00:29:12.800 "name": "pt2", 00:29:12.800 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:12.800 "is_configured": true, 00:29:12.800 "data_offset": 256, 00:29:12.800 "data_size": 7936 00:29:12.800 } 00:29:12.800 ] 00:29:12.800 } 00:29:12.800 } 00:29:12.800 }' 00:29:12.800 00:23:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:13.057 00:23:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:13.057 pt2' 00:29:13.057 00:23:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:13.057 00:23:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:13.057 00:23:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:13.314 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:13.314 "name": "pt1", 00:29:13.314 "aliases": [ 00:29:13.314 "00000000-0000-0000-0000-000000000001" 00:29:13.314 ], 00:29:13.314 "product_name": "passthru", 00:29:13.314 "block_size": 4128, 00:29:13.314 "num_blocks": 8192, 00:29:13.314 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:13.314 "md_size": 32, 00:29:13.314 "md_interleave": true, 00:29:13.314 "dif_type": 0, 00:29:13.314 "assigned_rate_limits": { 00:29:13.314 "rw_ios_per_sec": 0, 00:29:13.314 "rw_mbytes_per_sec": 0, 00:29:13.314 "r_mbytes_per_sec": 0, 00:29:13.314 "w_mbytes_per_sec": 0 00:29:13.314 }, 00:29:13.314 "claimed": true, 00:29:13.314 "claim_type": "exclusive_write", 00:29:13.314 "zoned": false, 00:29:13.314 "supported_io_types": { 00:29:13.314 "read": true, 00:29:13.314 "write": true, 00:29:13.315 "unmap": true, 00:29:13.315 "flush": true, 00:29:13.315 "reset": true, 00:29:13.315 "nvme_admin": false, 00:29:13.315 "nvme_io": false, 00:29:13.315 "nvme_io_md": false, 00:29:13.315 "write_zeroes": true, 00:29:13.315 "zcopy": true, 00:29:13.315 "get_zone_info": false, 00:29:13.315 "zone_management": false, 00:29:13.315 "zone_append": false, 00:29:13.315 "compare": false, 00:29:13.315 "compare_and_write": false, 00:29:13.315 "abort": true, 00:29:13.315 "seek_hole": false, 00:29:13.315 "seek_data": false, 00:29:13.315 "copy": true, 00:29:13.315 "nvme_iov_md": false 00:29:13.315 }, 00:29:13.315 "memory_domains": [ 00:29:13.315 { 00:29:13.315 "dma_device_id": "system", 00:29:13.315 "dma_device_type": 1 00:29:13.315 }, 00:29:13.315 { 00:29:13.315 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:13.315 "dma_device_type": 2 00:29:13.315 } 00:29:13.315 ], 00:29:13.315 
"driver_specific": { 00:29:13.315 "passthru": { 00:29:13.315 "name": "pt1", 00:29:13.315 "base_bdev_name": "malloc1" 00:29:13.315 } 00:29:13.315 } 00:29:13.315 }' 00:29:13.315 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:13.315 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:13.315 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:13.315 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:13.315 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:13.315 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:13.315 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:13.315 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:13.573 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:13.573 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:13.573 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:13.573 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:13.573 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:13.573 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:13.573 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:13.832 00:24:00 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:13.832 "name": "pt2", 00:29:13.832 "aliases": [ 00:29:13.832 "00000000-0000-0000-0000-000000000002" 00:29:13.832 ], 00:29:13.832 "product_name": "passthru", 00:29:13.832 "block_size": 4128, 00:29:13.832 "num_blocks": 8192, 00:29:13.832 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:13.832 "md_size": 32, 00:29:13.832 "md_interleave": true, 00:29:13.832 "dif_type": 0, 00:29:13.832 "assigned_rate_limits": { 00:29:13.832 "rw_ios_per_sec": 0, 00:29:13.832 "rw_mbytes_per_sec": 0, 00:29:13.832 "r_mbytes_per_sec": 0, 00:29:13.832 "w_mbytes_per_sec": 0 00:29:13.832 }, 00:29:13.832 "claimed": true, 00:29:13.832 "claim_type": "exclusive_write", 00:29:13.832 "zoned": false, 00:29:13.832 "supported_io_types": { 00:29:13.832 "read": true, 00:29:13.832 "write": true, 00:29:13.832 "unmap": true, 00:29:13.832 "flush": true, 00:29:13.832 "reset": true, 00:29:13.832 "nvme_admin": false, 00:29:13.832 "nvme_io": false, 00:29:13.832 "nvme_io_md": false, 00:29:13.832 "write_zeroes": true, 00:29:13.832 "zcopy": true, 00:29:13.832 "get_zone_info": false, 00:29:13.832 "zone_management": false, 00:29:13.832 "zone_append": false, 00:29:13.832 "compare": false, 00:29:13.832 "compare_and_write": false, 00:29:13.832 "abort": true, 00:29:13.832 "seek_hole": false, 00:29:13.832 "seek_data": false, 00:29:13.832 "copy": true, 00:29:13.832 "nvme_iov_md": false 00:29:13.832 }, 00:29:13.832 "memory_domains": [ 00:29:13.832 { 00:29:13.832 "dma_device_id": "system", 00:29:13.832 "dma_device_type": 1 00:29:13.832 }, 00:29:13.832 { 00:29:13.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:13.832 "dma_device_type": 2 00:29:13.832 } 00:29:13.832 ], 00:29:13.832 "driver_specific": { 00:29:13.832 "passthru": { 00:29:13.832 "name": "pt2", 00:29:13.832 "base_bdev_name": "malloc2" 00:29:13.832 } 00:29:13.832 } 00:29:13.832 }' 00:29:13.832 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:13.832 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:13.832 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:13.832 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:13.832 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:14.090 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:14.090 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:14.090 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:14.090 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:14.090 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:14.090 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:14.090 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:14.090 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:14.090 00:24:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:29:14.349 [2024-07-16 00:24:01.220250] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:14.349 00:24:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' f6cfd5a1-88bc-41df-bbf9-d15a54a4e37b '!=' f6cfd5a1-88bc-41df-bbf9-d15a54a4e37b ']' 00:29:14.349 00:24:01 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:29:14.349 00:24:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:14.349 00:24:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:29:14.349 00:24:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:29:14.914 [2024-07-16 00:24:01.725349] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:29:14.914 00:24:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:14.914 00:24:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:14.914 00:24:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:14.914 00:24:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:14.914 00:24:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:14.914 00:24:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:14.914 00:24:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:14.914 00:24:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:14.914 00:24:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:14.914 00:24:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:14.914 00:24:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:14.914 00:24:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:15.171 00:24:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:15.171 "name": "raid_bdev1", 00:29:15.171 "uuid": "f6cfd5a1-88bc-41df-bbf9-d15a54a4e37b", 00:29:15.171 "strip_size_kb": 0, 00:29:15.171 "state": "online", 00:29:15.171 "raid_level": "raid1", 00:29:15.172 "superblock": true, 00:29:15.172 "num_base_bdevs": 2, 00:29:15.172 "num_base_bdevs_discovered": 1, 00:29:15.172 "num_base_bdevs_operational": 1, 00:29:15.172 "base_bdevs_list": [ 00:29:15.172 { 00:29:15.172 "name": null, 00:29:15.172 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:15.172 "is_configured": false, 00:29:15.172 "data_offset": 256, 00:29:15.172 "data_size": 7936 00:29:15.172 }, 00:29:15.172 { 00:29:15.172 "name": "pt2", 00:29:15.172 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:15.172 "is_configured": true, 00:29:15.172 "data_offset": 256, 00:29:15.172 "data_size": 7936 00:29:15.172 } 00:29:15.172 ] 00:29:15.172 }' 00:29:15.172 00:24:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:15.172 00:24:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:15.736 00:24:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:16.301 [2024-07-16 00:24:03.100975] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:16.301 [2024-07-16 00:24:03.101006] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:16.301 [2024-07-16 00:24:03.101069] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:16.301 [2024-07-16 
00:24:03.101113] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:16.301 [2024-07-16 00:24:03.101125] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcedc10 name raid_bdev1, state offline 00:29:16.301 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:16.301 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:29:16.559 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:29:16.559 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:29:16.559 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:29:16.559 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:29:16.559 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:16.817 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:29:16.817 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:29:16.817 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:29:16.817 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:29:16.817 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:29:16.817 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:17.075 [2024-07-16 00:24:03.838879] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:17.075 [2024-07-16 00:24:03.838940] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:17.075 [2024-07-16 00:24:03.838961] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe729f0 00:29:17.075 [2024-07-16 00:24:03.838973] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:17.075 [2024-07-16 00:24:03.840399] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:17.075 [2024-07-16 00:24:03.840427] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:17.075 [2024-07-16 00:24:03.840477] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:17.075 [2024-07-16 00:24:03.840502] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:17.075 [2024-07-16 00:24:03.840571] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe73ea0 00:29:17.075 [2024-07-16 00:24:03.840582] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:17.075 [2024-07-16 00:24:03.840645] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe71bc0 00:29:17.075 [2024-07-16 00:24:03.840717] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe73ea0 00:29:17.075 [2024-07-16 00:24:03.840727] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe73ea0 00:29:17.075 [2024-07-16 00:24:03.840780] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:17.075 pt2 00:29:17.075 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:17.075 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:17.075 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:17.075 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:17.075 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:17.075 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:17.075 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:17.075 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:17.075 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:17.075 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:17.075 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:17.075 00:24:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:17.332 00:24:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:17.332 "name": "raid_bdev1", 00:29:17.332 "uuid": "f6cfd5a1-88bc-41df-bbf9-d15a54a4e37b", 00:29:17.332 "strip_size_kb": 0, 00:29:17.332 "state": "online", 00:29:17.332 "raid_level": "raid1", 00:29:17.332 "superblock": true, 00:29:17.332 "num_base_bdevs": 2, 00:29:17.332 "num_base_bdevs_discovered": 1, 00:29:17.332 "num_base_bdevs_operational": 1, 00:29:17.332 
"base_bdevs_list": [ 00:29:17.332 { 00:29:17.332 "name": null, 00:29:17.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:17.332 "is_configured": false, 00:29:17.332 "data_offset": 256, 00:29:17.332 "data_size": 7936 00:29:17.332 }, 00:29:17.332 { 00:29:17.332 "name": "pt2", 00:29:17.332 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:17.332 "is_configured": true, 00:29:17.332 "data_offset": 256, 00:29:17.332 "data_size": 7936 00:29:17.332 } 00:29:17.332 ] 00:29:17.332 }' 00:29:17.332 00:24:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:17.332 00:24:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:17.896 00:24:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:18.154 [2024-07-16 00:24:04.933759] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:18.154 [2024-07-16 00:24:04.933790] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:18.154 [2024-07-16 00:24:04.933851] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:18.154 [2024-07-16 00:24:04.933896] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:18.154 [2024-07-16 00:24:04.933908] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe73ea0 name raid_bdev1, state offline 00:29:18.154 00:24:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:18.154 00:24:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:29:18.411 00:24:05 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@526 -- # raid_bdev= 00:29:18.411 00:24:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:29:18.411 00:24:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:29:18.411 00:24:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:18.988 [2024-07-16 00:24:05.679697] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:18.988 [2024-07-16 00:24:05.679756] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:18.988 [2024-07-16 00:24:05.679776] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe72620 00:29:18.988 [2024-07-16 00:24:05.679789] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:18.988 [2024-07-16 00:24:05.681266] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:18.988 [2024-07-16 00:24:05.681293] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:18.988 [2024-07-16 00:24:05.681345] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:18.988 [2024-07-16 00:24:05.681372] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:18.988 [2024-07-16 00:24:05.681453] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:29:18.988 [2024-07-16 00:24:05.681466] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:18.988 [2024-07-16 00:24:05.681481] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe74640 name raid_bdev1, state configuring 00:29:18.988 [2024-07-16 00:24:05.681504] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:18.988 [2024-07-16 00:24:05.681556] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe74640 00:29:18.988 [2024-07-16 00:24:05.681567] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:18.988 [2024-07-16 00:24:05.681624] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe73810 00:29:18.988 [2024-07-16 00:24:05.681695] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe74640 00:29:18.988 [2024-07-16 00:24:05.681704] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe74640 00:29:18.988 [2024-07-16 00:24:05.681763] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:18.988 pt1 00:29:18.988 00:24:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:29:18.988 00:24:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:18.988 00:24:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:18.988 00:24:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:18.988 00:24:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:18.988 00:24:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:18.988 00:24:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:18.988 00:24:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:18.988 00:24:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:18.988 00:24:05 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:18.988 00:24:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:18.988 00:24:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:18.988 00:24:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:19.245 00:24:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:19.245 "name": "raid_bdev1", 00:29:19.245 "uuid": "f6cfd5a1-88bc-41df-bbf9-d15a54a4e37b", 00:29:19.246 "strip_size_kb": 0, 00:29:19.246 "state": "online", 00:29:19.246 "raid_level": "raid1", 00:29:19.246 "superblock": true, 00:29:19.246 "num_base_bdevs": 2, 00:29:19.246 "num_base_bdevs_discovered": 1, 00:29:19.246 "num_base_bdevs_operational": 1, 00:29:19.246 "base_bdevs_list": [ 00:29:19.246 { 00:29:19.246 "name": null, 00:29:19.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:19.246 "is_configured": false, 00:29:19.246 "data_offset": 256, 00:29:19.246 "data_size": 7936 00:29:19.246 }, 00:29:19.246 { 00:29:19.246 "name": "pt2", 00:29:19.246 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:19.246 "is_configured": true, 00:29:19.246 "data_offset": 256, 00:29:19.246 "data_size": 7936 00:29:19.246 } 00:29:19.246 ] 00:29:19.246 }' 00:29:19.246 00:24:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:19.246 00:24:05 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:19.810 00:24:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:29:19.810 
00:24:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:29:20.089 00:24:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:29:20.089 00:24:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:20.089 00:24:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:29:20.346 [2024-07-16 00:24:07.047556] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:20.346 00:24:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' f6cfd5a1-88bc-41df-bbf9-d15a54a4e37b '!=' f6cfd5a1-88bc-41df-bbf9-d15a54a4e37b ']' 00:29:20.346 00:24:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 3649596 00:29:20.346 00:24:07 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 3649596 ']' 00:29:20.346 00:24:07 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 3649596 00:29:20.346 00:24:07 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:29:20.346 00:24:07 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:20.346 00:24:07 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3649596 00:29:20.346 00:24:07 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:20.346 00:24:07 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:20.346 00:24:07 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@966 
-- # echo 'killing process with pid 3649596' 00:29:20.346 killing process with pid 3649596 00:29:20.346 00:24:07 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 3649596 00:29:20.346 [2024-07-16 00:24:07.120318] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:20.346 [2024-07-16 00:24:07.120372] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:20.346 [2024-07-16 00:24:07.120414] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:20.346 [2024-07-16 00:24:07.120425] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe74640 name raid_bdev1, state offline 00:29:20.346 00:24:07 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 3649596 00:29:20.346 [2024-07-16 00:24:07.139786] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:20.604 00:24:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:29:20.604 00:29:20.604 real 0m19.142s 00:29:20.604 user 0m34.922s 00:29:20.604 sys 0m3.328s 00:29:20.604 00:24:07 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:20.604 00:24:07 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:20.604 ************************************ 00:29:20.604 END TEST raid_superblock_test_md_interleaved 00:29:20.604 ************************************ 00:29:20.604 00:24:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:20.604 00:24:07 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:29:20.604 00:24:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:29:20.604 00:24:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:20.604 00:24:07 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:29:20.604 ************************************ 00:29:20.604 START TEST raid_rebuild_test_sb_md_interleaved 00:29:20.605 ************************************ 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 
'BaseBdev2') 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=3652866 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 3652866 /var/tmp/spdk-raid.sock 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 3652866 ']' 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:20.605 
00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:20.605 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:20.605 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:20.605 [2024-07-16 00:24:07.518297] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:29:20.605 [2024-07-16 00:24:07.518359] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3652866 ] 00:29:20.605 I/O size of 3145728 is greater than zero copy threshold (65536). 00:29:20.605 Zero copy mechanism will not be used. 
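The "Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock..." step above (`waitforlisten` in `autotest_common.sh`) boils down to polling a UNIX socket until the RPC server accepts connections. A minimal, self-contained Python sketch of that retry loop (the helper name and retry parameters here are illustrative, not SPDK's actual implementation):

```python
# Hypothetical sketch of the "waitforlisten" polling step seen in the log:
# retry connect() on a UNIX domain socket until the target starts listening,
# bounded by max_retries (mirroring local max_retries=100 above).
import os
import socket
import tempfile
import threading
import time

def wait_for_listen(sock_path, max_retries=100, delay=0.1):
    """Return True once a connect() to sock_path succeeds, False on timeout."""
    for _ in range(max_retries):
        s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        try:
            s.connect(sock_path)
            return True
        except (FileNotFoundError, ConnectionRefusedError):
            time.sleep(delay)  # socket not created / not listening yet
        finally:
            s.close()
    return False

if __name__ == "__main__":
    path = os.path.join(tempfile.mkdtemp(), "spdk-raid.sock")

    def serve_later():
        time.sleep(0.3)                      # simulate slow target startup
        srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        srv.bind(path)
        srv.listen(1)
        time.sleep(2)                        # keep listening briefly
        srv.close()

    threading.Thread(target=serve_later, daemon=True).start()
    print(wait_for_listen(path))             # True once the socket is up
```

In the log, the same pattern is what turns `waitforlisten 3652866 /var/tmp/spdk-raid.sock` into a blocking wait rather than a race against bdevperf's startup.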
00:29:20.865 [2024-07-16 00:24:07.648031] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:20.865 [2024-07-16 00:24:07.756495] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:21.124 [2024-07-16 00:24:07.823501] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:21.124 [2024-07-16 00:24:07.823531] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:21.124 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:21.124 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:29:21.124 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:21.124 00:24:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:29:21.382 BaseBdev1_malloc 00:29:21.382 00:24:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:21.640 [2024-07-16 00:24:08.456397] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:21.640 [2024-07-16 00:24:08.456446] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:21.640 [2024-07-16 00:24:08.456469] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1748ce0 00:29:21.640 [2024-07-16 00:24:08.456482] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:21.640 [2024-07-16 00:24:08.458024] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:21.640 [2024-07-16 00:24:08.458053] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:21.640 BaseBdev1 00:29:21.640 00:24:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:21.640 00:24:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:29:21.898 BaseBdev2_malloc 00:29:21.898 00:24:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:29:22.156 [2024-07-16 00:24:08.950853] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:29:22.156 [2024-07-16 00:24:08.950900] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:22.156 [2024-07-16 00:24:08.950924] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17402d0 00:29:22.156 [2024-07-16 00:24:08.950943] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:22.156 [2024-07-16 00:24:08.952683] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:22.156 [2024-07-16 00:24:08.952716] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:29:22.156 BaseBdev2 00:29:22.156 00:24:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:29:22.414 spare_malloc 00:29:22.414 00:24:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay 
-r 0 -t 0 -w 100000 -n 100000 00:29:22.672 spare_delay 00:29:22.672 00:24:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:22.930 [2024-07-16 00:24:09.689678] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:22.930 [2024-07-16 00:24:09.689728] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:22.930 [2024-07-16 00:24:09.689752] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1743070 00:29:22.930 [2024-07-16 00:24:09.689765] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:22.930 [2024-07-16 00:24:09.691233] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:22.930 [2024-07-16 00:24:09.691261] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:22.930 spare 00:29:22.930 00:24:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:29:23.187 [2024-07-16 00:24:09.926329] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:23.187 [2024-07-16 00:24:09.927680] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:23.187 [2024-07-16 00:24:09.927849] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1745370 00:29:23.188 [2024-07-16 00:24:09.927862] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:23.188 [2024-07-16 00:24:09.927944] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15ab9c0 00:29:23.188 [2024-07-16 00:24:09.928027] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: 
raid bdev generic 0x1745370 00:29:23.188 [2024-07-16 00:24:09.928037] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1745370 00:29:23.188 [2024-07-16 00:24:09.928096] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:23.188 00:24:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:23.188 00:24:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:23.188 00:24:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:23.188 00:24:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:23.188 00:24:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:23.188 00:24:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:23.188 00:24:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:23.188 00:24:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:23.188 00:24:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:23.188 00:24:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:23.188 00:24:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:23.188 00:24:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:23.446 00:24:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:29:23.446 "name": "raid_bdev1", 00:29:23.446 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:23.446 "strip_size_kb": 0, 00:29:23.446 "state": "online", 00:29:23.446 "raid_level": "raid1", 00:29:23.446 "superblock": true, 00:29:23.446 "num_base_bdevs": 2, 00:29:23.446 "num_base_bdevs_discovered": 2, 00:29:23.446 "num_base_bdevs_operational": 2, 00:29:23.446 "base_bdevs_list": [ 00:29:23.446 { 00:29:23.446 "name": "BaseBdev1", 00:29:23.446 "uuid": "f768ef05-18c8-59f1-bc62-cb9ee0ed06a6", 00:29:23.446 "is_configured": true, 00:29:23.446 "data_offset": 256, 00:29:23.446 "data_size": 7936 00:29:23.446 }, 00:29:23.446 { 00:29:23.446 "name": "BaseBdev2", 00:29:23.446 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:23.446 "is_configured": true, 00:29:23.446 "data_offset": 256, 00:29:23.446 "data_size": 7936 00:29:23.446 } 00:29:23.446 ] 00:29:23.446 }' 00:29:23.446 00:24:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:23.446 00:24:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:24.011 00:24:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:24.011 00:24:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:29:24.269 [2024-07-16 00:24:11.021470] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:24.269 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:29:24.269 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:24.269 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- 
# jq -r '.[].base_bdevs_list[0].data_offset' 00:29:24.527 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:29:24.527 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:29:24.527 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:29:24.527 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:29:24.785 [2024-07-16 00:24:11.530555] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:24.785 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:24.785 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:24.785 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:24.785 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:24.785 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:24.785 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:24.785 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:24.785 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:24.785 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:24.785 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:24.785 
00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:24.785 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:25.043 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:25.043 "name": "raid_bdev1", 00:29:25.043 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:25.043 "strip_size_kb": 0, 00:29:25.043 "state": "online", 00:29:25.043 "raid_level": "raid1", 00:29:25.043 "superblock": true, 00:29:25.043 "num_base_bdevs": 2, 00:29:25.043 "num_base_bdevs_discovered": 1, 00:29:25.043 "num_base_bdevs_operational": 1, 00:29:25.043 "base_bdevs_list": [ 00:29:25.043 { 00:29:25.043 "name": null, 00:29:25.043 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:25.043 "is_configured": false, 00:29:25.043 "data_offset": 256, 00:29:25.043 "data_size": 7936 00:29:25.043 }, 00:29:25.043 { 00:29:25.043 "name": "BaseBdev2", 00:29:25.043 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:25.043 "is_configured": true, 00:29:25.043 "data_offset": 256, 00:29:25.043 "data_size": 7936 00:29:25.043 } 00:29:25.043 ] 00:29:25.043 }' 00:29:25.043 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:25.043 00:24:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:25.608 00:24:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:25.866 [2024-07-16 00:24:12.801940] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:25.866 [2024-07-16 00:24:12.805528] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x1745250 00:29:25.866 [2024-07-16 00:24:12.807530] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:26.124 00:24:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:29:27.057 00:24:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:27.057 00:24:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:27.057 00:24:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:27.057 00:24:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:27.057 00:24:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:27.057 00:24:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:27.057 00:24:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:27.316 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:27.316 "name": "raid_bdev1", 00:29:27.316 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:27.316 "strip_size_kb": 0, 00:29:27.316 "state": "online", 00:29:27.316 "raid_level": "raid1", 00:29:27.316 "superblock": true, 00:29:27.316 "num_base_bdevs": 2, 00:29:27.316 "num_base_bdevs_discovered": 2, 00:29:27.316 "num_base_bdevs_operational": 2, 00:29:27.316 "process": { 00:29:27.316 "type": "rebuild", 00:29:27.316 "target": "spare", 00:29:27.316 "progress": { 00:29:27.316 "blocks": 3072, 00:29:27.316 "percent": 38 00:29:27.316 } 00:29:27.316 }, 00:29:27.316 "base_bdevs_list": [ 00:29:27.316 { 
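The `verify_raid_bdev_state` checks exercised repeatedly above (`bdev_raid.sh@116`-`@128`) fetch `bdev_raid_get_bdevs` output, select the bdev with `jq`, and compare state, RAID level, strip size, and discovered base-bdev counts. A minimal Python sketch of the same checks, run against one of the `raid_bdev_info` JSON dumps from this log (the function name mirrors the shell helper; this is an illustration, not SPDK's code):

```python
# Sketch of the verify_raid_bdev_state assertions, applied to a
# raid_bdev_info JSON blob as dumped by bdev_raid_get_bdevs in the log.
import json

def verify_raid_bdev_state(info_json, expected_state, raid_level,
                           strip_size, num_discovered):
    info = json.loads(info_json)
    assert info["state"] == expected_state
    assert info["raid_level"] == raid_level
    assert info["strip_size_kb"] == strip_size
    # Count configured base bdevs; must match the reported discovered count.
    discovered = sum(1 for b in info["base_bdevs_list"] if b["is_configured"])
    assert discovered == info["num_base_bdevs_discovered"] == num_discovered
    return True

# JSON shape copied from the raid_bdev_info dump after BaseBdev1 removal.
raid_bdev_info = """{
  "name": "raid_bdev1",
  "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5",
  "strip_size_kb": 0,
  "state": "online",
  "raid_level": "raid1",
  "superblock": true,
  "num_base_bdevs": 2,
  "num_base_bdevs_discovered": 1,
  "num_base_bdevs_operational": 1,
  "base_bdevs_list": [
    {"name": null, "uuid": "00000000-0000-0000-0000-000000000000",
     "is_configured": false, "data_offset": 256, "data_size": 7936},
    {"name": "BaseBdev2", "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413",
     "is_configured": true, "data_offset": 256, "data_size": 7936}
  ]
}"""

print(verify_raid_bdev_state(raid_bdev_info, "online", "raid1", 0, 1))  # True
```

This matches the log's `verify_raid_bdev_state raid_bdev1 online raid1 0 1` calls: a removed base bdev appears as a null-named, unconfigured slot while the array stays online with one discovered member.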
00:29:27.316 "name": "spare", 00:29:27.316 "uuid": "5b1713ce-9e86-5d44-9f34-7214ad643e20", 00:29:27.316 "is_configured": true, 00:29:27.316 "data_offset": 256, 00:29:27.316 "data_size": 7936 00:29:27.316 }, 00:29:27.316 { 00:29:27.316 "name": "BaseBdev2", 00:29:27.316 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:27.316 "is_configured": true, 00:29:27.316 "data_offset": 256, 00:29:27.316 "data_size": 7936 00:29:27.316 } 00:29:27.316 ] 00:29:27.316 }' 00:29:27.316 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:27.316 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:27.316 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:27.316 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:27.316 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:27.574 [2024-07-16 00:24:14.392561] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:27.574 [2024-07-16 00:24:14.420108] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:27.574 [2024-07-16 00:24:14.420157] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:27.574 [2024-07-16 00:24:14.420172] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:27.574 [2024-07-16 00:24:14.420181] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:27.574 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:27.574 00:24:14 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:27.574 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:27.574 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:27.574 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:27.574 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:27.574 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:27.574 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:27.574 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:27.574 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:27.574 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:27.574 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:27.832 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:27.832 "name": "raid_bdev1", 00:29:27.832 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:27.832 "strip_size_kb": 0, 00:29:27.832 "state": "online", 00:29:27.832 "raid_level": "raid1", 00:29:27.832 "superblock": true, 00:29:27.832 "num_base_bdevs": 2, 00:29:27.832 "num_base_bdevs_discovered": 1, 00:29:27.832 "num_base_bdevs_operational": 1, 00:29:27.832 "base_bdevs_list": [ 00:29:27.832 { 00:29:27.832 "name": null, 00:29:27.832 
"uuid": "00000000-0000-0000-0000-000000000000", 00:29:27.832 "is_configured": false, 00:29:27.832 "data_offset": 256, 00:29:27.832 "data_size": 7936 00:29:27.832 }, 00:29:27.832 { 00:29:27.832 "name": "BaseBdev2", 00:29:27.832 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:27.832 "is_configured": true, 00:29:27.832 "data_offset": 256, 00:29:27.832 "data_size": 7936 00:29:27.832 } 00:29:27.832 ] 00:29:27.832 }' 00:29:27.832 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:27.832 00:24:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:28.398 00:24:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:28.398 00:24:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:28.398 00:24:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:28.398 00:24:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:28.398 00:24:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:28.398 00:24:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:28.398 00:24:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:28.656 00:24:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:28.656 "name": "raid_bdev1", 00:29:28.656 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:28.656 "strip_size_kb": 0, 00:29:28.656 "state": "online", 00:29:28.656 "raid_level": "raid1", 00:29:28.656 "superblock": true, 00:29:28.656 
"num_base_bdevs": 2, 00:29:28.656 "num_base_bdevs_discovered": 1, 00:29:28.656 "num_base_bdevs_operational": 1, 00:29:28.656 "base_bdevs_list": [ 00:29:28.656 { 00:29:28.656 "name": null, 00:29:28.656 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:28.656 "is_configured": false, 00:29:28.656 "data_offset": 256, 00:29:28.656 "data_size": 7936 00:29:28.656 }, 00:29:28.656 { 00:29:28.656 "name": "BaseBdev2", 00:29:28.656 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:28.656 "is_configured": true, 00:29:28.656 "data_offset": 256, 00:29:28.656 "data_size": 7936 00:29:28.656 } 00:29:28.656 ] 00:29:28.656 }' 00:29:28.656 00:24:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:28.656 00:24:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:28.656 00:24:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:28.656 00:24:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:28.656 00:24:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:28.914 [2024-07-16 00:24:15.707950] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:28.914 [2024-07-16 00:24:15.711551] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1741270 00:29:28.914 [2024-07-16 00:24:15.712975] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:28.914 00:24:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:29:29.847 00:24:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:29:29.847 00:24:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:29.847 00:24:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:29.847 00:24:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:29.847 00:24:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:29.847 00:24:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:29.847 00:24:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:30.105 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:30.105 "name": "raid_bdev1", 00:29:30.105 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:30.105 "strip_size_kb": 0, 00:29:30.105 "state": "online", 00:29:30.105 "raid_level": "raid1", 00:29:30.105 "superblock": true, 00:29:30.105 "num_base_bdevs": 2, 00:29:30.105 "num_base_bdevs_discovered": 2, 00:29:30.105 "num_base_bdevs_operational": 2, 00:29:30.105 "process": { 00:29:30.105 "type": "rebuild", 00:29:30.105 "target": "spare", 00:29:30.105 "progress": { 00:29:30.105 "blocks": 3072, 00:29:30.105 "percent": 38 00:29:30.105 } 00:29:30.105 }, 00:29:30.105 "base_bdevs_list": [ 00:29:30.105 { 00:29:30.105 "name": "spare", 00:29:30.105 "uuid": "5b1713ce-9e86-5d44-9f34-7214ad643e20", 00:29:30.105 "is_configured": true, 00:29:30.105 "data_offset": 256, 00:29:30.105 "data_size": 7936 00:29:30.105 }, 00:29:30.105 { 00:29:30.105 "name": "BaseBdev2", 00:29:30.105 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:30.105 "is_configured": true, 00:29:30.105 "data_offset": 256, 00:29:30.105 "data_size": 7936 00:29:30.105 
} 00:29:30.105 ] 00:29:30.105 }' 00:29:30.105 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:30.105 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:30.364 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:30.364 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:30.364 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:29:30.364 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:29:30.364 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:29:30.364 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:29:30.364 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:29:30.364 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:29:30.364 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=1167 00:29:30.364 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:30.364 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:30.364 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:30.364 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:30.364 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@184 -- # local target=spare 00:29:30.364 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:30.364 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:30.364 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:30.622 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:30.622 "name": "raid_bdev1", 00:29:30.622 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:30.622 "strip_size_kb": 0, 00:29:30.622 "state": "online", 00:29:30.622 "raid_level": "raid1", 00:29:30.622 "superblock": true, 00:29:30.622 "num_base_bdevs": 2, 00:29:30.622 "num_base_bdevs_discovered": 2, 00:29:30.622 "num_base_bdevs_operational": 2, 00:29:30.622 "process": { 00:29:30.622 "type": "rebuild", 00:29:30.622 "target": "spare", 00:29:30.622 "progress": { 00:29:30.622 "blocks": 4096, 00:29:30.622 "percent": 51 00:29:30.622 } 00:29:30.622 }, 00:29:30.622 "base_bdevs_list": [ 00:29:30.622 { 00:29:30.622 "name": "spare", 00:29:30.622 "uuid": "5b1713ce-9e86-5d44-9f34-7214ad643e20", 00:29:30.622 "is_configured": true, 00:29:30.622 "data_offset": 256, 00:29:30.622 "data_size": 7936 00:29:30.622 }, 00:29:30.622 { 00:29:30.622 "name": "BaseBdev2", 00:29:30.622 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:30.622 "is_configured": true, 00:29:30.622 "data_offset": 256, 00:29:30.622 "data_size": 7936 00:29:30.622 } 00:29:30.622 ] 00:29:30.622 }' 00:29:30.622 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:30.622 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:30.622 00:24:17 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:30.622 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:30.622 00:24:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:29:31.555 00:24:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:31.555 00:24:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:31.555 00:24:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:31.555 00:24:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:31.555 00:24:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:31.555 00:24:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:31.555 00:24:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:31.555 00:24:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:31.813 00:24:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:31.813 "name": "raid_bdev1", 00:29:31.813 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:31.813 "strip_size_kb": 0, 00:29:31.813 "state": "online", 00:29:31.813 "raid_level": "raid1", 00:29:31.813 "superblock": true, 00:29:31.813 "num_base_bdevs": 2, 00:29:31.813 "num_base_bdevs_discovered": 2, 00:29:31.813 "num_base_bdevs_operational": 2, 00:29:31.813 "process": { 00:29:31.813 "type": "rebuild", 00:29:31.813 
"target": "spare", 00:29:31.813 "progress": { 00:29:31.813 "blocks": 7424, 00:29:31.813 "percent": 93 00:29:31.813 } 00:29:31.813 }, 00:29:31.813 "base_bdevs_list": [ 00:29:31.813 { 00:29:31.813 "name": "spare", 00:29:31.813 "uuid": "5b1713ce-9e86-5d44-9f34-7214ad643e20", 00:29:31.813 "is_configured": true, 00:29:31.813 "data_offset": 256, 00:29:31.813 "data_size": 7936 00:29:31.813 }, 00:29:31.813 { 00:29:31.813 "name": "BaseBdev2", 00:29:31.813 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:31.813 "is_configured": true, 00:29:31.813 "data_offset": 256, 00:29:31.813 "data_size": 7936 00:29:31.813 } 00:29:31.813 ] 00:29:31.813 }' 00:29:31.813 00:24:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:31.813 00:24:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:31.813 00:24:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:32.071 00:24:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:32.071 00:24:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:29:32.071 [2024-07-16 00:24:18.836305] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:29:32.071 [2024-07-16 00:24:18.836359] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:29:32.071 [2024-07-16 00:24:18.836445] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:33.004 00:24:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:33.004 00:24:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:33.004 00:24:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:33.004 00:24:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:33.004 00:24:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:33.004 00:24:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:33.004 00:24:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:33.004 00:24:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:33.262 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:33.262 "name": "raid_bdev1", 00:29:33.262 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:33.262 "strip_size_kb": 0, 00:29:33.262 "state": "online", 00:29:33.262 "raid_level": "raid1", 00:29:33.262 "superblock": true, 00:29:33.262 "num_base_bdevs": 2, 00:29:33.262 "num_base_bdevs_discovered": 2, 00:29:33.262 "num_base_bdevs_operational": 2, 00:29:33.262 "base_bdevs_list": [ 00:29:33.262 { 00:29:33.262 "name": "spare", 00:29:33.262 "uuid": "5b1713ce-9e86-5d44-9f34-7214ad643e20", 00:29:33.262 "is_configured": true, 00:29:33.262 "data_offset": 256, 00:29:33.262 "data_size": 7936 00:29:33.262 }, 00:29:33.262 { 00:29:33.262 "name": "BaseBdev2", 00:29:33.262 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:33.262 "is_configured": true, 00:29:33.262 "data_offset": 256, 00:29:33.262 "data_size": 7936 00:29:33.262 } 00:29:33.262 ] 00:29:33.262 }' 00:29:33.262 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:33.262 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == 
\r\e\b\u\i\l\d ]] 00:29:33.262 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:33.262 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:29:33.262 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:29:33.262 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:33.262 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:33.262 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:33.262 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:33.262 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:33.262 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:33.262 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:33.518 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:33.518 "name": "raid_bdev1", 00:29:33.518 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:33.518 "strip_size_kb": 0, 00:29:33.518 "state": "online", 00:29:33.518 "raid_level": "raid1", 00:29:33.518 "superblock": true, 00:29:33.518 "num_base_bdevs": 2, 00:29:33.518 "num_base_bdevs_discovered": 2, 00:29:33.518 "num_base_bdevs_operational": 2, 00:29:33.518 "base_bdevs_list": [ 00:29:33.518 { 00:29:33.518 "name": "spare", 00:29:33.518 "uuid": "5b1713ce-9e86-5d44-9f34-7214ad643e20", 00:29:33.518 
"is_configured": true, 00:29:33.518 "data_offset": 256, 00:29:33.518 "data_size": 7936 00:29:33.518 }, 00:29:33.518 { 00:29:33.518 "name": "BaseBdev2", 00:29:33.518 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:33.518 "is_configured": true, 00:29:33.518 "data_offset": 256, 00:29:33.518 "data_size": 7936 00:29:33.518 } 00:29:33.518 ] 00:29:33.518 }' 00:29:33.518 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:33.518 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:33.518 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:33.775 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:33.775 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:33.775 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:33.776 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:33.776 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:33.776 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:33.776 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:33.776 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:33.776 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:33.776 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:29:33.776 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:33.776 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:33.776 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:34.033 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:34.033 "name": "raid_bdev1", 00:29:34.033 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:34.033 "strip_size_kb": 0, 00:29:34.033 "state": "online", 00:29:34.033 "raid_level": "raid1", 00:29:34.033 "superblock": true, 00:29:34.033 "num_base_bdevs": 2, 00:29:34.033 "num_base_bdevs_discovered": 2, 00:29:34.033 "num_base_bdevs_operational": 2, 00:29:34.033 "base_bdevs_list": [ 00:29:34.033 { 00:29:34.033 "name": "spare", 00:29:34.033 "uuid": "5b1713ce-9e86-5d44-9f34-7214ad643e20", 00:29:34.033 "is_configured": true, 00:29:34.033 "data_offset": 256, 00:29:34.033 "data_size": 7936 00:29:34.033 }, 00:29:34.033 { 00:29:34.033 "name": "BaseBdev2", 00:29:34.033 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:34.033 "is_configured": true, 00:29:34.033 "data_offset": 256, 00:29:34.033 "data_size": 7936 00:29:34.033 } 00:29:34.033 ] 00:29:34.033 }' 00:29:34.033 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:34.033 00:24:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:34.621 00:24:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:34.621 [2024-07-16 00:24:21.515849] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: 
delete raid bdev: raid_bdev1 00:29:34.621 [2024-07-16 00:24:21.515875] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:34.621 [2024-07-16 00:24:21.515936] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:34.621 [2024-07-16 00:24:21.515991] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:34.621 [2024-07-16 00:24:21.516003] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1745370 name raid_bdev1, state offline 00:29:34.621 00:24:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:34.621 00:24:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:29:34.878 00:24:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:29:34.878 00:24:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:29:34.878 00:24:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:29:34.878 00:24:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:35.135 00:24:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:35.393 [2024-07-16 00:24:22.309906] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:35.393 [2024-07-16 00:24:22.309964] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:35.393 [2024-07-16 00:24:22.309987] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1745040 00:29:35.393 [2024-07-16 00:24:22.310000] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:35.393 [2024-07-16 00:24:22.311494] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:35.393 [2024-07-16 00:24:22.311521] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:35.393 [2024-07-16 00:24:22.311582] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:35.393 [2024-07-16 00:24:22.311606] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:35.393 [2024-07-16 00:24:22.311694] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:35.393 spare 00:29:35.393 00:24:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:35.393 00:24:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:35.393 00:24:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:35.393 00:24:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:35.393 00:24:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:35.393 00:24:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:35.393 00:24:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:35.393 00:24:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:35.393 00:24:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:35.393 00:24:22 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:35.393 00:24:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:35.393 00:24:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:35.650 [2024-07-16 00:24:22.412006] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1745f60 00:29:35.650 [2024-07-16 00:24:22.412027] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:35.650 [2024-07-16 00:24:22.412109] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1745de0 00:29:35.650 [2024-07-16 00:24:22.412216] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1745f60 00:29:35.650 [2024-07-16 00:24:22.412226] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1745f60 00:29:35.650 [2024-07-16 00:24:22.412299] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:35.907 00:24:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:35.907 "name": "raid_bdev1", 00:29:35.907 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:35.907 "strip_size_kb": 0, 00:29:35.907 "state": "online", 00:29:35.907 "raid_level": "raid1", 00:29:35.907 "superblock": true, 00:29:35.907 "num_base_bdevs": 2, 00:29:35.907 "num_base_bdevs_discovered": 2, 00:29:35.907 "num_base_bdevs_operational": 2, 00:29:35.907 "base_bdevs_list": [ 00:29:35.907 { 00:29:35.907 "name": "spare", 00:29:35.907 "uuid": "5b1713ce-9e86-5d44-9f34-7214ad643e20", 00:29:35.907 "is_configured": true, 00:29:35.907 "data_offset": 256, 00:29:35.907 "data_size": 7936 00:29:35.907 }, 00:29:35.907 { 00:29:35.907 "name": "BaseBdev2", 00:29:35.907 "uuid": 
"6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:35.907 "is_configured": true, 00:29:35.907 "data_offset": 256, 00:29:35.907 "data_size": 7936 00:29:35.907 } 00:29:35.907 ] 00:29:35.907 }' 00:29:35.907 00:24:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:35.907 00:24:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:36.839 00:24:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:36.839 00:24:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:36.839 00:24:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:36.839 00:24:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:36.839 00:24:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:36.839 00:24:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:36.839 00:24:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:37.404 00:24:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:37.404 "name": "raid_bdev1", 00:29:37.404 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:37.404 "strip_size_kb": 0, 00:29:37.404 "state": "online", 00:29:37.404 "raid_level": "raid1", 00:29:37.404 "superblock": true, 00:29:37.404 "num_base_bdevs": 2, 00:29:37.404 "num_base_bdevs_discovered": 2, 00:29:37.404 "num_base_bdevs_operational": 2, 00:29:37.404 "base_bdevs_list": [ 00:29:37.404 { 00:29:37.405 "name": "spare", 00:29:37.405 "uuid": 
"5b1713ce-9e86-5d44-9f34-7214ad643e20", 00:29:37.405 "is_configured": true, 00:29:37.405 "data_offset": 256, 00:29:37.405 "data_size": 7936 00:29:37.405 }, 00:29:37.405 { 00:29:37.405 "name": "BaseBdev2", 00:29:37.405 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:37.405 "is_configured": true, 00:29:37.405 "data_offset": 256, 00:29:37.405 "data_size": 7936 00:29:37.405 } 00:29:37.405 ] 00:29:37.405 }' 00:29:37.405 00:24:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:37.405 00:24:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:37.405 00:24:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:37.663 00:24:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:37.663 00:24:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:37.663 00:24:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:29:37.663 00:24:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:29:37.663 00:24:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:38.229 [2024-07-16 00:24:25.093482] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:38.229 00:24:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:38.229 00:24:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:29:38.229 00:24:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:38.229 00:24:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:38.229 00:24:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:38.229 00:24:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:38.229 00:24:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:38.229 00:24:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:38.229 00:24:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:38.229 00:24:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:38.229 00:24:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:38.230 00:24:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:38.489 00:24:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:38.489 "name": "raid_bdev1", 00:29:38.489 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:38.489 "strip_size_kb": 0, 00:29:38.489 "state": "online", 00:29:38.489 "raid_level": "raid1", 00:29:38.489 "superblock": true, 00:29:38.489 "num_base_bdevs": 2, 00:29:38.489 "num_base_bdevs_discovered": 1, 00:29:38.489 "num_base_bdevs_operational": 1, 00:29:38.489 "base_bdevs_list": [ 00:29:38.489 { 00:29:38.489 "name": null, 00:29:38.489 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:38.489 "is_configured": false, 00:29:38.489 "data_offset": 
256, 00:29:38.489 "data_size": 7936 00:29:38.489 }, 00:29:38.489 { 00:29:38.489 "name": "BaseBdev2", 00:29:38.489 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:38.489 "is_configured": true, 00:29:38.489 "data_offset": 256, 00:29:38.489 "data_size": 7936 00:29:38.489 } 00:29:38.489 ] 00:29:38.489 }' 00:29:38.489 00:24:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:38.489 00:24:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:39.424 00:24:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:39.682 [2024-07-16 00:24:26.473164] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:39.682 [2024-07-16 00:24:26.473308] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:39.682 [2024-07-16 00:24:26.473324] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:29:39.682 [2024-07-16 00:24:26.473352] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:39.682 [2024-07-16 00:24:26.476824] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17473a0 00:29:39.682 [2024-07-16 00:24:26.478236] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:39.682 00:24:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:29:40.615 00:24:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:40.615 00:24:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:40.615 00:24:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:40.615 00:24:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:40.615 00:24:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:40.615 00:24:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:40.615 00:24:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:41.182 00:24:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:41.182 "name": "raid_bdev1", 00:29:41.182 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:41.182 "strip_size_kb": 0, 00:29:41.182 "state": "online", 00:29:41.182 "raid_level": "raid1", 00:29:41.182 "superblock": true, 00:29:41.182 "num_base_bdevs": 2, 00:29:41.182 "num_base_bdevs_discovered": 2, 00:29:41.182 "num_base_bdevs_operational": 2, 00:29:41.182 "process": { 00:29:41.182 "type": 
"rebuild", 00:29:41.182 "target": "spare", 00:29:41.182 "progress": { 00:29:41.182 "blocks": 3840, 00:29:41.182 "percent": 48 00:29:41.182 } 00:29:41.182 }, 00:29:41.182 "base_bdevs_list": [ 00:29:41.182 { 00:29:41.182 "name": "spare", 00:29:41.182 "uuid": "5b1713ce-9e86-5d44-9f34-7214ad643e20", 00:29:41.182 "is_configured": true, 00:29:41.182 "data_offset": 256, 00:29:41.182 "data_size": 7936 00:29:41.182 }, 00:29:41.182 { 00:29:41.182 "name": "BaseBdev2", 00:29:41.182 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:41.182 "is_configured": true, 00:29:41.182 "data_offset": 256, 00:29:41.182 "data_size": 7936 00:29:41.182 } 00:29:41.182 ] 00:29:41.182 }' 00:29:41.182 00:24:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:41.182 00:24:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:41.182 00:24:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:41.439 00:24:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:41.439 00:24:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:41.697 [2024-07-16 00:24:28.636877] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:41.955 [2024-07-16 00:24:28.695330] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:41.955 [2024-07-16 00:24:28.695383] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:41.955 [2024-07-16 00:24:28.695398] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:41.955 [2024-07-16 00:24:28.695407] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to 
remove target bdev: No such device 00:29:41.955 00:24:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:41.955 00:24:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:41.955 00:24:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:41.955 00:24:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:41.955 00:24:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:41.955 00:24:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:41.955 00:24:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:41.955 00:24:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:41.955 00:24:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:41.955 00:24:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:41.955 00:24:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:41.955 00:24:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:42.521 00:24:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:42.521 "name": "raid_bdev1", 00:29:42.521 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:42.521 "strip_size_kb": 0, 00:29:42.521 "state": "online", 00:29:42.521 "raid_level": "raid1", 00:29:42.521 "superblock": true, 00:29:42.521 
"num_base_bdevs": 2, 00:29:42.521 "num_base_bdevs_discovered": 1, 00:29:42.521 "num_base_bdevs_operational": 1, 00:29:42.521 "base_bdevs_list": [ 00:29:42.521 { 00:29:42.521 "name": null, 00:29:42.521 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:42.521 "is_configured": false, 00:29:42.521 "data_offset": 256, 00:29:42.521 "data_size": 7936 00:29:42.521 }, 00:29:42.521 { 00:29:42.521 "name": "BaseBdev2", 00:29:42.521 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:42.521 "is_configured": true, 00:29:42.521 "data_offset": 256, 00:29:42.521 "data_size": 7936 00:29:42.521 } 00:29:42.521 ] 00:29:42.521 }' 00:29:42.521 00:24:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:42.521 00:24:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:43.088 00:24:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:43.346 [2024-07-16 00:24:30.071394] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:43.346 [2024-07-16 00:24:30.071444] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:43.346 [2024-07-16 00:24:30.071469] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1744c80 00:29:43.346 [2024-07-16 00:24:30.071482] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:43.346 [2024-07-16 00:24:30.071667] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:43.346 [2024-07-16 00:24:30.071683] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:43.346 [2024-07-16 00:24:30.071741] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:43.346 [2024-07-16 00:24:30.071753] 
bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:43.346 [2024-07-16 00:24:30.071764] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:29:43.346 [2024-07-16 00:24:30.071782] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:43.346 [2024-07-16 00:24:30.075264] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17452d0 00:29:43.346 [2024-07-16 00:24:30.076592] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:43.346 spare 00:29:43.346 00:24:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:29:44.280 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:44.280 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:44.280 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:44.280 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:44.280 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:44.280 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:44.280 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:44.538 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:44.538 "name": "raid_bdev1", 00:29:44.538 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 
00:29:44.538 "strip_size_kb": 0, 00:29:44.538 "state": "online", 00:29:44.538 "raid_level": "raid1", 00:29:44.538 "superblock": true, 00:29:44.538 "num_base_bdevs": 2, 00:29:44.538 "num_base_bdevs_discovered": 2, 00:29:44.538 "num_base_bdevs_operational": 2, 00:29:44.538 "process": { 00:29:44.538 "type": "rebuild", 00:29:44.538 "target": "spare", 00:29:44.538 "progress": { 00:29:44.538 "blocks": 3072, 00:29:44.538 "percent": 38 00:29:44.538 } 00:29:44.538 }, 00:29:44.538 "base_bdevs_list": [ 00:29:44.538 { 00:29:44.538 "name": "spare", 00:29:44.538 "uuid": "5b1713ce-9e86-5d44-9f34-7214ad643e20", 00:29:44.538 "is_configured": true, 00:29:44.538 "data_offset": 256, 00:29:44.538 "data_size": 7936 00:29:44.538 }, 00:29:44.538 { 00:29:44.538 "name": "BaseBdev2", 00:29:44.538 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:44.538 "is_configured": true, 00:29:44.538 "data_offset": 256, 00:29:44.538 "data_size": 7936 00:29:44.538 } 00:29:44.538 ] 00:29:44.538 }' 00:29:44.539 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:44.539 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:44.539 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:44.539 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:44.539 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:44.797 [2024-07-16 00:24:31.653648] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:44.797 [2024-07-16 00:24:31.689073] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:44.797 [2024-07-16 
00:24:31.689117] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:44.797 [2024-07-16 00:24:31.689132] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:44.797 [2024-07-16 00:24:31.689140] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:44.797 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:44.797 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:44.797 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:44.797 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:44.797 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:44.797 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:44.797 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:44.797 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:44.797 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:44.797 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:44.797 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:44.797 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:45.055 00:24:31 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:45.055 "name": "raid_bdev1", 00:29:45.055 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:45.055 "strip_size_kb": 0, 00:29:45.055 "state": "online", 00:29:45.055 "raid_level": "raid1", 00:29:45.055 "superblock": true, 00:29:45.055 "num_base_bdevs": 2, 00:29:45.055 "num_base_bdevs_discovered": 1, 00:29:45.055 "num_base_bdevs_operational": 1, 00:29:45.055 "base_bdevs_list": [ 00:29:45.055 { 00:29:45.055 "name": null, 00:29:45.055 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:45.055 "is_configured": false, 00:29:45.055 "data_offset": 256, 00:29:45.055 "data_size": 7936 00:29:45.055 }, 00:29:45.055 { 00:29:45.055 "name": "BaseBdev2", 00:29:45.055 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:45.055 "is_configured": true, 00:29:45.055 "data_offset": 256, 00:29:45.055 "data_size": 7936 00:29:45.055 } 00:29:45.055 ] 00:29:45.055 }' 00:29:45.055 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:45.055 00:24:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:45.621 00:24:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:45.621 00:24:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:45.621 00:24:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:45.621 00:24:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:45.621 00:24:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:45.621 00:24:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:45.621 00:24:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:45.879 00:24:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:45.879 "name": "raid_bdev1", 00:29:45.879 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:45.879 "strip_size_kb": 0, 00:29:45.879 "state": "online", 00:29:45.879 "raid_level": "raid1", 00:29:45.879 "superblock": true, 00:29:45.879 "num_base_bdevs": 2, 00:29:45.879 "num_base_bdevs_discovered": 1, 00:29:45.879 "num_base_bdevs_operational": 1, 00:29:45.879 "base_bdevs_list": [ 00:29:45.879 { 00:29:45.879 "name": null, 00:29:45.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:45.879 "is_configured": false, 00:29:45.879 "data_offset": 256, 00:29:45.879 "data_size": 7936 00:29:45.879 }, 00:29:45.879 { 00:29:45.879 "name": "BaseBdev2", 00:29:45.879 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:45.879 "is_configured": true, 00:29:45.879 "data_offset": 256, 00:29:45.879 "data_size": 7936 00:29:45.879 } 00:29:45.879 ] 00:29:45.879 }' 00:29:45.879 00:24:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:46.137 00:24:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:46.137 00:24:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:46.137 00:24:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:46.137 00:24:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:29:46.702 00:24:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:47.268 [2024-07-16 00:24:33.918912] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:47.268 [2024-07-16 00:24:33.918970] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:47.268 [2024-07-16 00:24:33.918996] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15acfa0 00:29:47.268 [2024-07-16 00:24:33.919009] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:47.268 [2024-07-16 00:24:33.919170] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:47.268 [2024-07-16 00:24:33.919186] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:47.268 [2024-07-16 00:24:33.919234] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:29:47.268 [2024-07-16 00:24:33.919245] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:47.268 [2024-07-16 00:24:33.919256] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:47.268 BaseBdev1 00:29:47.268 00:24:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:29:48.201 00:24:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:48.201 00:24:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:48.201 00:24:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:48.201 00:24:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:29:48.201 00:24:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:48.201 00:24:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:48.201 00:24:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:48.201 00:24:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:48.201 00:24:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:48.201 00:24:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:48.201 00:24:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:48.201 00:24:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:48.769 00:24:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:48.769 "name": "raid_bdev1", 00:29:48.769 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:48.769 "strip_size_kb": 0, 00:29:48.769 "state": "online", 00:29:48.769 "raid_level": "raid1", 00:29:48.769 "superblock": true, 00:29:48.769 "num_base_bdevs": 2, 00:29:48.769 "num_base_bdevs_discovered": 1, 00:29:48.769 "num_base_bdevs_operational": 1, 00:29:48.769 "base_bdevs_list": [ 00:29:48.769 { 00:29:48.769 "name": null, 00:29:48.769 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:48.769 "is_configured": false, 00:29:48.769 "data_offset": 256, 00:29:48.769 "data_size": 7936 00:29:48.769 }, 00:29:48.769 { 00:29:48.769 "name": "BaseBdev2", 00:29:48.769 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:48.769 "is_configured": true, 00:29:48.769 "data_offset": 256, 00:29:48.769 
"data_size": 7936 00:29:48.769 } 00:29:48.769 ] 00:29:48.769 }' 00:29:48.769 00:24:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:48.769 00:24:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:49.334 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:49.334 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:49.334 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:49.335 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:49.335 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:49.335 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:49.335 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:49.632 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:49.632 "name": "raid_bdev1", 00:29:49.632 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:49.632 "strip_size_kb": 0, 00:29:49.632 "state": "online", 00:29:49.632 "raid_level": "raid1", 00:29:49.632 "superblock": true, 00:29:49.632 "num_base_bdevs": 2, 00:29:49.632 "num_base_bdevs_discovered": 1, 00:29:49.632 "num_base_bdevs_operational": 1, 00:29:49.632 "base_bdevs_list": [ 00:29:49.632 { 00:29:49.632 "name": null, 00:29:49.632 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:49.632 "is_configured": false, 00:29:49.632 "data_offset": 256, 00:29:49.632 "data_size": 7936 00:29:49.632 }, 
00:29:49.632 { 00:29:49.632 "name": "BaseBdev2", 00:29:49.632 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:49.632 "is_configured": true, 00:29:49.632 "data_offset": 256, 00:29:49.632 "data_size": 7936 00:29:49.632 } 00:29:49.632 ] 00:29:49.632 }' 00:29:49.632 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:49.632 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:49.632 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:49.632 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:49.632 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:49.632 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:29:49.632 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:49.632 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:49.632 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:49.632 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:49.632 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:29:49.632 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:49.632 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:49.632 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:49.632 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:49.632 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:49.924 [2024-07-16 00:24:36.698295] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:49.924 [2024-07-16 00:24:36.698415] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:49.924 [2024-07-16 00:24:36.698429] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:49.924 request: 00:29:49.924 { 00:29:49.924 "base_bdev": "BaseBdev1", 00:29:49.924 "raid_bdev": "raid_bdev1", 00:29:49.924 "method": "bdev_raid_add_base_bdev", 00:29:49.924 "req_id": 1 00:29:49.924 } 00:29:49.924 Got JSON-RPC error response 00:29:49.924 response: 00:29:49.924 { 00:29:49.924 "code": -22, 00:29:49.924 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:29:49.924 } 00:29:49.924 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:29:49.924 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:29:49.924 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:49.924 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:49.924 00:24:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:29:50.857 00:24:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:50.857 00:24:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:50.857 00:24:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:50.857 00:24:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:50.857 00:24:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:50.857 00:24:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:50.857 00:24:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:50.857 00:24:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:50.857 00:24:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:50.857 00:24:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:50.858 00:24:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:50.858 00:24:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:51.116 00:24:37 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:51.116 "name": "raid_bdev1", 00:29:51.116 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:51.116 "strip_size_kb": 0, 00:29:51.116 "state": "online", 00:29:51.116 "raid_level": "raid1", 00:29:51.116 "superblock": true, 00:29:51.116 "num_base_bdevs": 2, 00:29:51.116 "num_base_bdevs_discovered": 1, 00:29:51.116 "num_base_bdevs_operational": 1, 00:29:51.116 "base_bdevs_list": [ 00:29:51.116 { 00:29:51.116 "name": null, 00:29:51.116 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:51.116 "is_configured": false, 00:29:51.116 "data_offset": 256, 00:29:51.116 "data_size": 7936 00:29:51.116 }, 00:29:51.116 { 00:29:51.116 "name": "BaseBdev2", 00:29:51.116 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:51.116 "is_configured": true, 00:29:51.116 "data_offset": 256, 00:29:51.116 "data_size": 7936 00:29:51.116 } 00:29:51.116 ] 00:29:51.116 }' 00:29:51.116 00:24:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:51.116 00:24:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:51.683 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:51.683 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:51.683 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:51.683 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:51.683 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:51.683 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:51.683 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:51.942 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:51.942 "name": "raid_bdev1", 00:29:51.942 "uuid": "25bdca3b-ce4d-4707-acd7-1723795221b5", 00:29:51.942 "strip_size_kb": 0, 00:29:51.942 "state": "online", 00:29:51.942 "raid_level": "raid1", 00:29:51.942 "superblock": true, 00:29:51.942 "num_base_bdevs": 2, 00:29:51.942 "num_base_bdevs_discovered": 1, 00:29:51.942 "num_base_bdevs_operational": 1, 00:29:51.942 "base_bdevs_list": [ 00:29:51.942 { 00:29:51.942 "name": null, 00:29:51.942 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:51.942 "is_configured": false, 00:29:51.942 "data_offset": 256, 00:29:51.942 "data_size": 7936 00:29:51.942 }, 00:29:51.942 { 00:29:51.942 "name": "BaseBdev2", 00:29:51.942 "uuid": "6d8284cf-0dfc-53d8-97fb-1de90616f413", 00:29:51.942 "is_configured": true, 00:29:51.942 "data_offset": 256, 00:29:51.942 "data_size": 7936 00:29:51.942 } 00:29:51.942 ] 00:29:51.942 }' 00:29:51.942 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:51.942 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:51.942 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:52.201 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:52.201 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 3652866 00:29:52.201 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 3652866 ']' 00:29:52.201 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@952 -- # kill -0 3652866 00:29:52.201 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:29:52.201 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:52.201 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3652866 00:29:52.201 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:52.201 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:52.201 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3652866' 00:29:52.201 killing process with pid 3652866 00:29:52.201 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 3652866 00:29:52.201 Received shutdown signal, test time was about 60.000000 seconds 00:29:52.201 00:29:52.201 Latency(us) 00:29:52.201 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:52.201 =================================================================================================================== 00:29:52.201 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:29:52.201 [2024-07-16 00:24:38.964229] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:52.201 [2024-07-16 00:24:38.964318] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:52.201 [2024-07-16 00:24:38.964359] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:52.201 [2024-07-16 00:24:38.964371] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1745f60 name raid_bdev1, state offline 00:29:52.201 00:24:38 bdev_raid.raid_rebuild_test_sb_md_interleaved 
-- common/autotest_common.sh@972 -- # wait 3652866 00:29:52.201 [2024-07-16 00:24:38.991030] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:52.459 00:24:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:29:52.459 00:29:52.459 real 0m31.741s 00:29:52.459 user 0m51.651s 00:29:52.459 sys 0m4.427s 00:29:52.459 00:24:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:52.459 00:24:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:52.459 ************************************ 00:29:52.459 END TEST raid_rebuild_test_sb_md_interleaved 00:29:52.459 ************************************ 00:29:52.459 00:24:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:52.459 00:24:39 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:29:52.459 00:24:39 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:29:52.459 00:24:39 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 3652866 ']' 00:29:52.459 00:24:39 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 3652866 00:29:52.459 00:24:39 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:29:52.459 00:29:52.459 real 19m18.780s 00:29:52.459 user 32m48.004s 00:29:52.459 sys 3m31.014s 00:29:52.459 00:24:39 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:52.459 00:24:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:52.459 ************************************ 00:29:52.459 END TEST bdev_raid 00:29:52.459 ************************************ 00:29:52.459 00:24:39 -- common/autotest_common.sh@1142 -- # return 0 00:29:52.459 00:24:39 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:29:52.459 00:24:39 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:29:52.459 00:24:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:52.459 00:24:39 -- 
common/autotest_common.sh@10 -- # set +x 00:29:52.459 ************************************ 00:29:52.459 START TEST bdevperf_config 00:29:52.459 ************************************ 00:29:52.459 00:24:39 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:29:52.718 * Looking for test storage... 00:29:52.718 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:52.718 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@20 -- # 
cat 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:52.718 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:52.718 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:52.718 00:29:52.718 00:24:39 bdevperf_config -- 
bdevperf/common.sh@20 -- # cat 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:52.718 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:52.718 00:24:39 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:55.998 00:24:42 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-16 00:24:39.557908] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:29:55.999 [2024-07-16 00:24:39.557973] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3657356 ] 00:29:55.999 Using job config with 4 jobs 00:29:55.999 [2024-07-16 00:24:39.688558] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:55.999 [2024-07-16 00:24:39.807038] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:55.999 cpumask for '\''job0'\'' is too big 00:29:55.999 cpumask for '\''job1'\'' is too big 00:29:55.999 cpumask for '\''job2'\'' is too big 00:29:55.999 cpumask for '\''job3'\'' is too big 00:29:55.999 Running I/O for 2 seconds... 00:29:55.999 00:29:55.999 Latency(us) 00:29:55.999 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:55.999 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:55.999 Malloc0 : 2.01 23775.27 23.22 0.00 0.00 10754.06 1894.85 16526.47 00:29:55.999 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:55.999 Malloc0 : 2.02 23785.00 23.23 0.00 0.00 10724.80 1866.35 14588.88 00:29:55.999 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:55.999 Malloc0 : 2.03 23763.15 23.21 0.00 0.00 10710.42 1852.10 12879.25 00:29:55.999 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:55.999 Malloc0 : 2.03 23741.41 23.18 0.00 0.00 10696.75 1852.10 12879.25 00:29:55.999 =================================================================================================================== 00:29:55.999 Total : 95064.83 92.84 0.00 0.00 10721.46 1852.10 16526.47' 00:29:55.999 00:24:42 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-16 00:24:39.557908] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:29:55.999 [2024-07-16 00:24:39.557973] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3657356 ] 00:29:55.999 Using job config with 4 jobs 00:29:55.999 [2024-07-16 00:24:39.688558] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:55.999 [2024-07-16 00:24:39.807038] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:55.999 cpumask for '\''job0'\'' is too big 00:29:55.999 cpumask for '\''job1'\'' is too big 00:29:55.999 cpumask for '\''job2'\'' is too big 00:29:55.999 cpumask for '\''job3'\'' is too big 00:29:55.999 Running I/O for 2 seconds... 00:29:55.999 00:29:55.999 Latency(us) 00:29:55.999 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:55.999 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:55.999 Malloc0 : 2.01 23775.27 23.22 0.00 0.00 10754.06 1894.85 16526.47 00:29:55.999 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:55.999 Malloc0 : 2.02 23785.00 23.23 0.00 0.00 10724.80 1866.35 14588.88 00:29:55.999 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:55.999 Malloc0 : 2.03 23763.15 23.21 0.00 0.00 10710.42 1852.10 12879.25 00:29:55.999 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:55.999 Malloc0 : 2.03 23741.41 23.18 0.00 0.00 10696.75 1852.10 12879.25 00:29:55.999 =================================================================================================================== 00:29:55.999 Total : 95064.83 92.84 0.00 0.00 10721.46 1852.10 16526.47' 00:29:55.999 00:24:42 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-16 00:24:39.557908] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:29:55.999 [2024-07-16 00:24:39.557973] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3657356 ] 00:29:55.999 Using job config with 4 jobs 00:29:55.999 [2024-07-16 00:24:39.688558] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:55.999 [2024-07-16 00:24:39.807038] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:55.999 cpumask for '\''job0'\'' is too big 00:29:55.999 cpumask for '\''job1'\'' is too big 00:29:55.999 cpumask for '\''job2'\'' is too big 00:29:55.999 cpumask for '\''job3'\'' is too big 00:29:55.999 Running I/O for 2 seconds... 00:29:55.999 00:29:55.999 Latency(us) 00:29:55.999 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:55.999 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:55.999 Malloc0 : 2.01 23775.27 23.22 0.00 0.00 10754.06 1894.85 16526.47 00:29:55.999 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:55.999 Malloc0 : 2.02 23785.00 23.23 0.00 0.00 10724.80 1866.35 14588.88 00:29:55.999 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:55.999 Malloc0 : 2.03 23763.15 23.21 0.00 0.00 10710.42 1852.10 12879.25 00:29:55.999 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:55.999 Malloc0 : 2.03 23741.41 23.18 0.00 0.00 10696.75 1852.10 12879.25 00:29:55.999 =================================================================================================================== 00:29:55.999 Total : 95064.83 92.84 0.00 0.00 10721.46 1852.10 16526.47' 00:29:55.999 00:24:42 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:29:55.999 00:24:42 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:29:55.999 00:24:42 
bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:29:55.999 00:24:42 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:55.999 [2024-07-16 00:24:42.344086] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:29:55.999 [2024-07-16 00:24:42.344162] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3657711 ] 00:29:55.999 [2024-07-16 00:24:42.493412] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:55.999 [2024-07-16 00:24:42.618469] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:55.999 cpumask for 'job0' is too big 00:29:55.999 cpumask for 'job1' is too big 00:29:55.999 cpumask for 'job2' is too big 00:29:55.999 cpumask for 'job3' is too big 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:29:58.522 Running I/O for 2 seconds... 
00:29:58.522 00:29:58.522 Latency(us) 00:29:58.522 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:58.522 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:58.522 Malloc0 : 2.02 23815.25 23.26 0.00 0.00 10734.50 1880.60 16526.47 00:29:58.522 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:58.522 Malloc0 : 2.02 23793.43 23.24 0.00 0.00 10720.26 1866.35 14588.88 00:29:58.522 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:58.522 Malloc0 : 2.02 23771.68 23.21 0.00 0.00 10705.09 1852.10 12708.29 00:29:58.522 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:58.522 Malloc0 : 2.03 23749.97 23.19 0.00 0.00 10690.67 1852.10 10998.65 00:29:58.522 =================================================================================================================== 00:29:58.522 Total : 95130.33 92.90 0.00 0.00 10712.63 1852.10 16526.47' 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:58.522 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job 
job1 write Malloc0 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:58.522 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:29:58.522 00:24:45 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:29:58.523 00:24:45 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:29:58.523 00:24:45 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:58.523 00:24:45 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:29:58.523 00:24:45 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:29:58.523 00:24:45 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:58.523 00:29:58.523 00:24:45 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:58.523 00:24:45 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:01.056 00:24:47 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-16 00:24:45.146003] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:30:01.056 [2024-07-16 00:24:45.146071] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3658070 ] 00:30:01.056 Using job config with 3 jobs 00:30:01.056 [2024-07-16 00:24:45.287444] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:01.056 [2024-07-16 00:24:45.409953] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:01.056 cpumask for '\''job0'\'' is too big 00:30:01.056 cpumask for '\''job1'\'' is too big 00:30:01.056 cpumask for '\''job2'\'' is too big 00:30:01.056 Running I/O for 2 seconds... 00:30:01.056 00:30:01.056 Latency(us) 00:30:01.056 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:01.056 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:01.056 Malloc0 : 2.01 32298.57 31.54 0.00 0.00 7909.78 1837.86 11682.50 00:30:01.056 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:01.056 Malloc0 : 2.02 32268.84 31.51 0.00 0.00 7899.39 1837.86 9858.89 00:30:01.056 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:01.056 Malloc0 : 2.02 32239.24 31.48 0.00 0.00 7889.24 1837.86 8206.25 00:30:01.056 =================================================================================================================== 00:30:01.056 Total : 96806.65 94.54 0.00 0.00 7899.47 1837.86 11682.50' 00:30:01.056 00:24:47 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-16 00:24:45.146003] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:30:01.056 [2024-07-16 00:24:45.146071] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3658070 ] 00:30:01.056 Using job config with 3 jobs 00:30:01.056 [2024-07-16 00:24:45.287444] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:01.056 [2024-07-16 00:24:45.409953] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:01.056 cpumask for '\''job0'\'' is too big 00:30:01.056 cpumask for '\''job1'\'' is too big 00:30:01.056 cpumask for '\''job2'\'' is too big 00:30:01.056 Running I/O for 2 seconds... 00:30:01.056 00:30:01.056 Latency(us) 00:30:01.056 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:01.056 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:01.056 Malloc0 : 2.01 32298.57 31.54 0.00 0.00 7909.78 1837.86 11682.50 00:30:01.056 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:01.056 Malloc0 : 2.02 32268.84 31.51 0.00 0.00 7899.39 1837.86 9858.89 00:30:01.056 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:01.056 Malloc0 : 2.02 32239.24 31.48 0.00 0.00 7889.24 1837.86 8206.25 00:30:01.056 =================================================================================================================== 00:30:01.056 Total : 96806.65 94.54 0.00 0.00 7899.47 1837.86 11682.50' 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-16 00:24:45.146003] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:30:01.057 [2024-07-16 00:24:45.146071] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3658070 ] 00:30:01.057 Using job config with 3 jobs 00:30:01.057 [2024-07-16 00:24:45.287444] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:01.057 [2024-07-16 00:24:45.409953] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:01.057 cpumask for '\''job0'\'' is too big 00:30:01.057 cpumask for '\''job1'\'' is too big 00:30:01.057 cpumask for '\''job2'\'' is too big 00:30:01.057 Running I/O for 2 seconds... 00:30:01.057 00:30:01.057 Latency(us) 00:30:01.057 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:01.057 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:01.057 Malloc0 : 2.01 32298.57 31.54 0.00 0.00 7909.78 1837.86 11682.50 00:30:01.057 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:01.057 Malloc0 : 2.02 32268.84 31.51 0.00 0.00 7899.39 1837.86 9858.89 00:30:01.057 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:01.057 Malloc0 : 2.02 32239.24 31.48 0.00 0.00 7889.24 1837.86 8206.25 00:30:01.057 =================================================================================================================== 00:30:01.057 Total : 96806.65 94.54 0.00 0.00 7899.47 1837.86 11682.50' 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:01.057 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:01.057 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:30:01.057 00:24:47 
bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:01.057 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:01.057 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:01.057 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:01.057 00:24:47 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:04.349 00:24:50 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-16 00:24:47.985167] 
Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:30:04.349 [2024-07-16 00:24:47.985299] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3658423 ] 00:30:04.349 Using job config with 4 jobs 00:30:04.349 [2024-07-16 00:24:48.202376] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:04.349 [2024-07-16 00:24:48.317817] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:04.349 cpumask for '\''job0'\'' is too big 00:30:04.349 cpumask for '\''job1'\'' is too big 00:30:04.349 cpumask for '\''job2'\'' is too big 00:30:04.349 cpumask for '\''job3'\'' is too big 00:30:04.349 Running I/O for 2 seconds... 00:30:04.349 00:30:04.349 Latency(us) 00:30:04.349 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:04.349 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc0 : 2.04 11912.55 11.63 0.00 0.00 21471.27 3875.17 33508.84 00:30:04.349 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc1 : 2.04 11901.42 11.62 0.00 0.00 21470.20 4673.00 33508.84 00:30:04.349 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc0 : 2.05 11890.63 11.61 0.00 0.00 21413.40 3818.18 29633.67 00:30:04.349 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc1 : 2.05 11879.62 11.60 0.00 0.00 21412.60 4644.51 29633.67 00:30:04.349 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc0 : 2.05 11868.90 11.59 0.00 0.00 21354.39 3818.18 25758.50 00:30:04.349 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 
Malloc1 : 2.05 11857.94 11.58 0.00 0.00 21354.84 4644.51 25644.52 00:30:04.349 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc0 : 2.05 11847.24 11.57 0.00 0.00 21296.62 4074.63 24162.84 00:30:04.349 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc1 : 2.05 11836.33 11.56 0.00 0.00 21294.08 4900.95 24048.86 00:30:04.349 =================================================================================================================== 00:30:04.349 Total : 94994.63 92.77 0.00 0.00 21383.42 3818.18 33508.84' 00:30:04.349 00:24:50 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-16 00:24:47.985167] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:30:04.349 [2024-07-16 00:24:47.985299] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3658423 ] 00:30:04.349 Using job config with 4 jobs 00:30:04.349 [2024-07-16 00:24:48.202376] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:04.349 [2024-07-16 00:24:48.317817] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:04.349 cpumask for '\''job0'\'' is too big 00:30:04.349 cpumask for '\''job1'\'' is too big 00:30:04.349 cpumask for '\''job2'\'' is too big 00:30:04.349 cpumask for '\''job3'\'' is too big 00:30:04.349 Running I/O for 2 seconds... 
00:30:04.349 00:30:04.349 Latency(us) 00:30:04.349 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:04.349 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc0 : 2.04 11912.55 11.63 0.00 0.00 21471.27 3875.17 33508.84 00:30:04.349 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc1 : 2.04 11901.42 11.62 0.00 0.00 21470.20 4673.00 33508.84 00:30:04.349 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc0 : 2.05 11890.63 11.61 0.00 0.00 21413.40 3818.18 29633.67 00:30:04.349 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc1 : 2.05 11879.62 11.60 0.00 0.00 21412.60 4644.51 29633.67 00:30:04.349 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc0 : 2.05 11868.90 11.59 0.00 0.00 21354.39 3818.18 25758.50 00:30:04.349 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc1 : 2.05 11857.94 11.58 0.00 0.00 21354.84 4644.51 25644.52 00:30:04.349 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc0 : 2.05 11847.24 11.57 0.00 0.00 21296.62 4074.63 24162.84 00:30:04.349 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc1 : 2.05 11836.33 11.56 0.00 0.00 21294.08 4900.95 24048.86 00:30:04.349 =================================================================================================================== 00:30:04.349 Total : 94994.63 92.77 0.00 0.00 21383.42 3818.18 33508.84' 00:30:04.349 00:24:50 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-16 00:24:47.985167] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:30:04.349 [2024-07-16 00:24:47.985299] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3658423 ] 00:30:04.349 Using job config with 4 jobs 00:30:04.349 [2024-07-16 00:24:48.202376] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:04.349 [2024-07-16 00:24:48.317817] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:04.349 cpumask for '\''job0'\'' is too big 00:30:04.349 cpumask for '\''job1'\'' is too big 00:30:04.349 cpumask for '\''job2'\'' is too big 00:30:04.349 cpumask for '\''job3'\'' is too big 00:30:04.349 Running I/O for 2 seconds... 00:30:04.349 00:30:04.349 Latency(us) 00:30:04.349 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:04.349 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc0 : 2.04 11912.55 11.63 0.00 0.00 21471.27 3875.17 33508.84 00:30:04.349 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc1 : 2.04 11901.42 11.62 0.00 0.00 21470.20 4673.00 33508.84 00:30:04.349 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc0 : 2.05 11890.63 11.61 0.00 0.00 21413.40 3818.18 29633.67 00:30:04.349 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc1 : 2.05 11879.62 11.60 0.00 0.00 21412.60 4644.51 29633.67 00:30:04.349 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc0 : 2.05 11868.90 11.59 0.00 0.00 21354.39 3818.18 25758.50 00:30:04.349 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc1 : 2.05 11857.94 11.58 0.00 0.00 21354.84 4644.51 25644.52 00:30:04.349 
Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc0 : 2.05 11847.24 11.57 0.00 0.00 21296.62 4074.63 24162.84 00:30:04.349 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:04.349 Malloc1 : 2.05 11836.33 11.56 0.00 0.00 21294.08 4900.95 24048.86 00:30:04.349 =================================================================================================================== 00:30:04.349 Total : 94994.63 92.77 0.00 0.00 21383.42 3818.18 33508.84' 00:30:04.349 00:24:50 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:30:04.349 00:24:50 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:30:04.349 00:24:50 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:30:04.349 00:24:50 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:30:04.349 00:24:50 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:04.349 00:24:50 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:30:04.349 00:30:04.349 real 0m11.432s 00:30:04.349 user 0m10.002s 00:30:04.349 sys 0m1.278s 00:30:04.349 00:24:50 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:04.349 00:24:50 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:30:04.349 ************************************ 00:30:04.349 END TEST bdevperf_config 00:30:04.349 ************************************ 00:30:04.349 00:24:50 -- common/autotest_common.sh@1142 -- # return 0 00:30:04.349 00:24:50 -- spdk/autotest.sh@192 -- # uname -s 00:30:04.349 00:24:50 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:30:04.349 00:24:50 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:30:04.349 00:24:50 -- common/autotest_common.sh@1099 
-- # '[' 2 -le 1 ']' 00:30:04.349 00:24:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:04.349 00:24:50 -- common/autotest_common.sh@10 -- # set +x 00:30:04.349 ************************************ 00:30:04.349 START TEST reactor_set_interrupt 00:30:04.349 ************************************ 00:30:04.349 00:24:50 reactor_set_interrupt -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:30:04.349 * Looking for test storage... 00:30:04.349 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:04.349 00:24:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:30:04.349 00:24:50 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:30:04.349 00:24:51 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:04.349 00:24:51 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:04.349 00:24:51 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:30:04.350 00:24:51 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:04.350 00:24:51 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:30:04.350 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:30:04.350 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:30:04.350 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:30:04.350 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:30:04.350 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:30:04.350 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:30:04.350 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:30:04.350 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:30:04.350 00:24:51 reactor_set_interrupt -- 
common/build_config.sh@8 -- # CONFIG_RBD=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:30:04.350 00:24:51 
reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:30:04.350 00:24:51 reactor_set_interrupt -- 
common/build_config.sh@48 -- # CONFIG_RDMA=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:30:04.350 00:24:51 
reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:30:04.350 00:24:51 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:30:04.350 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:04.350 00:24:51 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:04.350 00:24:51 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 
00:30:04.350 00:24:51 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:30:04.350 00:24:51 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:04.350 00:24:51 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:04.350 00:24:51 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:30:04.350 00:24:51 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:04.350 00:24:51 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:30:04.350 00:24:51 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:30:04.350 00:24:51 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:30:04.350 00:24:51 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:30:04.350 00:24:51 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:30:04.350 00:24:51 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:30:04.350 00:24:51 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:30:04.350 00:24:51 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:30:04.350 #define SPDK_CONFIG_H 00:30:04.350 #define SPDK_CONFIG_APPS 1 00:30:04.350 #define SPDK_CONFIG_ARCH native 00:30:04.350 #undef SPDK_CONFIG_ASAN 00:30:04.350 #undef SPDK_CONFIG_AVAHI 00:30:04.350 #undef SPDK_CONFIG_CET 00:30:04.350 #define SPDK_CONFIG_COVERAGE 1 00:30:04.350 #define SPDK_CONFIG_CROSS_PREFIX 
00:30:04.350 #define SPDK_CONFIG_CRYPTO 1 00:30:04.350 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:30:04.350 #undef SPDK_CONFIG_CUSTOMOCF 00:30:04.350 #undef SPDK_CONFIG_DAOS 00:30:04.350 #define SPDK_CONFIG_DAOS_DIR 00:30:04.350 #define SPDK_CONFIG_DEBUG 1 00:30:04.350 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:30:04.350 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:04.350 #define SPDK_CONFIG_DPDK_INC_DIR 00:30:04.350 #define SPDK_CONFIG_DPDK_LIB_DIR 00:30:04.350 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:30:04.350 #undef SPDK_CONFIG_DPDK_UADK 00:30:04.350 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:04.350 #define SPDK_CONFIG_EXAMPLES 1 00:30:04.350 #undef SPDK_CONFIG_FC 00:30:04.350 #define SPDK_CONFIG_FC_PATH 00:30:04.350 #define SPDK_CONFIG_FIO_PLUGIN 1 00:30:04.350 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:30:04.350 #undef SPDK_CONFIG_FUSE 00:30:04.350 #undef SPDK_CONFIG_FUZZER 00:30:04.350 #define SPDK_CONFIG_FUZZER_LIB 00:30:04.350 #undef SPDK_CONFIG_GOLANG 00:30:04.350 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:30:04.351 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:30:04.351 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:30:04.351 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:30:04.351 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:30:04.351 #undef SPDK_CONFIG_HAVE_LIBBSD 00:30:04.351 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:30:04.351 #define SPDK_CONFIG_IDXD 1 00:30:04.351 #define SPDK_CONFIG_IDXD_KERNEL 1 00:30:04.351 #define SPDK_CONFIG_IPSEC_MB 1 00:30:04.351 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:04.351 #define SPDK_CONFIG_ISAL 1 00:30:04.351 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:30:04.351 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:30:04.351 #define SPDK_CONFIG_LIBDIR 00:30:04.351 #undef SPDK_CONFIG_LTO 00:30:04.351 #define SPDK_CONFIG_MAX_LCORES 128 00:30:04.351 #define SPDK_CONFIG_NVME_CUSE 1 00:30:04.351 #undef 
SPDK_CONFIG_OCF 00:30:04.351 #define SPDK_CONFIG_OCF_PATH 00:30:04.351 #define SPDK_CONFIG_OPENSSL_PATH 00:30:04.351 #undef SPDK_CONFIG_PGO_CAPTURE 00:30:04.351 #define SPDK_CONFIG_PGO_DIR 00:30:04.351 #undef SPDK_CONFIG_PGO_USE 00:30:04.351 #define SPDK_CONFIG_PREFIX /usr/local 00:30:04.351 #undef SPDK_CONFIG_RAID5F 00:30:04.351 #undef SPDK_CONFIG_RBD 00:30:04.351 #define SPDK_CONFIG_RDMA 1 00:30:04.351 #define SPDK_CONFIG_RDMA_PROV verbs 00:30:04.351 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:30:04.351 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:30:04.351 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:30:04.351 #define SPDK_CONFIG_SHARED 1 00:30:04.351 #undef SPDK_CONFIG_SMA 00:30:04.351 #define SPDK_CONFIG_TESTS 1 00:30:04.351 #undef SPDK_CONFIG_TSAN 00:30:04.351 #define SPDK_CONFIG_UBLK 1 00:30:04.351 #define SPDK_CONFIG_UBSAN 1 00:30:04.351 #undef SPDK_CONFIG_UNIT_TESTS 00:30:04.351 #undef SPDK_CONFIG_URING 00:30:04.351 #define SPDK_CONFIG_URING_PATH 00:30:04.351 #undef SPDK_CONFIG_URING_ZNS 00:30:04.351 #undef SPDK_CONFIG_USDT 00:30:04.351 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:30:04.351 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:30:04.351 #undef SPDK_CONFIG_VFIO_USER 00:30:04.351 #define SPDK_CONFIG_VFIO_USER_DIR 00:30:04.351 #define SPDK_CONFIG_VHOST 1 00:30:04.351 #define SPDK_CONFIG_VIRTIO 1 00:30:04.351 #undef SPDK_CONFIG_VTUNE 00:30:04.351 #define SPDK_CONFIG_VTUNE_DIR 00:30:04.351 #define SPDK_CONFIG_WERROR 1 00:30:04.351 #define SPDK_CONFIG_WPDK_DIR 00:30:04.351 #undef SPDK_CONFIG_XNVME 00:30:04.351 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:30:04.351 00:24:51 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:04.351 00:24:51 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 
00:30:04.351 00:24:51 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:04.351 00:24:51 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:04.351 00:24:51 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:04.351 00:24:51 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:04.351 00:24:51 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:04.351 00:24:51 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:30:04.351 00:24:51 reactor_set_interrupt -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:30:04.351 00:24:51 
reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:30:04.351 00:24:51 reactor_set_interrupt -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 
00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 
0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:30:04.351 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- 
common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:30:04.352 00:24:51 
reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:30:04.352 
00:24:51 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:04.352 00:24:51 reactor_set_interrupt -- 
common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@238 -- # 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:04.352 00:24:51 reactor_set_interrupt -- 
common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:30:04.352 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@299 -- # 
TEST_MODE= 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 3658861 ]] 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 3658861 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.sl3PsA 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.sl3PsA/tests/interrupt /tmp/spdk.sl3PsA 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs 
size use avail _ mount 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4338139136 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=88574316544 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508515328 00:30:04.353 00:24:51 
reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=5934198784 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47249547264 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=18892300288 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901704704 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9404416 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47253352448 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:04.353 00:24:51 reactor_set_interrupt -- 
common/autotest_common.sh@363 -- # uses["$mount"]=905216 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9450844160 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450848256 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:30:04.353 * Looking for test storage... 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=88574316544 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:30:04.353 00:24:51 reactor_set_interrupt 
-- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=8148791296 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:04.353 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:30:04.353 00:24:51 reactor_set_interrupt 
-- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:30:04.353 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:30:04.354 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:30:04.354 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:30:04.354 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:30:04.354 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:30:04.354 00:24:51 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:30:04.354 00:24:51 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:04.354 00:24:51 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:30:04.354 00:24:51 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:30:04.354 00:24:51 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:30:04.354 00:24:51 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:30:04.354 00:24:51 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:30:04.354 00:24:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:04.354 00:24:51 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:04.354 00:24:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:30:04.354 00:24:51 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:04.354 00:24:51 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:04.354 00:24:51 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=3659000 00:30:04.354 00:24:51 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:04.354 00:24:51 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:04.354 00:24:51 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 3659000 /var/tmp/spdk.sock 00:30:04.354 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 3659000 ']' 00:30:04.354 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:04.354 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:04.354 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:04.354 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:30:04.354 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:04.354 00:24:51 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:30:04.354 [2024-07-16 00:24:51.223820] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:30:04.354 [2024-07-16 00:24:51.223890] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3659000 ] 00:30:04.613 [2024-07-16 00:24:51.355757] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:04.613 [2024-07-16 00:24:51.459528] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:04.613 [2024-07-16 00:24:51.459630] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:04.613 [2024-07-16 00:24:51.459630] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:04.613 [2024-07-16 00:24:51.532580] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:30:05.550 00:24:52 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:05.550 00:24:52 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:30:05.550 00:24:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:30:05.550 00:24:52 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:05.550 Malloc0 00:30:05.550 Malloc1 00:30:05.550 Malloc2 00:30:05.550 00:24:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:30:05.550 00:24:52 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:30:05.550 00:24:52 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:05.550 00:24:52 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:05.550 5000+0 records in 00:30:05.550 5000+0 records out 00:30:05.550 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0240954 s, 425 MB/s 00:30:05.550 00:24:52 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:05.809 AIO0 00:30:05.809 00:24:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 3659000 00:30:05.809 00:24:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 3659000 without_thd 00:30:05.809 00:24:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=3659000 00:30:05.809 00:24:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:30:05.809 00:24:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 
00:30:05.809 00:24:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:30:05.809 00:24:52 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:30:05.809 00:24:52 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:05.809 00:24:52 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:30:05.809 00:24:52 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:05.809 00:24:52 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:05.809 00:24:52 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:06.069 00:24:52 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:30:06.069 00:24:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:30:06.069 00:24:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:30:06.069 00:24:52 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:30:06.069 00:24:52 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:06.069 00:24:52 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:30:06.069 00:24:52 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:06.069 00:24:52 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:06.069 00:24:52 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:06.329 00:24:53 reactor_set_interrupt -- interrupt/common.sh@62 -- # 
echo '' 00:30:06.329 00:24:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:30:06.329 00:24:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:30:06.329 spdk_thread ids are 1 on reactor0. 00:30:06.329 00:24:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:06.329 00:24:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 3659000 0 00:30:06.329 00:24:53 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3659000 0 idle 00:30:06.329 00:24:53 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3659000 00:30:06.329 00:24:53 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:06.329 00:24:53 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:06.329 00:24:53 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:06.329 00:24:53 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:06.329 00:24:53 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:06.329 00:24:53 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:06.329 00:24:53 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:06.329 00:24:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3659000 -w 256 00:30:06.329 00:24:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:06.587 00:24:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3659000 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.41 reactor_0' 00:30:06.587 00:24:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3659000 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.41 reactor_0 00:30:06.587 00:24:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:06.587 00:24:53 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:30:06.587 00:24:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:06.587 00:24:53 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:06.587 00:24:53 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:06.587 00:24:53 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:06.587 00:24:53 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:06.587 00:24:53 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:06.587 00:24:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:06.587 00:24:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 3659000 1 00:30:06.588 00:24:53 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3659000 1 idle 00:30:06.588 00:24:53 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3659000 00:30:06.588 00:24:53 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:30:06.588 00:24:53 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:06.588 00:24:53 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:06.588 00:24:53 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:06.588 00:24:53 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:06.588 00:24:53 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:06.588 00:24:53 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:06.588 00:24:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3659000 -w 256 00:30:06.588 00:24:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3659021 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 
00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3659021 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 3659000 2 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3659000 2 idle 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3659000 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:06.845 00:24:53 reactor_set_interrupt -- 
interrupt/common.sh@24 -- # top -bHn 1 -p 3659000 -w 256 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3659022 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3659022 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:30:06.845 00:24:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:30:07.104 [2024-07-16 00:24:54.000422] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:30:07.104 00:24:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:30:07.362 [2024-07-16 00:24:54.256210] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:30:07.362 [2024-07-16 00:24:54.256627] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:07.362 00:24:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:30:07.621 [2024-07-16 00:24:54.508123] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:30:07.621 [2024-07-16 00:24:54.508370] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:07.621 00:24:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:07.621 00:24:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 3659000 0 00:30:07.621 00:24:54 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 3659000 0 busy 00:30:07.621 00:24:54 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3659000 00:30:07.621 00:24:54 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:07.621 00:24:54 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:07.621 00:24:54 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:07.621 00:24:54 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:07.621 00:24:54 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:07.621 00:24:54 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:07.621 00:24:54 reactor_set_interrupt 
-- interrupt/common.sh@24 -- # top -bHn 1 -p 3659000 -w 256 00:30:07.621 00:24:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:07.879 00:24:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3659000 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.85 reactor_0' 00:30:07.879 00:24:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:07.879 00:24:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3659000 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.85 reactor_0 00:30:07.879 00:24:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:07.879 00:24:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:07.880 00:24:54 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:07.880 00:24:54 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:07.880 00:24:54 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:07.880 00:24:54 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:07.880 00:24:54 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:07.880 00:24:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:07.880 00:24:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 3659000 2 00:30:07.880 00:24:54 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 3659000 2 busy 00:30:07.880 00:24:54 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3659000 00:30:07.880 00:24:54 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:07.880 00:24:54 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:07.880 00:24:54 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:07.880 00:24:54 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:07.880 00:24:54 
reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:07.880 00:24:54 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:07.880 00:24:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3659000 -w 256 00:30:07.880 00:24:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:08.139 00:24:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3659022 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2' 00:30:08.139 00:24:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3659022 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2 00:30:08.139 00:24:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:08.139 00:24:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:08.139 00:24:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:08.139 00:24:54 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:08.139 00:24:54 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:08.140 00:24:54 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:08.140 00:24:54 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:08.140 00:24:54 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:08.140 00:24:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:30:08.397 [2024-07-16 00:24:55.112100] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:30:08.397 [2024-07-16 00:24:55.112252] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 3659000 2 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3659000 2 idle 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3659000 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3659000 -w 256 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3659022 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2' 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3659022 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:08.397 00:24:55 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:08.397 00:24:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:30:08.669 [2024-07-16 00:24:55.492102] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:30:08.669 [2024-07-16 00:24:55.492266] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:08.669 00:24:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:30:08.669 00:24:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:30:08.669 00:24:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:30:08.944 [2024-07-16 00:24:55.756465] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:30:08.944 00:24:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 3659000 0 00:30:08.944 00:24:55 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3659000 0 idle 00:30:08.944 00:24:55 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3659000 00:30:08.944 00:24:55 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:08.944 00:24:55 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:08.944 00:24:55 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:08.944 00:24:55 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:08.944 00:24:55 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:08.944 00:24:55 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:08.944 00:24:55 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:08.944 00:24:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3659000 -w 256 00:30:08.944 00:24:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:09.203 00:24:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3659000 root 20 0 128.2g 36864 23616 S 6.7 0.0 0:01.65 reactor_0' 00:30:09.203 00:24:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3659000 root 20 0 128.2g 36864 23616 S 6.7 0.0 0:01.65 reactor_0 00:30:09.203 00:24:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:09.203 00:24:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:09.203 00:24:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:30:09.203 00:24:55 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:30:09.203 00:24:55 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:09.203 00:24:55 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = 
\i\d\l\e ]] 00:30:09.203 00:24:55 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:30:09.203 00:24:55 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:09.203 00:24:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:30:09.203 00:24:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:30:09.203 00:24:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:30:09.203 00:24:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 3659000 00:30:09.203 00:24:55 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 3659000 ']' 00:30:09.203 00:24:55 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 3659000 00:30:09.203 00:24:55 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:30:09.203 00:24:55 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:09.203 00:24:55 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3659000 00:30:09.203 00:24:56 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:09.203 00:24:56 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:09.203 00:24:56 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3659000' 00:30:09.203 killing process with pid 3659000 00:30:09.203 00:24:56 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 3659000 00:30:09.203 00:24:56 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 3659000 00:30:09.463 00:24:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:30:09.463 00:24:56 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:09.463 00:24:56 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:30:09.463 00:24:56 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:09.463 00:24:56 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:09.463 00:24:56 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=3659632 00:30:09.463 00:24:56 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:09.463 00:24:56 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:09.463 00:24:56 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 3659632 /var/tmp/spdk.sock 00:30:09.463 00:24:56 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 3659632 ']' 00:30:09.463 00:24:56 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:09.463 00:24:56 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:09.463 00:24:56 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:09.463 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:09.463 00:24:56 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:09.463 00:24:56 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:30:09.463 [2024-07-16 00:24:56.365022] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:30:09.463 [2024-07-16 00:24:56.365161] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3659632 ] 00:30:09.722 [2024-07-16 00:24:56.564878] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:09.981 [2024-07-16 00:24:56.673712] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:09.981 [2024-07-16 00:24:56.673753] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:09.981 [2024-07-16 00:24:56.673755] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:09.981 [2024-07-16 00:24:56.756306] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:30:10.546 00:24:57 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:10.546 00:24:57 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:30:10.546 00:24:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:30:10.546 00:24:57 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:11.112 Malloc0 00:30:11.112 Malloc1 00:30:11.112 Malloc2 00:30:11.112 00:24:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:30:11.112 00:24:57 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:30:11.112 00:24:57 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:11.112 00:24:57 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:11.112 5000+0 records in 00:30:11.112 5000+0 records out 00:30:11.112 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0253392 s, 404 MB/s 
00:30:11.113 00:24:57 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048
00:30:11.370 AIO0
00:30:11.370 00:24:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 3659632
00:30:11.370 00:24:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 3659632
00:30:11.370 00:24:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=3659632
00:30:11.370 00:24:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=
00:30:11.370 00:24:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask))
00:30:11.370 00:24:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1
00:30:11.370 00:24:58 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1
00:30:11.370 00:24:58 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str
00:30:11.370 00:24:58 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1
00:30:11.370 00:24:58 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:30:11.370 00:24:58 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats
00:30:11.370 00:24:58 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:30:11.629 00:24:58 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1
00:30:11.629 00:24:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask))
00:30:11.629 00:24:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4
00:30:11.629 00:24:58 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4
00:30:11.629 00:24:58 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str
00:30:11.629 00:24:58 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4
00:30:11.629 00:24:58 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:30:11.629 00:24:58 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats
00:30:11.629 00:24:58 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:30:11.888 00:24:58 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo ''
00:30:11.888 00:24:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]]
00:30:11.888 00:24:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.'
00:30:11.888 spdk_thread ids are 1 on reactor0.
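The trace above pulls spdk_thread ids out of the `rpc.py thread_get_stats` JSON with a jq filter keyed on the reactor's cpumask (with the `0x` prefix stripped before comparison). A minimal Python sketch of the same selection; the field names `threads`, `cpumask`, and `id` come from the jq filter in the trace, while the sample payload values are illustrative only, not actual RPC output:

```python
import json

def reactor_thread_ids(stats_json: str, cpumask: str) -> list:
    # Mimics: jq --arg reactor_cpumask N '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
    stats = json.loads(stats_json)
    return [t["id"] for t in stats["threads"] if t["cpumask"] == cpumask]

# Illustrative payload shaped the way the jq filter expects (hypothetical values).
sample = json.dumps({"threads": [
    {"id": 1, "cpumask": "1"},  # app_thread on reactor 0 (mask 0x1, "0x" stripped)
    {"id": 2, "cpumask": "2"},
]})
print(reactor_thread_ids(sample, "1"))  # reactor 0 -> [1]
print(reactor_thread_ids(sample, "4"))  # reactor 2 -> [] (no thread, matching the empty echo above)
```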
00:30:11.888 00:24:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2}
00:30:11.888 00:24:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 3659632 0
00:30:11.888 00:24:58 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3659632 0 idle
00:30:11.888 00:24:58 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3659632
00:30:11.888 00:24:58 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0
00:30:11.888 00:24:58 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:30:11.888 00:24:58 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:30:11.888 00:24:58 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:30:11.888 00:24:58 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:30:11.888 00:24:58 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:30:11.888 00:24:58 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:30:11.888 00:24:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3659632 -w 256
00:30:11.888 00:24:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3659632 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.50 reactor_0'
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3659632 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.50 reactor_0
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2}
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 3659632 1
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3659632 1 idle
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3659632
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3659632 -w 256
00:30:12.147 00:24:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3659664 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1'
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3659664 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2}
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 3659632 2
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3659632 2 idle
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3659632
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3659632 -w 256
00:30:12.147 00:24:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2
00:30:12.406 00:24:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3659666 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2'
00:30:12.406 00:24:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3659666 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2
00:30:12.406 00:24:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:30:12.406 00:24:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:30:12.406 00:24:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:30:12.406 00:24:59 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:30:12.406 00:24:59 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:30:12.406 00:24:59 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:30:12.406 00:24:59 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:30:12.406 00:24:59 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:30:12.406 00:24:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']'
00:30:12.406 00:24:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d
00:30:12.663 [2024-07-16 00:24:59.482502] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0.
00:30:12.663 [2024-07-16 00:24:59.482713] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode.
00:30:12.663 [2024-07-16 00:24:59.482836] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch
00:30:12.663 00:24:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d
00:30:12.921 [2024-07-16 00:24:59.678931] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2.
00:30:12.921 [2024-07-16 00:24:59.679114] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch
00:30:12.921 00:24:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2
00:30:12.921 00:24:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 3659632 0
00:30:12.921 00:24:59 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 3659632 0 busy
00:30:12.921 00:24:59 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3659632
00:30:12.921 00:24:59 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0
00:30:12.921 00:24:59 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy
00:30:12.921 00:24:59 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]]
00:30:12.921 00:24:59 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:30:12.921 00:24:59 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:30:12.921 00:24:59 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:30:12.921 00:24:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3659632 -w 256
00:30:12.921 00:24:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3659632 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.90 reactor_0'
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3659632 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.90 reactor_0
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]]
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]]
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]]
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 3659632 2
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 3659632 2 busy
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3659632
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]]
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3659632 -w 256
00:30:13.178 00:24:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2
00:30:13.178 00:25:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3659666 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.37 reactor_2'
00:30:13.178 00:25:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3659666 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.37 reactor_2
00:30:13.178 00:25:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:30:13.178 00:25:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:30:13.178 00:25:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9
00:30:13.178 00:25:00 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99
00:30:13.178 00:25:00 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]]
00:30:13.178 00:25:00 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]]
00:30:13.178 00:25:00 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]]
00:30:13.178 00:25:00 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:30:13.178 00:25:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2
00:30:13.435 [2024-07-16 00:25:00.300715] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2.
00:30:13.435 [2024-07-16 00:25:00.300856] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch
00:30:13.435 00:25:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']'
00:30:13.435 00:25:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 3659632 2
00:30:13.435 00:25:00 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3659632 2 idle
00:30:13.435 00:25:00 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3659632
00:30:13.435 00:25:00 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2
00:30:13.435 00:25:00 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:30:13.435 00:25:00 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:30:13.435 00:25:00 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:30:13.435 00:25:00 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:30:13.435 00:25:00 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:30:13.435 00:25:00 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:30:13.435 00:25:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3659632 -w 256
00:30:13.435 00:25:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2
00:30:13.692 00:25:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3659666 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.61 reactor_2'
00:30:13.692 00:25:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3659666 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.61 reactor_2
00:30:13.692 00:25:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:30:13.692 00:25:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:30:13.692 00:25:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:30:13.692 00:25:00 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:30:13.692 00:25:00 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:30:13.692 00:25:00 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:30:13.692 00:25:00 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:30:13.692 00:25:00 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:30:13.692 00:25:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0
00:30:13.950 [2024-07-16 00:25:00.729809] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0.
00:30:13.950 [2024-07-16 00:25:00.730026] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode.
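The busy/idle checks traced above grab the reactor thread's row from `top -bHn 1`, take field 9 (%CPU) with `awk '{print $9}'`, truncate it to an integer, and compare against thresholds that can be read off the trace: `busy` fails on `[[ cpu_rate -lt 70 ]]` and `idle` fails on `[[ cpu_rate -gt 30 ]]`. A minimal Python sketch of that classification; the thresholds and field position are inferred from this trace, not taken from the actual interrupt/common.sh source:

```python
def classify_reactor(top_line: str, state: str) -> bool:
    # Field 9 of a 'top -bHn 1' row is %CPU; mirror awk '{print $9}' then int truncation.
    cpu_rate = int(float(top_line.split()[8]))
    if state == "busy":
        return cpu_rate >= 70   # trace: [[ 99 -lt 70 ]] must fail
    if state == "idle":
        return cpu_rate <= 30   # trace: [[ 0 -gt 30 ]] must fail
    raise ValueError(f"unknown state: {state}")

# Rows copied from the trace above.
busy_row = "3659632 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.90 reactor_0"
idle_row = "3659666 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.61 reactor_2"
print(classify_reactor(busy_row, "busy"))  # True
print(classify_reactor(idle_row, "idle"))  # True
```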
00:30:13.950 [2024-07-16 00:25:00.730053] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch
00:30:13.950 00:25:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']'
00:30:13.950 00:25:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 3659632 0
00:30:13.950 00:25:00 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3659632 0 idle
00:30:13.950 00:25:00 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3659632
00:30:13.950 00:25:00 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0
00:30:13.950 00:25:00 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:30:13.950 00:25:00 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:30:13.950 00:25:00 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:30:13.950 00:25:00 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:30:13.950 00:25:00 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:30:13.950 00:25:00 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:30:13.950 00:25:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3659632 -w 256
00:30:13.950 00:25:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0
00:30:14.208 00:25:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3659632 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.75 reactor_0'
00:30:14.208 00:25:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:30:14.208 00:25:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3659632 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.75 reactor_0
00:30:14.208 00:25:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:30:14.208 00:25:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:30:14.208 00:25:00 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:30:14.208 00:25:00 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:30:14.208 00:25:00 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:30:14.208 00:25:00 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:30:14.208 00:25:00 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:30:14.208 00:25:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0
00:30:14.208 00:25:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0
00:30:14.208 00:25:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT
00:30:14.208 00:25:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 3659632
00:30:14.208 00:25:00 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 3659632 ']'
00:30:14.208 00:25:00 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 3659632
00:30:14.208 00:25:00 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname
00:30:14.208 00:25:00 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:30:14.208 00:25:00 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3659632
00:30:14.208 00:25:00 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:30:14.208 00:25:00 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:30:14.208 00:25:00 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3659632'
00:30:14.208 killing process with pid 3659632
00:30:14.208 00:25:00 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 3659632
00:30:14.208 00:25:00 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 3659632
00:30:14.467 00:25:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup
00:30:14.467 00:25:01 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile
00:30:14.467
00:30:14.467 real 0m10.386s
00:30:14.467 user 0m9.725s
00:30:14.467 sys 0m2.268s
00:30:14.467 00:25:01 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable
00:30:14.467 00:25:01 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x
00:30:14.467 ************************************
00:30:14.467 END TEST reactor_set_interrupt
00:30:14.467 ************************************
00:30:14.467 00:25:01 -- common/autotest_common.sh@1142 -- # return 0
00:30:14.467 00:25:01 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh
00:30:14.467 00:25:01 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:30:14.467 00:25:01 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:30:14.467 00:25:01 -- common/autotest_common.sh@10 -- # set +x
00:30:14.467 ************************************
00:30:14.467 START TEST reap_unregistered_poller
00:30:14.467 ************************************
00:30:14.467 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh
00:30:14.728 * Looking for test storage...
00:30:14.728 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt
00:30:14.728 00:25:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh
00:30:14.728 00:25:01 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh
00:30:14.728 00:25:01 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt
00:30:14.728 00:25:01 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt
00:30:14.728 00:25:01 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../..
00:30:14.728 00:25:01 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk
00:30:14.728 00:25:01 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh
00:30:14.728 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd
00:30:14.728 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e
00:30:14.728 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob
00:30:14.728 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob
00:30:14.729 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit
00:30:14.729 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']'
00:30:14.729 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]]
00:30:14.729 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR=
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR=
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR=
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH=
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH=
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR=
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR=
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH=
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR=
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH=
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR=
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX=
00:30:14.729 00:25:01 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n
00:30:14.729 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh
00:30:14.729 00:25:01 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh
00:30:14.729 00:25:01 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common
00:30:14.729 00:25:01 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common
00:30:14.729 00:25:01 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk
00:30:14.729 00:25:01 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin
00:30:14.729 00:25:01 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app
00:30:14.729 00:25:01 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples
00:30:14.729 00:25:01 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz")
00:30:14.729 00:25:01 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt")
00:30:14.729 00:25:01 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt")
00:30:14.729 00:25:01 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost")
00:30:14.729 00:25:01 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd")
00:30:14.729 00:25:01 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt")
00:30:14.729 00:25:01 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]]
00:30:14.729 00:25:01 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H
00:30:14.729 #define SPDK_CONFIG_H
00:30:14.729 #define SPDK_CONFIG_APPS 1
00:30:14.729 #define SPDK_CONFIG_ARCH native
00:30:14.729 #undef SPDK_CONFIG_ASAN
00:30:14.729 #undef SPDK_CONFIG_AVAHI
00:30:14.729 #undef SPDK_CONFIG_CET
00:30:14.729 #define SPDK_CONFIG_COVERAGE 1
00:30:14.729 #define SPDK_CONFIG_CROSS_PREFIX
00:30:14.729 #define SPDK_CONFIG_CRYPTO 1
00:30:14.729 #define SPDK_CONFIG_CRYPTO_MLX5 1
00:30:14.729 #undef SPDK_CONFIG_CUSTOMOCF
00:30:14.729 #undef SPDK_CONFIG_DAOS
00:30:14.729 #define SPDK_CONFIG_DAOS_DIR
00:30:14.729 #define SPDK_CONFIG_DEBUG 1
00:30:14.729 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1
00:30:14.729 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:30:14.729 #define SPDK_CONFIG_DPDK_INC_DIR
00:30:14.729 #define SPDK_CONFIG_DPDK_LIB_DIR
00:30:14.729 #undef SPDK_CONFIG_DPDK_PKG_CONFIG
00:30:14.729 #undef SPDK_CONFIG_DPDK_UADK
00:30:14.729 #define SPDK_CONFIG_ENV
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:14.730 #define SPDK_CONFIG_EXAMPLES 1 00:30:14.730 #undef SPDK_CONFIG_FC 00:30:14.730 #define SPDK_CONFIG_FC_PATH 00:30:14.730 #define SPDK_CONFIG_FIO_PLUGIN 1 00:30:14.730 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:30:14.730 #undef SPDK_CONFIG_FUSE 00:30:14.730 #undef SPDK_CONFIG_FUZZER 00:30:14.730 #define SPDK_CONFIG_FUZZER_LIB 00:30:14.730 #undef SPDK_CONFIG_GOLANG 00:30:14.730 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:30:14.730 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:30:14.730 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:30:14.730 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:30:14.730 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:30:14.730 #undef SPDK_CONFIG_HAVE_LIBBSD 00:30:14.730 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:30:14.730 #define SPDK_CONFIG_IDXD 1 00:30:14.730 #define SPDK_CONFIG_IDXD_KERNEL 1 00:30:14.730 #define SPDK_CONFIG_IPSEC_MB 1 00:30:14.730 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:14.730 #define SPDK_CONFIG_ISAL 1 00:30:14.730 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:30:14.730 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:30:14.730 #define SPDK_CONFIG_LIBDIR 00:30:14.730 #undef SPDK_CONFIG_LTO 00:30:14.730 #define SPDK_CONFIG_MAX_LCORES 128 00:30:14.730 #define SPDK_CONFIG_NVME_CUSE 1 00:30:14.730 #undef SPDK_CONFIG_OCF 00:30:14.730 #define SPDK_CONFIG_OCF_PATH 00:30:14.730 #define SPDK_CONFIG_OPENSSL_PATH 00:30:14.730 #undef SPDK_CONFIG_PGO_CAPTURE 00:30:14.730 #define SPDK_CONFIG_PGO_DIR 00:30:14.730 #undef SPDK_CONFIG_PGO_USE 00:30:14.730 #define SPDK_CONFIG_PREFIX /usr/local 00:30:14.730 #undef SPDK_CONFIG_RAID5F 00:30:14.730 #undef SPDK_CONFIG_RBD 00:30:14.730 #define SPDK_CONFIG_RDMA 1 00:30:14.730 #define SPDK_CONFIG_RDMA_PROV verbs 00:30:14.730 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:30:14.730 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:30:14.730 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:30:14.730 #define 
SPDK_CONFIG_SHARED 1 00:30:14.730 #undef SPDK_CONFIG_SMA 00:30:14.730 #define SPDK_CONFIG_TESTS 1 00:30:14.730 #undef SPDK_CONFIG_TSAN 00:30:14.730 #define SPDK_CONFIG_UBLK 1 00:30:14.730 #define SPDK_CONFIG_UBSAN 1 00:30:14.730 #undef SPDK_CONFIG_UNIT_TESTS 00:30:14.730 #undef SPDK_CONFIG_URING 00:30:14.730 #define SPDK_CONFIG_URING_PATH 00:30:14.730 #undef SPDK_CONFIG_URING_ZNS 00:30:14.730 #undef SPDK_CONFIG_USDT 00:30:14.730 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:30:14.730 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:30:14.730 #undef SPDK_CONFIG_VFIO_USER 00:30:14.730 #define SPDK_CONFIG_VFIO_USER_DIR 00:30:14.730 #define SPDK_CONFIG_VHOST 1 00:30:14.730 #define SPDK_CONFIG_VIRTIO 1 00:30:14.730 #undef SPDK_CONFIG_VTUNE 00:30:14.730 #define SPDK_CONFIG_VTUNE_DIR 00:30:14.730 #define SPDK_CONFIG_WERROR 1 00:30:14.730 #define SPDK_CONFIG_WPDK_DIR 00:30:14.730 #undef SPDK_CONFIG_XNVME 00:30:14.730 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:30:14.730 00:25:01 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:14.730 00:25:01 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:14.730 00:25:01 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:14.730 00:25:01 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:14.730 00:25:01 reap_unregistered_poller -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:14.730 00:25:01 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:14.730 00:25:01 reap_unregistered_poller -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:14.730 00:25:01 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:30:14.730 00:25:01 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:30:14.730 00:25:01 reap_unregistered_poller -- 
pm/common@76 -- # SUDO[0]= 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:30:14.730 00:25:01 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:30:14.730 00:25:01 
reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:30:14.730 00:25:01 
reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:30:14.730 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:30:14.731 00:25:01 
reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@126 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:30:14.731 00:25:01 
reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:30:14.731 00:25:01 reap_unregistered_poller 
-- common/autotest_common.sh@150 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:30:14.731 00:25:01 reap_unregistered_poller -- 
common/autotest_common.sh@171 -- # : 0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@178 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export 
ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export 
SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:14.731 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:30:14.732 00:25:01 
reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 3660428 ]] 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 3660428 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local 
requested_size=2147483648 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.wm27Ep 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.wm27Ep/tests/interrupt /tmp/spdk.wm27Ep 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:30:14.732 00:25:01 
reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4338139136 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=88574169088 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508515328 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=5934346240 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@361 -- # 
fss["$mount"]=tmpfs 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47249547264 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=18892300288 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901704704 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9404416 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47253352448 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=905216 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:14.732 00:25:01 reap_unregistered_poller -- 
common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9450844160 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450848256 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:30:14.732 * Looking for test storage... 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=88574169088 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:30:14.732 00:25:01 
reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=8148938752 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:14.732 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:14.732 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 
00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:30:14.733 00:25:01 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:30:14.733 00:25:01 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:14.733 00:25:01 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:30:14.733 00:25:01 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:30:14.733 00:25:01 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:30:14.733 00:25:01 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:30:14.733 00:25:01 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:30:14.733 00:25:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:14.733 00:25:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:14.733 00:25:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:30:14.733 00:25:01 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:14.733 00:25:01 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:14.733 00:25:01 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=3660473 00:30:14.733 00:25:01 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:14.733 00:25:01 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 3660473 /var/tmp/spdk.sock 00:30:14.733 00:25:01 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 3660473 ']' 00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:14.733 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:14.733 00:25:01 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:14.992 [2024-07-16 00:25:01.699019] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:30:14.992 [2024-07-16 00:25:01.699091] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3660473 ] 00:30:14.992 [2024-07-16 00:25:01.827506] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:14.992 [2024-07-16 00:25:01.934449] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:14.992 [2024-07-16 00:25:01.934550] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:14.992 [2024-07-16 00:25:01.934551] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:15.251 [2024-07-16 00:25:02.006298] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:30:15.821 00:25:02 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:15.821 00:25:02 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:30:15.821 00:25:02 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:30:15.821 00:25:02 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:30:15.821 00:25:02 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:15.821 00:25:02 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:15.821 00:25:02 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:15.821 00:25:02 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:30:15.821 "name": "app_thread", 00:30:15.821 "id": 1, 00:30:15.821 "active_pollers": [], 00:30:15.821 "timed_pollers": [ 00:30:15.821 { 00:30:15.821 "name": "rpc_subsystem_poll_servers", 00:30:15.821 "id": 1, 00:30:15.821 "state": "waiting", 00:30:15.821 "run_count": 0, 00:30:15.821 "busy_count": 0, 00:30:15.821 "period_ticks": 9200000 00:30:15.821 } 00:30:15.821 ], 00:30:15.821 "paused_pollers": [] 00:30:15.821 }' 00:30:15.821 00:25:02 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:30:15.821 00:25:02 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:30:15.821 00:25:02 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:30:15.821 00:25:02 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:30:16.080 00:25:02 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:30:16.080 00:25:02 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:30:16.080 
00:25:02 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:30:16.080 00:25:02 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:16.080 00:25:02 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:16.080 5000+0 records in 00:30:16.080 5000+0 records out 00:30:16.080 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0274926 s, 372 MB/s 00:30:16.080 00:25:02 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:16.339 AIO0 00:30:16.339 00:25:03 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:16.598 00:25:03 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:30:16.598 00:25:03 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:30:16.598 00:25:03 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:30:16.598 00:25:03 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:16.598 00:25:03 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:16.598 00:25:03 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:16.598 00:25:03 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:30:16.598 "name": "app_thread", 00:30:16.598 "id": 1, 00:30:16.598 "active_pollers": [], 00:30:16.598 "timed_pollers": [ 00:30:16.598 { 00:30:16.598 "name": "rpc_subsystem_poll_servers", 00:30:16.598 "id": 1, 00:30:16.598 "state": "waiting", 00:30:16.598 "run_count": 0, 00:30:16.598 "busy_count": 0, 
00:30:16.598 "period_ticks": 9200000 00:30:16.598 } 00:30:16.598 ], 00:30:16.598 "paused_pollers": [] 00:30:16.598 }' 00:30:16.598 00:25:03 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:30:16.598 00:25:03 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:30:16.598 00:25:03 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:30:16.598 00:25:03 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:30:16.857 00:25:03 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:30:16.857 00:25:03 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:30:16.857 00:25:03 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:30:16.857 00:25:03 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 3660473 00:30:16.857 00:25:03 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 3660473 ']' 00:30:16.857 00:25:03 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 3660473 00:30:16.857 00:25:03 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:30:16.857 00:25:03 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:16.857 00:25:03 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3660473 00:30:16.857 00:25:03 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:16.857 00:25:03 reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:16.857 00:25:03 reap_unregistered_poller -- common/autotest_common.sh@966 -- # 
echo 'killing process with pid 3660473' 00:30:16.857 killing process with pid 3660473 00:30:16.857 00:25:03 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 3660473 00:30:16.857 00:25:03 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 3660473 00:30:17.117 00:25:03 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:30:17.117 00:25:03 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:17.117 00:30:17.117 real 0m2.524s 00:30:17.117 user 0m1.606s 00:30:17.117 sys 0m0.685s 00:30:17.117 00:25:03 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:17.117 00:25:03 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:17.117 ************************************ 00:30:17.117 END TEST reap_unregistered_poller 00:30:17.117 ************************************ 00:30:17.117 00:25:03 -- common/autotest_common.sh@1142 -- # return 0 00:30:17.117 00:25:03 -- spdk/autotest.sh@198 -- # uname -s 00:30:17.117 00:25:03 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:30:17.117 00:25:03 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:30:17.117 00:25:03 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:30:17.117 00:25:03 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:30:17.117 00:25:03 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:30:17.117 00:25:03 -- spdk/autotest.sh@260 -- # timing_exit lib 00:30:17.117 00:25:03 -- common/autotest_common.sh@728 -- # xtrace_disable 00:30:17.117 00:25:03 -- common/autotest_common.sh@10 -- # set +x 00:30:17.117 00:25:03 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:30:17.117 00:25:03 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:30:17.117 00:25:03 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:30:17.117 00:25:03 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:30:17.117 00:25:03 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:30:17.117 
00:25:03 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:30:17.117 00:25:03 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:30:17.117 00:25:03 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:30:17.117 00:25:03 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:30:17.117 00:25:03 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:30:17.117 00:25:03 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:30:17.117 00:25:03 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:30:17.117 00:25:03 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:30:17.117 00:25:03 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:17.117 00:25:03 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:17.117 00:25:03 -- common/autotest_common.sh@10 -- # set +x 00:30:17.117 ************************************ 00:30:17.117 START TEST compress_compdev 00:30:17.117 ************************************ 00:30:17.117 00:25:04 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:30:17.376 * Looking for test storage... 
00:30:17.376 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:30:17.376 00:25:04 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:17.376 00:25:04 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:17.376 00:25:04 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:17.376 00:25:04 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:17.376 00:25:04 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:17.376 00:25:04 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:17.376 00:25:04 compress_compdev -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:17.376 00:25:04 compress_compdev -- 
paths/export.sh@5 -- # export PATH 00:30:17.376 00:25:04 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:30:17.376 00:25:04 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:30:17.376 00:25:04 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:17.376 00:25:04 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:30:17.376 00:25:04 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:30:17.376 00:25:04 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:30:17.376 00:25:04 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:17.376 00:25:04 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=3660910 00:30:17.376 00:25:04 compress_compdev -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:17.376 00:25:04 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 3660910 00:30:17.376 00:25:04 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 3660910 ']' 00:30:17.376 00:25:04 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:30:17.376 00:25:04 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:17.376 00:25:04 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:17.376 00:25:04 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:17.376 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:17.376 00:25:04 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:17.376 00:25:04 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:17.376 [2024-07-16 00:25:04.226502] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:30:17.376 [2024-07-16 00:25:04.226578] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3660910 ] 00:30:17.635 [2024-07-16 00:25:04.361118] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:17.635 [2024-07-16 00:25:04.478769] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:17.635 [2024-07-16 00:25:04.478773] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:18.570 [2024-07-16 00:25:05.457100] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:18.829 00:25:05 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:18.829 00:25:05 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:18.829 00:25:05 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:30:18.829 00:25:05 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:18.829 00:25:05 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:19.396 [2024-07-16 00:25:06.136426] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x21ef3c0 PMD being used: compress_qat 00:30:19.396 00:25:06 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:19.396 00:25:06 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:19.396 00:25:06 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:19.396 00:25:06 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:19.396 00:25:06 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:19.396 00:25:06 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:30:19.396 00:25:06 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:19.654 00:25:06 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:19.912 [ 00:30:19.912 { 00:30:19.912 "name": "Nvme0n1", 00:30:19.912 "aliases": [ 00:30:19.912 "01000000-0000-0000-5cd2-e43197705251" 00:30:19.912 ], 00:30:19.912 "product_name": "NVMe disk", 00:30:19.912 "block_size": 512, 00:30:19.912 "num_blocks": 15002931888, 00:30:19.912 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:19.912 "assigned_rate_limits": { 00:30:19.912 "rw_ios_per_sec": 0, 00:30:19.912 "rw_mbytes_per_sec": 0, 00:30:19.912 "r_mbytes_per_sec": 0, 00:30:19.912 "w_mbytes_per_sec": 0 00:30:19.912 }, 00:30:19.912 "claimed": false, 00:30:19.912 "zoned": false, 00:30:19.912 "supported_io_types": { 00:30:19.912 "read": true, 00:30:19.912 "write": true, 00:30:19.912 "unmap": true, 00:30:19.912 "flush": true, 00:30:19.912 "reset": true, 00:30:19.913 "nvme_admin": true, 00:30:19.913 "nvme_io": true, 00:30:19.913 "nvme_io_md": false, 00:30:19.913 "write_zeroes": true, 00:30:19.913 "zcopy": false, 00:30:19.913 "get_zone_info": false, 00:30:19.913 "zone_management": false, 00:30:19.913 "zone_append": false, 00:30:19.913 "compare": false, 00:30:19.913 "compare_and_write": false, 00:30:19.913 "abort": true, 00:30:19.913 "seek_hole": false, 00:30:19.913 "seek_data": false, 00:30:19.913 "copy": false, 00:30:19.913 "nvme_iov_md": false 00:30:19.913 }, 00:30:19.913 "driver_specific": { 00:30:19.913 "nvme": [ 00:30:19.913 { 00:30:19.913 "pci_address": "0000:5e:00.0", 00:30:19.913 "trid": { 00:30:19.913 "trtype": "PCIe", 00:30:19.913 "traddr": "0000:5e:00.0" 00:30:19.913 }, 00:30:19.913 "ctrlr_data": { 00:30:19.913 "cntlid": 0, 00:30:19.913 "vendor_id": "0x8086", 00:30:19.913 "model_number": "INTEL SSDPF2KX076TZO", 00:30:19.913 
"serial_number": "PHAC0301002G7P6CGN", 00:30:19.913 "firmware_revision": "JCV10200", 00:30:19.913 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:19.913 "oacs": { 00:30:19.913 "security": 1, 00:30:19.913 "format": 1, 00:30:19.913 "firmware": 1, 00:30:19.913 "ns_manage": 1 00:30:19.913 }, 00:30:19.913 "multi_ctrlr": false, 00:30:19.913 "ana_reporting": false 00:30:19.913 }, 00:30:19.913 "vs": { 00:30:19.913 "nvme_version": "1.3" 00:30:19.913 }, 00:30:19.913 "ns_data": { 00:30:19.913 "id": 1, 00:30:19.913 "can_share": false 00:30:19.913 }, 00:30:19.913 "security": { 00:30:19.913 "opal": true 00:30:19.913 } 00:30:19.913 } 00:30:19.913 ], 00:30:19.913 "mp_policy": "active_passive" 00:30:19.913 } 00:30:19.913 } 00:30:19.913 ] 00:30:19.913 00:25:06 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:19.913 00:25:06 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:20.170 [2024-07-16 00:25:06.886779] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x20540d0 PMD being used: compress_qat 00:30:22.698 a9851952-ef0a-477b-bf51-b18891a47efb 00:30:22.698 00:25:09 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:22.698 1b5f0926-3d61-43af-9cc0-230e4d5025f7 00:30:22.698 00:25:09 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:22.698 00:25:09 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:22.698 00:25:09 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:22.698 00:25:09 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:22.698 00:25:09 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:22.698 00:25:09 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:30:22.698 00:25:09 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:22.698 00:25:09 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:22.955 [ 00:30:22.955 { 00:30:22.955 "name": "1b5f0926-3d61-43af-9cc0-230e4d5025f7", 00:30:22.955 "aliases": [ 00:30:22.955 "lvs0/lv0" 00:30:22.955 ], 00:30:22.955 "product_name": "Logical Volume", 00:30:22.955 "block_size": 512, 00:30:22.955 "num_blocks": 204800, 00:30:22.955 "uuid": "1b5f0926-3d61-43af-9cc0-230e4d5025f7", 00:30:22.955 "assigned_rate_limits": { 00:30:22.955 "rw_ios_per_sec": 0, 00:30:22.955 "rw_mbytes_per_sec": 0, 00:30:22.955 "r_mbytes_per_sec": 0, 00:30:22.955 "w_mbytes_per_sec": 0 00:30:22.955 }, 00:30:22.955 "claimed": false, 00:30:22.955 "zoned": false, 00:30:22.955 "supported_io_types": { 00:30:22.955 "read": true, 00:30:22.955 "write": true, 00:30:22.955 "unmap": true, 00:30:22.955 "flush": false, 00:30:22.955 "reset": true, 00:30:22.955 "nvme_admin": false, 00:30:22.955 "nvme_io": false, 00:30:22.955 "nvme_io_md": false, 00:30:22.955 "write_zeroes": true, 00:30:22.955 "zcopy": false, 00:30:22.955 "get_zone_info": false, 00:30:22.955 "zone_management": false, 00:30:22.955 "zone_append": false, 00:30:22.955 "compare": false, 00:30:22.955 "compare_and_write": false, 00:30:22.955 "abort": false, 00:30:22.955 "seek_hole": true, 00:30:22.955 "seek_data": true, 00:30:22.955 "copy": false, 00:30:22.955 "nvme_iov_md": false 00:30:22.955 }, 00:30:22.955 "driver_specific": { 00:30:22.955 "lvol": { 00:30:22.955 "lvol_store_uuid": "a9851952-ef0a-477b-bf51-b18891a47efb", 00:30:22.955 "base_bdev": "Nvme0n1", 00:30:22.955 "thin_provision": true, 00:30:22.955 "num_allocated_clusters": 0, 00:30:22.955 "snapshot": false, 00:30:22.955 "clone": false, 00:30:22.955 "esnap_clone": false 00:30:22.955 } 00:30:22.955 } 
00:30:22.955 } 00:30:22.955 ] 00:30:22.955 00:25:09 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:22.955 00:25:09 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:22.955 00:25:09 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:23.213 [2024-07-16 00:25:10.054614] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:23.213 COMP_lvs0/lv0 00:30:23.213 00:25:10 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:23.213 00:25:10 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:23.213 00:25:10 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:23.213 00:25:10 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:23.213 00:25:10 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:23.213 00:25:10 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:23.213 00:25:10 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:23.472 00:25:10 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:23.734 [ 00:30:23.734 { 00:30:23.734 "name": "COMP_lvs0/lv0", 00:30:23.734 "aliases": [ 00:30:23.734 "1199b09a-63d1-50b7-8592-011bb3ac9a30" 00:30:23.734 ], 00:30:23.734 "product_name": "compress", 00:30:23.734 "block_size": 512, 00:30:23.734 "num_blocks": 200704, 00:30:23.734 "uuid": "1199b09a-63d1-50b7-8592-011bb3ac9a30", 00:30:23.734 "assigned_rate_limits": { 00:30:23.734 "rw_ios_per_sec": 0, 00:30:23.734 "rw_mbytes_per_sec": 0, 00:30:23.734 "r_mbytes_per_sec": 0, 00:30:23.734 "w_mbytes_per_sec": 0 00:30:23.734 
}, 00:30:23.734 "claimed": false, 00:30:23.734 "zoned": false, 00:30:23.734 "supported_io_types": { 00:30:23.734 "read": true, 00:30:23.734 "write": true, 00:30:23.734 "unmap": false, 00:30:23.734 "flush": false, 00:30:23.734 "reset": false, 00:30:23.734 "nvme_admin": false, 00:30:23.734 "nvme_io": false, 00:30:23.734 "nvme_io_md": false, 00:30:23.734 "write_zeroes": true, 00:30:23.734 "zcopy": false, 00:30:23.734 "get_zone_info": false, 00:30:23.734 "zone_management": false, 00:30:23.734 "zone_append": false, 00:30:23.734 "compare": false, 00:30:23.734 "compare_and_write": false, 00:30:23.734 "abort": false, 00:30:23.734 "seek_hole": false, 00:30:23.734 "seek_data": false, 00:30:23.734 "copy": false, 00:30:23.734 "nvme_iov_md": false 00:30:23.734 }, 00:30:23.734 "driver_specific": { 00:30:23.734 "compress": { 00:30:23.734 "name": "COMP_lvs0/lv0", 00:30:23.734 "base_bdev_name": "1b5f0926-3d61-43af-9cc0-230e4d5025f7" 00:30:23.734 } 00:30:23.734 } 00:30:23.734 } 00:30:23.734 ] 00:30:23.734 00:25:10 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:23.734 00:25:10 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:23.734 [2024-07-16 00:25:10.680942] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fabf41b15c0 PMD being used: compress_qat 00:30:23.734 [2024-07-16 00:25:10.684212] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x21ec6e0 PMD being used: compress_qat 00:30:23.992 Running I/O for 3 seconds... 
00:30:27.272 00:30:27.272 Latency(us) 00:30:27.272 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:27.272 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:27.272 Verification LBA range: start 0x0 length 0x3100 00:30:27.272 COMP_lvs0/lv0 : 3.01 1676.74 6.55 0.00 0.00 18992.02 2080.06 17438.27 00:30:27.272 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:27.272 Verification LBA range: start 0x3100 length 0x3100 00:30:27.272 COMP_lvs0/lv0 : 3.01 1777.99 6.95 0.00 0.00 17885.04 1367.71 14531.90 00:30:27.272 =================================================================================================================== 00:30:27.272 Total : 3454.73 13.50 0.00 0.00 18422.29 1367.71 17438.27 00:30:27.272 0 00:30:27.272 00:25:13 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:30:27.272 00:25:13 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:27.272 00:25:13 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:27.272 00:25:14 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:27.272 00:25:14 compress_compdev -- compress/compress.sh@78 -- # killprocess 3660910 00:30:27.272 00:25:14 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 3660910 ']' 00:30:27.272 00:25:14 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 3660910 00:30:27.272 00:25:14 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:30:27.272 00:25:14 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:27.272 00:25:14 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3660910 00:30:27.272 00:25:14 compress_compdev -- common/autotest_common.sh@954 -- # 
process_name=reactor_1 00:30:27.272 00:25:14 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:27.272 00:25:14 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3660910' 00:30:27.272 killing process with pid 3660910 00:30:27.530 00:25:14 compress_compdev -- common/autotest_common.sh@967 -- # kill 3660910 00:30:27.530 Received shutdown signal, test time was about 3.000000 seconds 00:30:27.530 00:30:27.530 Latency(us) 00:30:27.530 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:27.530 =================================================================================================================== 00:30:27.530 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:27.530 00:25:14 compress_compdev -- common/autotest_common.sh@972 -- # wait 3660910 00:30:30.807 00:25:17 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:30:30.807 00:25:17 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:30.807 00:25:17 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=3662528 00:30:30.807 00:25:17 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:30.807 00:25:17 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:30:30.807 00:25:17 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 3662528 00:30:30.807 00:25:17 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 3662528 ']' 00:30:30.807 00:25:17 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:30.807 00:25:17 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:30.807 00:25:17 compress_compdev -- 
common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:30.807 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:30.807 00:25:17 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:30.807 00:25:17 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:30.807 [2024-07-16 00:25:17.351764] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:30:30.807 [2024-07-16 00:25:17.351833] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3662528 ] 00:30:30.807 [2024-07-16 00:25:17.487177] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:30.808 [2024-07-16 00:25:17.603153] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:30.808 [2024-07-16 00:25:17.603158] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:31.738 [2024-07-16 00:25:18.578933] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:31.738 00:25:18 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:31.738 00:25:18 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:31.738 00:25:18 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:30:31.738 00:25:18 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:31.738 00:25:18 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:32.670 [2024-07-16 00:25:19.272621] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x27f33c0 PMD being used: compress_qat 00:30:32.670 00:25:19 
compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:32.670 00:25:19 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:32.670 00:25:19 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:32.670 00:25:19 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:32.670 00:25:19 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:32.670 00:25:19 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:32.670 00:25:19 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:32.670 00:25:19 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:32.927 [ 00:30:32.927 { 00:30:32.927 "name": "Nvme0n1", 00:30:32.927 "aliases": [ 00:30:32.927 "01000000-0000-0000-5cd2-e43197705251" 00:30:32.927 ], 00:30:32.927 "product_name": "NVMe disk", 00:30:32.927 "block_size": 512, 00:30:32.927 "num_blocks": 15002931888, 00:30:32.927 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:32.927 "assigned_rate_limits": { 00:30:32.927 "rw_ios_per_sec": 0, 00:30:32.927 "rw_mbytes_per_sec": 0, 00:30:32.927 "r_mbytes_per_sec": 0, 00:30:32.927 "w_mbytes_per_sec": 0 00:30:32.927 }, 00:30:32.927 "claimed": false, 00:30:32.927 "zoned": false, 00:30:32.927 "supported_io_types": { 00:30:32.927 "read": true, 00:30:32.927 "write": true, 00:30:32.927 "unmap": true, 00:30:32.927 "flush": true, 00:30:32.927 "reset": true, 00:30:32.927 "nvme_admin": true, 00:30:32.927 "nvme_io": true, 00:30:32.927 "nvme_io_md": false, 00:30:32.927 "write_zeroes": true, 00:30:32.927 "zcopy": false, 00:30:32.927 "get_zone_info": false, 00:30:32.927 "zone_management": false, 00:30:32.927 "zone_append": false, 00:30:32.927 "compare": false, 00:30:32.927 "compare_and_write": false, 00:30:32.927 
"abort": true, 00:30:32.927 "seek_hole": false, 00:30:32.927 "seek_data": false, 00:30:32.927 "copy": false, 00:30:32.927 "nvme_iov_md": false 00:30:32.927 }, 00:30:32.927 "driver_specific": { 00:30:32.927 "nvme": [ 00:30:32.927 { 00:30:32.927 "pci_address": "0000:5e:00.0", 00:30:32.927 "trid": { 00:30:32.927 "trtype": "PCIe", 00:30:32.927 "traddr": "0000:5e:00.0" 00:30:32.927 }, 00:30:32.927 "ctrlr_data": { 00:30:32.927 "cntlid": 0, 00:30:32.927 "vendor_id": "0x8086", 00:30:32.927 "model_number": "INTEL SSDPF2KX076TZO", 00:30:32.927 "serial_number": "PHAC0301002G7P6CGN", 00:30:32.927 "firmware_revision": "JCV10200", 00:30:32.927 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:32.927 "oacs": { 00:30:32.927 "security": 1, 00:30:32.927 "format": 1, 00:30:32.928 "firmware": 1, 00:30:32.928 "ns_manage": 1 00:30:32.928 }, 00:30:32.928 "multi_ctrlr": false, 00:30:32.928 "ana_reporting": false 00:30:32.928 }, 00:30:32.928 "vs": { 00:30:32.928 "nvme_version": "1.3" 00:30:32.928 }, 00:30:32.928 "ns_data": { 00:30:32.928 "id": 1, 00:30:32.928 "can_share": false 00:30:32.928 }, 00:30:32.928 "security": { 00:30:32.928 "opal": true 00:30:32.928 } 00:30:32.928 } 00:30:32.928 ], 00:30:32.928 "mp_policy": "active_passive" 00:30:32.928 } 00:30:32.928 } 00:30:32.928 ] 00:30:32.928 00:25:19 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:32.928 00:25:19 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:33.185 [2024-07-16 00:25:20.055383] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2658660 PMD being used: compress_qat 00:30:35.709 35e95037-a5ae-49df-86f7-70f03593df16 00:30:35.709 00:25:22 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:35.709 17e2e246-5f3d-4556-b6c4-12a2d4d48ec3 00:30:35.709 00:25:22 
compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:35.709 00:25:22 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:35.709 00:25:22 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:35.709 00:25:22 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:35.709 00:25:22 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:35.709 00:25:22 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:35.709 00:25:22 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:35.966 00:25:22 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:36.223 [ 00:30:36.223 { 00:30:36.223 "name": "17e2e246-5f3d-4556-b6c4-12a2d4d48ec3", 00:30:36.223 "aliases": [ 00:30:36.223 "lvs0/lv0" 00:30:36.223 ], 00:30:36.223 "product_name": "Logical Volume", 00:30:36.224 "block_size": 512, 00:30:36.224 "num_blocks": 204800, 00:30:36.224 "uuid": "17e2e246-5f3d-4556-b6c4-12a2d4d48ec3", 00:30:36.224 "assigned_rate_limits": { 00:30:36.224 "rw_ios_per_sec": 0, 00:30:36.224 "rw_mbytes_per_sec": 0, 00:30:36.224 "r_mbytes_per_sec": 0, 00:30:36.224 "w_mbytes_per_sec": 0 00:30:36.224 }, 00:30:36.224 "claimed": false, 00:30:36.224 "zoned": false, 00:30:36.224 "supported_io_types": { 00:30:36.224 "read": true, 00:30:36.224 "write": true, 00:30:36.224 "unmap": true, 00:30:36.224 "flush": false, 00:30:36.224 "reset": true, 00:30:36.224 "nvme_admin": false, 00:30:36.224 "nvme_io": false, 00:30:36.224 "nvme_io_md": false, 00:30:36.224 "write_zeroes": true, 00:30:36.224 "zcopy": false, 00:30:36.224 "get_zone_info": false, 00:30:36.224 "zone_management": false, 00:30:36.224 "zone_append": false, 00:30:36.224 "compare": false, 00:30:36.224 "compare_and_write": false, 00:30:36.224 
"abort": false, 00:30:36.224 "seek_hole": true, 00:30:36.224 "seek_data": true, 00:30:36.224 "copy": false, 00:30:36.224 "nvme_iov_md": false 00:30:36.224 }, 00:30:36.224 "driver_specific": { 00:30:36.224 "lvol": { 00:30:36.224 "lvol_store_uuid": "35e95037-a5ae-49df-86f7-70f03593df16", 00:30:36.224 "base_bdev": "Nvme0n1", 00:30:36.224 "thin_provision": true, 00:30:36.224 "num_allocated_clusters": 0, 00:30:36.224 "snapshot": false, 00:30:36.224 "clone": false, 00:30:36.224 "esnap_clone": false 00:30:36.224 } 00:30:36.224 } 00:30:36.224 } 00:30:36.224 ] 00:30:36.224 00:25:23 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:36.224 00:25:23 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:30:36.224 00:25:23 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:30:36.481 [2024-07-16 00:25:23.252216] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:36.481 COMP_lvs0/lv0 00:30:36.481 00:25:23 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:36.481 00:25:23 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:36.481 00:25:23 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:36.481 00:25:23 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:36.481 00:25:23 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:36.481 00:25:23 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:36.481 00:25:23 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:36.738 00:25:23 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b 
COMP_lvs0/lv0 -t 2000 00:30:36.997 [ 00:30:36.997 { 00:30:36.997 "name": "COMP_lvs0/lv0", 00:30:36.997 "aliases": [ 00:30:36.997 "716d2be3-0897-5fea-a842-6d0b5956878b" 00:30:36.997 ], 00:30:36.997 "product_name": "compress", 00:30:36.997 "block_size": 512, 00:30:36.997 "num_blocks": 200704, 00:30:36.997 "uuid": "716d2be3-0897-5fea-a842-6d0b5956878b", 00:30:36.997 "assigned_rate_limits": { 00:30:36.997 "rw_ios_per_sec": 0, 00:30:36.997 "rw_mbytes_per_sec": 0, 00:30:36.997 "r_mbytes_per_sec": 0, 00:30:36.997 "w_mbytes_per_sec": 0 00:30:36.997 }, 00:30:36.997 "claimed": false, 00:30:36.997 "zoned": false, 00:30:36.997 "supported_io_types": { 00:30:36.997 "read": true, 00:30:36.997 "write": true, 00:30:36.997 "unmap": false, 00:30:36.997 "flush": false, 00:30:36.997 "reset": false, 00:30:36.997 "nvme_admin": false, 00:30:36.997 "nvme_io": false, 00:30:36.997 "nvme_io_md": false, 00:30:36.997 "write_zeroes": true, 00:30:36.997 "zcopy": false, 00:30:36.997 "get_zone_info": false, 00:30:36.997 "zone_management": false, 00:30:36.997 "zone_append": false, 00:30:36.997 "compare": false, 00:30:36.997 "compare_and_write": false, 00:30:36.997 "abort": false, 00:30:36.997 "seek_hole": false, 00:30:36.997 "seek_data": false, 00:30:36.997 "copy": false, 00:30:36.997 "nvme_iov_md": false 00:30:36.997 }, 00:30:36.997 "driver_specific": { 00:30:36.997 "compress": { 00:30:36.997 "name": "COMP_lvs0/lv0", 00:30:36.997 "base_bdev_name": "17e2e246-5f3d-4556-b6c4-12a2d4d48ec3" 00:30:36.997 } 00:30:36.997 } 00:30:36.997 } 00:30:36.997 ] 00:30:36.997 00:25:23 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:36.997 00:25:23 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:36.997 [2024-07-16 00:25:23.926719] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f1b341b15c0 PMD being used: compress_qat 00:30:36.997 [2024-07-16 00:25:23.929902] 
accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x27f0770 PMD being used: compress_qat 00:30:36.997 Running I/O for 3 seconds... 00:30:40.292 00:30:40.292 Latency(us) 00:30:40.292 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:40.292 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:40.292 Verification LBA range: start 0x0 length 0x3100 00:30:40.292 COMP_lvs0/lv0 : 3.01 1685.44 6.58 0.00 0.00 18885.06 2350.75 17438.27 00:30:40.292 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:40.292 Verification LBA range: start 0x3100 length 0x3100 00:30:40.292 COMP_lvs0/lv0 : 3.01 1789.51 6.99 0.00 0.00 17767.22 1317.84 14702.86 00:30:40.292 =================================================================================================================== 00:30:40.292 Total : 3474.94 13.57 0.00 0.00 18309.64 1317.84 17438.27 00:30:40.292 0 00:30:40.292 00:25:26 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:30:40.292 00:25:26 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:40.550 00:25:27 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:40.808 00:25:27 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:40.808 00:25:27 compress_compdev -- compress/compress.sh@78 -- # killprocess 3662528 00:30:40.808 00:25:27 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 3662528 ']' 00:30:40.808 00:25:27 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 3662528 00:30:40.808 00:25:27 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:30:40.808 00:25:27 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:40.808 00:25:27 compress_compdev -- 
common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3662528 00:30:40.808 00:25:27 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:40.808 00:25:27 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:40.808 00:25:27 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3662528' 00:30:40.808 killing process with pid 3662528 00:30:40.808 00:25:27 compress_compdev -- common/autotest_common.sh@967 -- # kill 3662528 00:30:40.808 Received shutdown signal, test time was about 3.000000 seconds 00:30:40.808 00:30:40.808 Latency(us) 00:30:40.808 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:40.808 =================================================================================================================== 00:30:40.808 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:40.808 00:25:27 compress_compdev -- common/autotest_common.sh@972 -- # wait 3662528 00:30:44.087 00:25:30 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:30:44.087 00:25:30 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:44.087 00:25:30 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=3664285 00:30:44.087 00:25:30 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:44.087 00:25:30 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 3664285 00:30:44.087 00:25:30 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 3664285 ']' 00:30:44.087 00:25:30 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:30:44.087 00:25:30 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 
00:30:44.087 00:25:30 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:44.087 00:25:30 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:44.087 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:44.087 00:25:30 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:44.087 00:25:30 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:44.087 [2024-07-16 00:25:30.706113] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:30:44.087 [2024-07-16 00:25:30.706192] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3664285 ] 00:30:44.087 [2024-07-16 00:25:30.840590] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:44.087 [2024-07-16 00:25:30.973374] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:44.087 [2024-07-16 00:25:30.973382] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:45.026 [2024-07-16 00:25:31.938313] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:45.373 00:25:32 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:45.373 00:25:32 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:45.373 00:25:32 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:30:45.373 00:25:32 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:45.373 00:25:32 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:45.940 [2024-07-16 
00:25:32.624561] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x24353c0 PMD being used: compress_qat 00:30:45.940 00:25:32 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:45.940 00:25:32 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:45.940 00:25:32 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:45.940 00:25:32 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:45.940 00:25:32 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:45.940 00:25:32 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:45.940 00:25:32 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:46.197 00:25:32 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:46.454 [ 00:30:46.454 { 00:30:46.454 "name": "Nvme0n1", 00:30:46.454 "aliases": [ 00:30:46.454 "01000000-0000-0000-5cd2-e43197705251" 00:30:46.454 ], 00:30:46.454 "product_name": "NVMe disk", 00:30:46.454 "block_size": 512, 00:30:46.454 "num_blocks": 15002931888, 00:30:46.454 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:46.454 "assigned_rate_limits": { 00:30:46.454 "rw_ios_per_sec": 0, 00:30:46.454 "rw_mbytes_per_sec": 0, 00:30:46.454 "r_mbytes_per_sec": 0, 00:30:46.454 "w_mbytes_per_sec": 0 00:30:46.454 }, 00:30:46.454 "claimed": false, 00:30:46.454 "zoned": false, 00:30:46.454 "supported_io_types": { 00:30:46.454 "read": true, 00:30:46.454 "write": true, 00:30:46.454 "unmap": true, 00:30:46.454 "flush": true, 00:30:46.454 "reset": true, 00:30:46.454 "nvme_admin": true, 00:30:46.454 "nvme_io": true, 00:30:46.454 "nvme_io_md": false, 00:30:46.454 "write_zeroes": true, 00:30:46.454 "zcopy": false, 00:30:46.454 "get_zone_info": false, 00:30:46.454 
"zone_management": false, 00:30:46.454 "zone_append": false, 00:30:46.454 "compare": false, 00:30:46.454 "compare_and_write": false, 00:30:46.454 "abort": true, 00:30:46.455 "seek_hole": false, 00:30:46.455 "seek_data": false, 00:30:46.455 "copy": false, 00:30:46.455 "nvme_iov_md": false 00:30:46.455 }, 00:30:46.455 "driver_specific": { 00:30:46.455 "nvme": [ 00:30:46.455 { 00:30:46.455 "pci_address": "0000:5e:00.0", 00:30:46.455 "trid": { 00:30:46.455 "trtype": "PCIe", 00:30:46.455 "traddr": "0000:5e:00.0" 00:30:46.455 }, 00:30:46.455 "ctrlr_data": { 00:30:46.455 "cntlid": 0, 00:30:46.455 "vendor_id": "0x8086", 00:30:46.455 "model_number": "INTEL SSDPF2KX076TZO", 00:30:46.455 "serial_number": "PHAC0301002G7P6CGN", 00:30:46.455 "firmware_revision": "JCV10200", 00:30:46.455 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:46.455 "oacs": { 00:30:46.455 "security": 1, 00:30:46.455 "format": 1, 00:30:46.455 "firmware": 1, 00:30:46.455 "ns_manage": 1 00:30:46.455 }, 00:30:46.455 "multi_ctrlr": false, 00:30:46.455 "ana_reporting": false 00:30:46.455 }, 00:30:46.455 "vs": { 00:30:46.455 "nvme_version": "1.3" 00:30:46.455 }, 00:30:46.455 "ns_data": { 00:30:46.455 "id": 1, 00:30:46.455 "can_share": false 00:30:46.455 }, 00:30:46.455 "security": { 00:30:46.455 "opal": true 00:30:46.455 } 00:30:46.455 } 00:30:46.455 ], 00:30:46.455 "mp_policy": "active_passive" 00:30:46.455 } 00:30:46.455 } 00:30:46.455 ] 00:30:46.455 00:25:33 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:46.455 00:25:33 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:46.711 [2024-07-16 00:25:33.415553] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x229a0d0 PMD being used: compress_qat 00:30:49.238 c50fffdf-10e3-4ed6-a4c4-50e762bf36c9 00:30:49.238 00:25:35 compress_compdev -- compress/compress.sh@38 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:49.238 29dca341-a913-4e81-9e4b-0f9933b89cdc 00:30:49.238 00:25:35 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:49.238 00:25:35 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:49.238 00:25:35 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:49.238 00:25:35 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:49.238 00:25:35 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:49.238 00:25:35 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:49.238 00:25:35 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:49.238 00:25:36 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:49.496 [ 00:30:49.496 { 00:30:49.496 "name": "29dca341-a913-4e81-9e4b-0f9933b89cdc", 00:30:49.496 "aliases": [ 00:30:49.496 "lvs0/lv0" 00:30:49.496 ], 00:30:49.496 "product_name": "Logical Volume", 00:30:49.496 "block_size": 512, 00:30:49.496 "num_blocks": 204800, 00:30:49.496 "uuid": "29dca341-a913-4e81-9e4b-0f9933b89cdc", 00:30:49.496 "assigned_rate_limits": { 00:30:49.496 "rw_ios_per_sec": 0, 00:30:49.496 "rw_mbytes_per_sec": 0, 00:30:49.496 "r_mbytes_per_sec": 0, 00:30:49.496 "w_mbytes_per_sec": 0 00:30:49.496 }, 00:30:49.496 "claimed": false, 00:30:49.496 "zoned": false, 00:30:49.496 "supported_io_types": { 00:30:49.496 "read": true, 00:30:49.496 "write": true, 00:30:49.496 "unmap": true, 00:30:49.496 "flush": false, 00:30:49.496 "reset": true, 00:30:49.496 "nvme_admin": false, 00:30:49.496 "nvme_io": false, 00:30:49.496 "nvme_io_md": false, 00:30:49.496 "write_zeroes": true, 00:30:49.496 "zcopy": false, 00:30:49.496 
"get_zone_info": false, 00:30:49.496 "zone_management": false, 00:30:49.496 "zone_append": false, 00:30:49.496 "compare": false, 00:30:49.496 "compare_and_write": false, 00:30:49.496 "abort": false, 00:30:49.496 "seek_hole": true, 00:30:49.496 "seek_data": true, 00:30:49.496 "copy": false, 00:30:49.496 "nvme_iov_md": false 00:30:49.496 }, 00:30:49.496 "driver_specific": { 00:30:49.496 "lvol": { 00:30:49.496 "lvol_store_uuid": "c50fffdf-10e3-4ed6-a4c4-50e762bf36c9", 00:30:49.496 "base_bdev": "Nvme0n1", 00:30:49.496 "thin_provision": true, 00:30:49.496 "num_allocated_clusters": 0, 00:30:49.496 "snapshot": false, 00:30:49.496 "clone": false, 00:30:49.496 "esnap_clone": false 00:30:49.496 } 00:30:49.496 } 00:30:49.496 } 00:30:49.496 ] 00:30:49.496 00:25:36 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:49.496 00:25:36 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:30:49.496 00:25:36 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:30:49.754 [2024-07-16 00:25:36.669193] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:49.754 COMP_lvs0/lv0 00:30:49.754 00:25:36 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:49.754 00:25:36 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:49.754 00:25:36 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:49.754 00:25:36 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:49.754 00:25:36 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:49.754 00:25:36 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:49.754 00:25:36 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_wait_for_examine 00:30:50.320 00:25:36 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:50.320 [ 00:30:50.320 { 00:30:50.320 "name": "COMP_lvs0/lv0", 00:30:50.320 "aliases": [ 00:30:50.320 "2f4eaefc-99e5-52c5-b34b-786a14c7ff17" 00:30:50.320 ], 00:30:50.320 "product_name": "compress", 00:30:50.320 "block_size": 4096, 00:30:50.320 "num_blocks": 25088, 00:30:50.320 "uuid": "2f4eaefc-99e5-52c5-b34b-786a14c7ff17", 00:30:50.320 "assigned_rate_limits": { 00:30:50.320 "rw_ios_per_sec": 0, 00:30:50.320 "rw_mbytes_per_sec": 0, 00:30:50.320 "r_mbytes_per_sec": 0, 00:30:50.320 "w_mbytes_per_sec": 0 00:30:50.320 }, 00:30:50.320 "claimed": false, 00:30:50.320 "zoned": false, 00:30:50.320 "supported_io_types": { 00:30:50.320 "read": true, 00:30:50.320 "write": true, 00:30:50.320 "unmap": false, 00:30:50.320 "flush": false, 00:30:50.320 "reset": false, 00:30:50.320 "nvme_admin": false, 00:30:50.320 "nvme_io": false, 00:30:50.320 "nvme_io_md": false, 00:30:50.320 "write_zeroes": true, 00:30:50.320 "zcopy": false, 00:30:50.320 "get_zone_info": false, 00:30:50.320 "zone_management": false, 00:30:50.320 "zone_append": false, 00:30:50.320 "compare": false, 00:30:50.320 "compare_and_write": false, 00:30:50.320 "abort": false, 00:30:50.320 "seek_hole": false, 00:30:50.320 "seek_data": false, 00:30:50.320 "copy": false, 00:30:50.320 "nvme_iov_md": false 00:30:50.320 }, 00:30:50.320 "driver_specific": { 00:30:50.320 "compress": { 00:30:50.320 "name": "COMP_lvs0/lv0", 00:30:50.320 "base_bdev_name": "29dca341-a913-4e81-9e4b-0f9933b89cdc" 00:30:50.320 } 00:30:50.320 } 00:30:50.320 } 00:30:50.320 ] 00:30:50.320 00:25:37 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:50.320 00:25:37 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:50.578 [2024-07-16 
00:25:37.279652] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f49c41b15c0 PMD being used: compress_qat 00:30:50.578 [2024-07-16 00:25:37.282855] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2432700 PMD being used: compress_qat 00:30:50.578 Running I/O for 3 seconds... 00:30:53.863 00:30:53.863 Latency(us) 00:30:53.863 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:53.863 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:53.863 Verification LBA range: start 0x0 length 0x3100 00:30:53.863 COMP_lvs0/lv0 : 3.01 1676.30 6.55 0.00 0.00 18985.49 2350.75 18919.96 00:30:53.863 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:53.863 Verification LBA range: start 0x3100 length 0x3100 00:30:53.863 COMP_lvs0/lv0 : 3.01 1785.89 6.98 0.00 0.00 17805.17 1317.84 15956.59 00:30:53.863 =================================================================================================================== 00:30:53.863 Total : 3462.19 13.52 0.00 0.00 18376.88 1317.84 18919.96 00:30:53.863 0 00:30:53.863 00:25:40 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:30:53.863 00:25:40 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:53.863 00:25:40 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:54.121 00:25:40 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:54.121 00:25:40 compress_compdev -- compress/compress.sh@78 -- # killprocess 3664285 00:30:54.121 00:25:40 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 3664285 ']' 00:30:54.121 00:25:40 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 3664285 00:30:54.121 00:25:40 compress_compdev -- common/autotest_common.sh@953 -- # uname 
00:30:54.121 00:25:40 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:54.121 00:25:40 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3664285 00:30:54.121 00:25:40 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:54.121 00:25:40 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:54.121 00:25:40 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3664285' 00:30:54.121 killing process with pid 3664285 00:30:54.121 00:25:40 compress_compdev -- common/autotest_common.sh@967 -- # kill 3664285 00:30:54.121 Received shutdown signal, test time was about 3.000000 seconds 00:30:54.121 00:30:54.121 Latency(us) 00:30:54.121 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:54.121 =================================================================================================================== 00:30:54.121 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:54.121 00:25:40 compress_compdev -- common/autotest_common.sh@972 -- # wait 3664285 00:30:57.399 00:25:43 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:30:57.399 00:25:43 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:57.399 00:25:43 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=3666050 00:30:57.399 00:25:43 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:57.399 00:25:43 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:30:57.399 00:25:43 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 3666050 00:30:57.399 00:25:43 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 3666050 ']' 00:30:57.399 00:25:43 
compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:57.399 00:25:43 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:57.400 00:25:43 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:57.400 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:57.400 00:25:43 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:57.400 00:25:43 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:57.400 [2024-07-16 00:25:44.053410] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:30:57.400 [2024-07-16 00:25:44.053486] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3666050 ] 00:30:57.400 [2024-07-16 00:25:44.180887] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:57.400 [2024-07-16 00:25:44.279798] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:57.400 [2024-07-16 00:25:44.279900] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:57.400 [2024-07-16 00:25:44.279901] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:58.330 [2024-07-16 00:25:45.033784] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:58.330 00:25:45 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:58.330 00:25:45 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:58.330 00:25:45 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:30:58.330 00:25:45 compress_compdev -- compress/compress.sh@34 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:58.330 00:25:45 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:58.894 [2024-07-16 00:25:45.688194] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x26ecf20 PMD being used: compress_qat 00:30:58.894 00:25:45 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:58.894 00:25:45 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:58.894 00:25:45 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:58.894 00:25:45 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:58.894 00:25:45 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:58.894 00:25:45 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:58.894 00:25:45 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:59.151 00:25:45 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:59.408 [ 00:30:59.408 { 00:30:59.408 "name": "Nvme0n1", 00:30:59.408 "aliases": [ 00:30:59.408 "01000000-0000-0000-5cd2-e43197705251" 00:30:59.408 ], 00:30:59.408 "product_name": "NVMe disk", 00:30:59.408 "block_size": 512, 00:30:59.408 "num_blocks": 15002931888, 00:30:59.408 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:59.408 "assigned_rate_limits": { 00:30:59.408 "rw_ios_per_sec": 0, 00:30:59.408 "rw_mbytes_per_sec": 0, 00:30:59.408 "r_mbytes_per_sec": 0, 00:30:59.408 "w_mbytes_per_sec": 0 00:30:59.408 }, 00:30:59.408 "claimed": false, 00:30:59.408 "zoned": false, 00:30:59.408 "supported_io_types": { 00:30:59.408 "read": true, 00:30:59.408 "write": true, 00:30:59.408 "unmap": true, 00:30:59.408 "flush": true, 
00:30:59.408 "reset": true, 00:30:59.408 "nvme_admin": true, 00:30:59.408 "nvme_io": true, 00:30:59.408 "nvme_io_md": false, 00:30:59.408 "write_zeroes": true, 00:30:59.408 "zcopy": false, 00:30:59.408 "get_zone_info": false, 00:30:59.408 "zone_management": false, 00:30:59.408 "zone_append": false, 00:30:59.408 "compare": false, 00:30:59.408 "compare_and_write": false, 00:30:59.408 "abort": true, 00:30:59.408 "seek_hole": false, 00:30:59.408 "seek_data": false, 00:30:59.408 "copy": false, 00:30:59.408 "nvme_iov_md": false 00:30:59.408 }, 00:30:59.408 "driver_specific": { 00:30:59.408 "nvme": [ 00:30:59.408 { 00:30:59.408 "pci_address": "0000:5e:00.0", 00:30:59.408 "trid": { 00:30:59.408 "trtype": "PCIe", 00:30:59.408 "traddr": "0000:5e:00.0" 00:30:59.408 }, 00:30:59.408 "ctrlr_data": { 00:30:59.408 "cntlid": 0, 00:30:59.408 "vendor_id": "0x8086", 00:30:59.408 "model_number": "INTEL SSDPF2KX076TZO", 00:30:59.408 "serial_number": "PHAC0301002G7P6CGN", 00:30:59.408 "firmware_revision": "JCV10200", 00:30:59.408 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:59.408 "oacs": { 00:30:59.408 "security": 1, 00:30:59.408 "format": 1, 00:30:59.408 "firmware": 1, 00:30:59.408 "ns_manage": 1 00:30:59.408 }, 00:30:59.408 "multi_ctrlr": false, 00:30:59.408 "ana_reporting": false 00:30:59.408 }, 00:30:59.408 "vs": { 00:30:59.408 "nvme_version": "1.3" 00:30:59.408 }, 00:30:59.408 "ns_data": { 00:30:59.408 "id": 1, 00:30:59.408 "can_share": false 00:30:59.408 }, 00:30:59.408 "security": { 00:30:59.408 "opal": true 00:30:59.408 } 00:30:59.408 } 00:30:59.408 ], 00:30:59.408 "mp_policy": "active_passive" 00:30:59.408 } 00:30:59.408 } 00:30:59.408 ] 00:30:59.408 00:25:46 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:59.408 00:25:46 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:59.665 [2024-07-16 00:25:46.461886] 
accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x253b390 PMD being used: compress_qat 00:31:02.188 cde1b2a6-0f8c-42a6-b568-c9b7e780fcd6 00:31:02.188 00:25:48 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:02.188 16507695-31f0-4200-9d89-ea3040c57a5b 00:31:02.188 00:25:48 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:02.188 00:25:48 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:02.188 00:25:48 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:02.188 00:25:48 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:02.188 00:25:48 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:02.188 00:25:48 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:02.188 00:25:48 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:02.445 00:25:49 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:02.703 [ 00:31:02.703 { 00:31:02.703 "name": "16507695-31f0-4200-9d89-ea3040c57a5b", 00:31:02.703 "aliases": [ 00:31:02.703 "lvs0/lv0" 00:31:02.703 ], 00:31:02.703 "product_name": "Logical Volume", 00:31:02.703 "block_size": 512, 00:31:02.703 "num_blocks": 204800, 00:31:02.703 "uuid": "16507695-31f0-4200-9d89-ea3040c57a5b", 00:31:02.703 "assigned_rate_limits": { 00:31:02.703 "rw_ios_per_sec": 0, 00:31:02.703 "rw_mbytes_per_sec": 0, 00:31:02.703 "r_mbytes_per_sec": 0, 00:31:02.703 "w_mbytes_per_sec": 0 00:31:02.703 }, 00:31:02.703 "claimed": false, 00:31:02.703 "zoned": false, 00:31:02.703 "supported_io_types": { 00:31:02.703 "read": true, 00:31:02.703 "write": true, 00:31:02.703 "unmap": true, 00:31:02.703 "flush": 
false, 00:31:02.703 "reset": true, 00:31:02.703 "nvme_admin": false, 00:31:02.703 "nvme_io": false, 00:31:02.703 "nvme_io_md": false, 00:31:02.703 "write_zeroes": true, 00:31:02.703 "zcopy": false, 00:31:02.703 "get_zone_info": false, 00:31:02.703 "zone_management": false, 00:31:02.703 "zone_append": false, 00:31:02.703 "compare": false, 00:31:02.703 "compare_and_write": false, 00:31:02.703 "abort": false, 00:31:02.703 "seek_hole": true, 00:31:02.703 "seek_data": true, 00:31:02.703 "copy": false, 00:31:02.703 "nvme_iov_md": false 00:31:02.703 }, 00:31:02.703 "driver_specific": { 00:31:02.703 "lvol": { 00:31:02.703 "lvol_store_uuid": "cde1b2a6-0f8c-42a6-b568-c9b7e780fcd6", 00:31:02.703 "base_bdev": "Nvme0n1", 00:31:02.703 "thin_provision": true, 00:31:02.703 "num_allocated_clusters": 0, 00:31:02.703 "snapshot": false, 00:31:02.703 "clone": false, 00:31:02.703 "esnap_clone": false 00:31:02.703 } 00:31:02.703 } 00:31:02.703 } 00:31:02.703 ] 00:31:02.703 00:25:49 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:02.703 00:25:49 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:02.703 00:25:49 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:02.960 [2024-07-16 00:25:49.690536] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:02.960 COMP_lvs0/lv0 00:31:02.960 00:25:49 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:02.960 00:25:49 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:02.960 00:25:49 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:02.960 00:25:49 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:02.960 00:25:49 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:02.960 00:25:49 compress_compdev -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:02.960 00:25:49 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:03.224 00:25:49 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:03.482 [ 00:31:03.482 { 00:31:03.482 "name": "COMP_lvs0/lv0", 00:31:03.482 "aliases": [ 00:31:03.482 "b876a3e1-8855-5eb9-ba5d-37220cf46445" 00:31:03.482 ], 00:31:03.482 "product_name": "compress", 00:31:03.482 "block_size": 512, 00:31:03.482 "num_blocks": 200704, 00:31:03.482 "uuid": "b876a3e1-8855-5eb9-ba5d-37220cf46445", 00:31:03.482 "assigned_rate_limits": { 00:31:03.482 "rw_ios_per_sec": 0, 00:31:03.482 "rw_mbytes_per_sec": 0, 00:31:03.482 "r_mbytes_per_sec": 0, 00:31:03.482 "w_mbytes_per_sec": 0 00:31:03.482 }, 00:31:03.482 "claimed": false, 00:31:03.482 "zoned": false, 00:31:03.482 "supported_io_types": { 00:31:03.482 "read": true, 00:31:03.482 "write": true, 00:31:03.482 "unmap": false, 00:31:03.482 "flush": false, 00:31:03.482 "reset": false, 00:31:03.482 "nvme_admin": false, 00:31:03.482 "nvme_io": false, 00:31:03.482 "nvme_io_md": false, 00:31:03.482 "write_zeroes": true, 00:31:03.482 "zcopy": false, 00:31:03.482 "get_zone_info": false, 00:31:03.482 "zone_management": false, 00:31:03.482 "zone_append": false, 00:31:03.482 "compare": false, 00:31:03.482 "compare_and_write": false, 00:31:03.482 "abort": false, 00:31:03.482 "seek_hole": false, 00:31:03.482 "seek_data": false, 00:31:03.482 "copy": false, 00:31:03.482 "nvme_iov_md": false 00:31:03.482 }, 00:31:03.482 "driver_specific": { 00:31:03.482 "compress": { 00:31:03.482 "name": "COMP_lvs0/lv0", 00:31:03.482 "base_bdev_name": "16507695-31f0-4200-9d89-ea3040c57a5b" 00:31:03.482 } 00:31:03.482 } 00:31:03.482 } 00:31:03.482 ] 00:31:03.482 00:25:50 compress_compdev -- common/autotest_common.sh@905 -- # return 0 
00:31:03.482 00:25:50 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:03.482 [2024-07-16 00:25:50.355478] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc1c01b1350 PMD being used: compress_qat 00:31:03.482 I/O targets: 00:31:03.482 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:31:03.482 00:31:03.483 00:31:03.483 CUnit - A unit testing framework for C - Version 2.1-3 00:31:03.483 http://cunit.sourceforge.net/ 00:31:03.483 00:31:03.483 00:31:03.483 Suite: bdevio tests on: COMP_lvs0/lv0 00:31:03.483 Test: blockdev write read block ...passed 00:31:03.483 Test: blockdev write zeroes read block ...passed 00:31:03.483 Test: blockdev write zeroes read no split ...passed 00:31:03.483 Test: blockdev write zeroes read split ...passed 00:31:03.740 Test: blockdev write zeroes read split partial ...passed 00:31:03.740 Test: blockdev reset ...[2024-07-16 00:25:50.460138] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:31:03.740 passed 00:31:03.740 Test: blockdev write read 8 blocks ...passed 00:31:03.740 Test: blockdev write read size > 128k ...passed 00:31:03.740 Test: blockdev write read invalid size ...passed 00:31:03.740 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:03.740 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:03.740 Test: blockdev write read max offset ...passed 00:31:03.740 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:03.740 Test: blockdev writev readv 8 blocks ...passed 00:31:03.740 Test: blockdev writev readv 30 x 1block ...passed 00:31:03.740 Test: blockdev writev readv block ...passed 00:31:03.740 Test: blockdev writev readv size > 128k ...passed 00:31:03.740 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:03.740 Test: blockdev comparev and writev ...passed 00:31:03.740 Test: blockdev nvme 
passthru rw ...passed 00:31:03.740 Test: blockdev nvme passthru vendor specific ...passed 00:31:03.740 Test: blockdev nvme admin passthru ...passed 00:31:03.740 Test: blockdev copy ...passed 00:31:03.740 00:31:03.740 Run Summary: Type Total Ran Passed Failed Inactive 00:31:03.740 suites 1 1 n/a 0 0 00:31:03.740 tests 23 23 23 0 0 00:31:03.740 asserts 130 130 130 0 n/a 00:31:03.740 00:31:03.740 Elapsed time = 0.238 seconds 00:31:03.740 0 00:31:03.740 00:25:50 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:31:03.740 00:25:50 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:03.997 00:25:50 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:04.277 00:25:51 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:31:04.277 00:25:51 compress_compdev -- compress/compress.sh@62 -- # killprocess 3666050 00:31:04.277 00:25:51 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 3666050 ']' 00:31:04.277 00:25:51 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 3666050 00:31:04.277 00:25:51 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:31:04.277 00:25:51 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:04.277 00:25:51 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3666050 00:31:04.277 00:25:51 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:04.277 00:25:51 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:04.277 00:25:51 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3666050' 00:31:04.277 killing process with pid 3666050 00:31:04.277 00:25:51 compress_compdev -- common/autotest_common.sh@967 -- # kill 3666050 00:31:04.277 
00:25:51 compress_compdev -- common/autotest_common.sh@972 -- # wait 3666050 00:31:07.565 00:25:54 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:31:07.565 00:25:54 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:31:07.565 00:31:07.565 real 0m50.059s 00:31:07.565 user 1m54.661s 00:31:07.565 sys 0m6.504s 00:31:07.565 00:25:54 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:07.565 00:25:54 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:07.565 ************************************ 00:31:07.565 END TEST compress_compdev 00:31:07.565 ************************************ 00:31:07.565 00:25:54 -- common/autotest_common.sh@1142 -- # return 0 00:31:07.565 00:25:54 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:31:07.565 00:25:54 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:07.565 00:25:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:07.565 00:25:54 -- common/autotest_common.sh@10 -- # set +x 00:31:07.565 ************************************ 00:31:07.565 START TEST compress_isal 00:31:07.565 ************************************ 00:31:07.565 00:25:54 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:31:07.565 * Looking for test storage... 
00:31:07.565 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:31:07.565 00:25:54 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:31:07.565 00:25:54 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:31:07.565 00:25:54 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:31:07.565 00:25:54 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:31:07.565 00:25:54 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:31:07.565 00:25:54 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:31:07.565 00:25:54 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:31:07.565 00:25:54 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:31:07.565 00:25:54 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:31:07.565 00:25:54 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:31:07.565 00:25:54 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:31:07.565 00:25:54 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:31:07.565 00:25:54 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:31:07.565 00:25:54 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:31:07.565 00:25:54 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:31:07.565 00:25:54 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:31:07.565 00:25:54 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:31:07.565 00:25:54 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:31:07.565 00:25:54 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:07.565 00:25:54 compress_isal -- 
scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:07.566 00:25:54 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:07.566 00:25:54 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:07.566 00:25:54 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:07.566 00:25:54 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:07.566 00:25:54 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:07.566 00:25:54 compress_isal -- paths/export.sh@5 -- # export PATH 00:31:07.566 00:25:54 compress_isal -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:07.566 00:25:54 compress_isal -- nvmf/common.sh@47 -- # : 0 00:31:07.566 00:25:54 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:31:07.566 00:25:54 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:31:07.566 00:25:54 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:31:07.566 00:25:54 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:31:07.566 00:25:54 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:31:07.566 00:25:54 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:31:07.566 00:25:54 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:31:07.566 00:25:54 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:31:07.566 00:25:54 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:07.566 00:25:54 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:31:07.566 00:25:54 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:31:07.566 00:25:54 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:31:07.566 00:25:54 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:07.566 00:25:54 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=3667361 00:31:07.566 00:25:54 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:07.566 00:25:54 compress_isal -- 
compress/compress.sh@73 -- # waitforlisten 3667361 00:31:07.566 00:25:54 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 3667361 ']' 00:31:07.566 00:25:54 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:07.566 00:25:54 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:07.566 00:25:54 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:07.566 00:25:54 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:07.566 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:07.566 00:25:54 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:07.566 00:25:54 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:07.566 [2024-07-16 00:25:54.374428] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:31:07.566 [2024-07-16 00:25:54.374504] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3667361 ] 00:31:07.566 [2024-07-16 00:25:54.510626] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:07.824 [2024-07-16 00:25:54.635449] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:07.824 [2024-07-16 00:25:54.635458] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:08.389 00:25:55 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:08.389 00:25:55 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:08.389 00:25:55 compress_isal -- compress/compress.sh@74 -- # create_vols 00:31:08.389 00:25:55 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:08.389 00:25:55 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:08.954 00:25:55 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:08.954 00:25:55 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:08.954 00:25:55 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:08.954 00:25:55 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:08.954 00:25:55 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:08.954 00:25:55 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:08.954 00:25:55 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:09.212 00:25:56 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:09.470 [ 00:31:09.470 { 00:31:09.470 "name": "Nvme0n1", 00:31:09.470 "aliases": [ 00:31:09.470 "01000000-0000-0000-5cd2-e43197705251" 00:31:09.470 ], 00:31:09.470 "product_name": "NVMe disk", 00:31:09.470 "block_size": 512, 00:31:09.470 "num_blocks": 15002931888, 00:31:09.470 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:09.470 "assigned_rate_limits": { 00:31:09.470 "rw_ios_per_sec": 0, 00:31:09.470 "rw_mbytes_per_sec": 0, 00:31:09.470 "r_mbytes_per_sec": 0, 00:31:09.470 "w_mbytes_per_sec": 0 00:31:09.470 }, 00:31:09.470 "claimed": false, 00:31:09.470 "zoned": false, 00:31:09.470 "supported_io_types": { 00:31:09.470 "read": true, 00:31:09.470 "write": true, 00:31:09.470 "unmap": true, 00:31:09.470 "flush": true, 00:31:09.470 "reset": true, 00:31:09.470 "nvme_admin": true, 00:31:09.470 "nvme_io": true, 00:31:09.470 "nvme_io_md": false, 00:31:09.470 "write_zeroes": true, 00:31:09.470 "zcopy": false, 00:31:09.470 "get_zone_info": false, 00:31:09.470 "zone_management": false, 00:31:09.470 "zone_append": false, 00:31:09.470 "compare": false, 00:31:09.470 "compare_and_write": false, 00:31:09.470 "abort": true, 00:31:09.470 "seek_hole": false, 00:31:09.470 "seek_data": false, 00:31:09.470 "copy": false, 00:31:09.470 "nvme_iov_md": false 00:31:09.470 }, 00:31:09.470 "driver_specific": { 00:31:09.470 "nvme": [ 00:31:09.470 { 00:31:09.470 "pci_address": "0000:5e:00.0", 00:31:09.470 "trid": { 00:31:09.470 "trtype": "PCIe", 00:31:09.470 "traddr": "0000:5e:00.0" 00:31:09.470 }, 00:31:09.470 "ctrlr_data": { 00:31:09.470 "cntlid": 0, 00:31:09.470 "vendor_id": "0x8086", 00:31:09.470 "model_number": "INTEL SSDPF2KX076TZO", 00:31:09.470 "serial_number": "PHAC0301002G7P6CGN", 00:31:09.470 "firmware_revision": "JCV10200", 00:31:09.470 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:09.470 "oacs": { 00:31:09.470 "security": 1, 00:31:09.470 "format": 1, 00:31:09.470 "firmware": 1, 00:31:09.470 "ns_manage": 1 00:31:09.470 }, 
00:31:09.470 "multi_ctrlr": false, 00:31:09.470 "ana_reporting": false 00:31:09.470 }, 00:31:09.470 "vs": { 00:31:09.470 "nvme_version": "1.3" 00:31:09.470 }, 00:31:09.470 "ns_data": { 00:31:09.470 "id": 1, 00:31:09.470 "can_share": false 00:31:09.470 }, 00:31:09.470 "security": { 00:31:09.470 "opal": true 00:31:09.470 } 00:31:09.470 } 00:31:09.470 ], 00:31:09.470 "mp_policy": "active_passive" 00:31:09.470 } 00:31:09.470 } 00:31:09.470 ] 00:31:09.470 00:25:56 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:09.470 00:25:56 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:11.995 edec9b32-6810-46eb-a1c4-2c91d31bb6e8 00:31:11.995 00:25:58 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:12.252 be1ebe7c-5068-4774-9b2a-7f15f0faec74 00:31:12.252 00:25:59 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:12.252 00:25:59 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:12.252 00:25:59 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:12.252 00:25:59 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:12.252 00:25:59 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:12.252 00:25:59 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:12.252 00:25:59 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:12.508 00:25:59 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:12.508 [ 00:31:12.508 { 00:31:12.508 "name": "be1ebe7c-5068-4774-9b2a-7f15f0faec74", 00:31:12.508 "aliases": [ 00:31:12.508 "lvs0/lv0" 
00:31:12.508 ], 00:31:12.508 "product_name": "Logical Volume", 00:31:12.508 "block_size": 512, 00:31:12.508 "num_blocks": 204800, 00:31:12.508 "uuid": "be1ebe7c-5068-4774-9b2a-7f15f0faec74", 00:31:12.508 "assigned_rate_limits": { 00:31:12.508 "rw_ios_per_sec": 0, 00:31:12.508 "rw_mbytes_per_sec": 0, 00:31:12.508 "r_mbytes_per_sec": 0, 00:31:12.508 "w_mbytes_per_sec": 0 00:31:12.508 }, 00:31:12.508 "claimed": false, 00:31:12.508 "zoned": false, 00:31:12.508 "supported_io_types": { 00:31:12.508 "read": true, 00:31:12.508 "write": true, 00:31:12.508 "unmap": true, 00:31:12.508 "flush": false, 00:31:12.508 "reset": true, 00:31:12.508 "nvme_admin": false, 00:31:12.508 "nvme_io": false, 00:31:12.508 "nvme_io_md": false, 00:31:12.508 "write_zeroes": true, 00:31:12.508 "zcopy": false, 00:31:12.508 "get_zone_info": false, 00:31:12.508 "zone_management": false, 00:31:12.508 "zone_append": false, 00:31:12.508 "compare": false, 00:31:12.508 "compare_and_write": false, 00:31:12.508 "abort": false, 00:31:12.508 "seek_hole": true, 00:31:12.508 "seek_data": true, 00:31:12.508 "copy": false, 00:31:12.508 "nvme_iov_md": false 00:31:12.508 }, 00:31:12.508 "driver_specific": { 00:31:12.508 "lvol": { 00:31:12.508 "lvol_store_uuid": "edec9b32-6810-46eb-a1c4-2c91d31bb6e8", 00:31:12.508 "base_bdev": "Nvme0n1", 00:31:12.508 "thin_provision": true, 00:31:12.508 "num_allocated_clusters": 0, 00:31:12.508 "snapshot": false, 00:31:12.508 "clone": false, 00:31:12.508 "esnap_clone": false 00:31:12.508 } 00:31:12.508 } 00:31:12.508 } 00:31:12.508 ] 00:31:12.765 00:25:59 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:12.765 00:25:59 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:12.765 00:25:59 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:12.765 [2024-07-16 00:25:59.648562] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered 
io_device and virtual bdev for: COMP_lvs0/lv0 00:31:12.765 COMP_lvs0/lv0 00:31:12.765 00:25:59 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:12.765 00:25:59 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:12.766 00:25:59 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:12.766 00:25:59 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:12.766 00:25:59 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:12.766 00:25:59 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:12.766 00:25:59 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:13.022 00:25:59 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:13.280 [ 00:31:13.280 { 00:31:13.280 "name": "COMP_lvs0/lv0", 00:31:13.280 "aliases": [ 00:31:13.280 "db4b3aa6-e3e9-5c1a-827e-a139170f3835" 00:31:13.280 ], 00:31:13.280 "product_name": "compress", 00:31:13.280 "block_size": 512, 00:31:13.280 "num_blocks": 200704, 00:31:13.280 "uuid": "db4b3aa6-e3e9-5c1a-827e-a139170f3835", 00:31:13.280 "assigned_rate_limits": { 00:31:13.280 "rw_ios_per_sec": 0, 00:31:13.280 "rw_mbytes_per_sec": 0, 00:31:13.280 "r_mbytes_per_sec": 0, 00:31:13.280 "w_mbytes_per_sec": 0 00:31:13.280 }, 00:31:13.280 "claimed": false, 00:31:13.280 "zoned": false, 00:31:13.280 "supported_io_types": { 00:31:13.280 "read": true, 00:31:13.280 "write": true, 00:31:13.280 "unmap": false, 00:31:13.280 "flush": false, 00:31:13.280 "reset": false, 00:31:13.280 "nvme_admin": false, 00:31:13.280 "nvme_io": false, 00:31:13.280 "nvme_io_md": false, 00:31:13.280 "write_zeroes": true, 00:31:13.280 "zcopy": false, 00:31:13.280 "get_zone_info": false, 00:31:13.280 "zone_management": false, 00:31:13.280 "zone_append": 
false, 00:31:13.280 "compare": false, 00:31:13.280 "compare_and_write": false, 00:31:13.280 "abort": false, 00:31:13.280 "seek_hole": false, 00:31:13.280 "seek_data": false, 00:31:13.280 "copy": false, 00:31:13.280 "nvme_iov_md": false 00:31:13.280 }, 00:31:13.280 "driver_specific": { 00:31:13.280 "compress": { 00:31:13.280 "name": "COMP_lvs0/lv0", 00:31:13.280 "base_bdev_name": "be1ebe7c-5068-4774-9b2a-7f15f0faec74" 00:31:13.280 } 00:31:13.280 } 00:31:13.280 } 00:31:13.280 ] 00:31:13.280 00:26:00 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:13.280 00:26:00 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:13.280 Running I/O for 3 seconds... 00:31:16.562 00:31:16.562 Latency(us) 00:31:16.562 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:16.562 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:16.562 Verification LBA range: start 0x0 length 0x3100 00:31:16.562 COMP_lvs0/lv0 : 3.02 1285.83 5.02 0.00 0.00 24756.98 2407.74 21883.33 00:31:16.562 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:16.562 Verification LBA range: start 0x3100 length 0x3100 00:31:16.562 COMP_lvs0/lv0 : 3.01 1288.79 5.03 0.00 0.00 24688.75 1488.81 20287.67 00:31:16.562 =================================================================================================================== 00:31:16.562 Total : 2574.62 10.06 0.00 0.00 24722.84 1488.81 21883.33 00:31:16.562 0 00:31:16.562 00:26:03 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:16.563 00:26:03 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:16.563 00:26:03 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 
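Two of the numbers in the run above can be cross-checked by hand. This is an offline arithmetic check with values copied from the log, not an SPDK invocation: the Nvme0n1 dump reports 15002931888 blocks of 512 bytes, and the bdevperf Total row should be the sum of the two per-core verify jobs.

```shell
#!/bin/sh
# Nvme0n1 reports 15002931888 blocks of 512 bytes; that works out to a
# 7.68 TB drive, consistent with the SSDPF2KX076TZO ("076TZ") model
# string in the bdev dump.
capacity_bytes=$((15002931888 * 512))
echo "capacity_bytes=$capacity_bytes"   # 7681501126656 (~7.68 TB)

# bdevperf's Total row aggregates the two per-core jobs (core mask 0x2
# and 0x4): 1285.83 + 1288.79 IOPS.
awk 'BEGIN { printf "total_iops=%.2f\n", 1285.83 + 1288.79 }'   # 2574.62
```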
00:31:16.823 00:26:03 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:16.823 00:26:03 compress_isal -- compress/compress.sh@78 -- # killprocess 3667361 00:31:16.823 00:26:03 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 3667361 ']' 00:31:16.823 00:26:03 compress_isal -- common/autotest_common.sh@952 -- # kill -0 3667361 00:31:16.823 00:26:03 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:16.823 00:26:03 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:16.823 00:26:03 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3667361 00:31:16.823 00:26:03 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:16.823 00:26:03 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:16.823 00:26:03 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3667361' 00:31:16.823 killing process with pid 3667361 00:31:16.823 00:26:03 compress_isal -- common/autotest_common.sh@967 -- # kill 3667361 00:31:16.823 Received shutdown signal, test time was about 3.000000 seconds 00:31:16.823 00:31:16.823 Latency(us) 00:31:16.823 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:16.823 =================================================================================================================== 00:31:16.823 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:16.823 00:26:03 compress_isal -- common/autotest_common.sh@972 -- # wait 3667361 00:31:20.102 00:26:06 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:31:20.102 00:26:06 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:20.102 00:26:06 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=3668957 00:31:20.102 00:26:06 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:20.102 
00:26:06 compress_isal -- compress/compress.sh@73 -- # waitforlisten 3668957 00:31:20.102 00:26:06 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 3668957 ']' 00:31:20.102 00:26:06 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:20.102 00:26:06 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:20.102 00:26:06 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:20.102 00:26:06 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:20.102 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:20.102 00:26:06 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:20.102 00:26:06 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:20.102 [2024-07-16 00:26:06.782716] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:31:20.102 [2024-07-16 00:26:06.782771] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3668957 ] 00:31:20.102 [2024-07-16 00:26:06.901324] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:20.102 [2024-07-16 00:26:07.016800] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:20.102 [2024-07-16 00:26:07.016805] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:21.035 00:26:07 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:21.035 00:26:07 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:21.035 00:26:07 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:31:21.035 00:26:07 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:21.035 00:26:07 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:21.600 00:26:08 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:21.600 00:26:08 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:21.600 00:26:08 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:21.600 00:26:08 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:21.600 00:26:08 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:21.600 00:26:08 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:21.600 00:26:08 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:21.600 00:26:08 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:21.859 [ 00:31:21.859 { 00:31:21.859 "name": "Nvme0n1", 00:31:21.859 "aliases": [ 00:31:21.859 "01000000-0000-0000-5cd2-e43197705251" 00:31:21.859 ], 00:31:21.859 "product_name": "NVMe disk", 00:31:21.859 "block_size": 512, 00:31:21.859 "num_blocks": 15002931888, 00:31:21.859 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:21.859 "assigned_rate_limits": { 00:31:21.859 "rw_ios_per_sec": 0, 00:31:21.859 "rw_mbytes_per_sec": 0, 00:31:21.859 "r_mbytes_per_sec": 0, 00:31:21.859 "w_mbytes_per_sec": 0 00:31:21.859 }, 00:31:21.859 "claimed": false, 00:31:21.859 "zoned": false, 00:31:21.859 "supported_io_types": { 00:31:21.859 "read": true, 00:31:21.859 "write": true, 00:31:21.859 "unmap": true, 00:31:21.859 "flush": true, 00:31:21.859 "reset": true, 00:31:21.859 "nvme_admin": true, 00:31:21.859 "nvme_io": true, 00:31:21.859 "nvme_io_md": false, 00:31:21.859 "write_zeroes": true, 00:31:21.859 "zcopy": false, 00:31:21.859 "get_zone_info": false, 00:31:21.859 "zone_management": false, 00:31:21.859 "zone_append": false, 00:31:21.859 "compare": false, 00:31:21.859 "compare_and_write": false, 00:31:21.859 "abort": true, 00:31:21.859 "seek_hole": false, 00:31:21.859 "seek_data": false, 00:31:21.859 "copy": false, 00:31:21.859 "nvme_iov_md": false 00:31:21.859 }, 00:31:21.859 "driver_specific": { 00:31:21.859 "nvme": [ 00:31:21.859 { 00:31:21.859 "pci_address": "0000:5e:00.0", 00:31:21.859 "trid": { 00:31:21.859 "trtype": "PCIe", 00:31:21.859 "traddr": "0000:5e:00.0" 00:31:21.859 }, 00:31:21.859 "ctrlr_data": { 00:31:21.859 "cntlid": 0, 00:31:21.859 "vendor_id": "0x8086", 00:31:21.859 "model_number": "INTEL SSDPF2KX076TZO", 00:31:21.859 "serial_number": "PHAC0301002G7P6CGN", 00:31:21.859 "firmware_revision": "JCV10200", 00:31:21.859 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:21.859 "oacs": { 00:31:21.859 "security": 1, 00:31:21.859 "format": 1, 00:31:21.859 "firmware": 1, 00:31:21.859 "ns_manage": 1 00:31:21.859 }, 
00:31:21.859 "multi_ctrlr": false, 00:31:21.859 "ana_reporting": false 00:31:21.859 }, 00:31:21.859 "vs": { 00:31:21.859 "nvme_version": "1.3" 00:31:21.859 }, 00:31:21.859 "ns_data": { 00:31:21.859 "id": 1, 00:31:21.859 "can_share": false 00:31:21.859 }, 00:31:21.859 "security": { 00:31:21.859 "opal": true 00:31:21.859 } 00:31:21.859 } 00:31:21.859 ], 00:31:21.859 "mp_policy": "active_passive" 00:31:21.859 } 00:31:21.859 } 00:31:21.859 ] 00:31:21.859 00:26:08 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:21.859 00:26:08 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:24.386 9c29d7fa-53b8-43fa-9ff9-b3bb97eb3dd3 00:31:24.386 00:26:11 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:24.386 affbf9dd-291e-455e-92cb-b10d17b945e2 00:31:24.386 00:26:11 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:24.386 00:26:11 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:24.386 00:26:11 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:24.386 00:26:11 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:24.386 00:26:11 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:24.386 00:26:11 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:24.386 00:26:11 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:24.644 00:26:11 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:24.970 [ 00:31:24.970 { 00:31:24.970 "name": "affbf9dd-291e-455e-92cb-b10d17b945e2", 00:31:24.970 "aliases": [ 00:31:24.970 "lvs0/lv0" 
00:31:24.970 ], 00:31:24.970 "product_name": "Logical Volume", 00:31:24.970 "block_size": 512, 00:31:24.970 "num_blocks": 204800, 00:31:24.970 "uuid": "affbf9dd-291e-455e-92cb-b10d17b945e2", 00:31:24.970 "assigned_rate_limits": { 00:31:24.970 "rw_ios_per_sec": 0, 00:31:24.970 "rw_mbytes_per_sec": 0, 00:31:24.970 "r_mbytes_per_sec": 0, 00:31:24.970 "w_mbytes_per_sec": 0 00:31:24.970 }, 00:31:24.970 "claimed": false, 00:31:24.970 "zoned": false, 00:31:24.970 "supported_io_types": { 00:31:24.970 "read": true, 00:31:24.970 "write": true, 00:31:24.970 "unmap": true, 00:31:24.970 "flush": false, 00:31:24.970 "reset": true, 00:31:24.970 "nvme_admin": false, 00:31:24.970 "nvme_io": false, 00:31:24.970 "nvme_io_md": false, 00:31:24.970 "write_zeroes": true, 00:31:24.970 "zcopy": false, 00:31:24.970 "get_zone_info": false, 00:31:24.970 "zone_management": false, 00:31:24.970 "zone_append": false, 00:31:24.970 "compare": false, 00:31:24.970 "compare_and_write": false, 00:31:24.970 "abort": false, 00:31:24.970 "seek_hole": true, 00:31:24.970 "seek_data": true, 00:31:24.970 "copy": false, 00:31:24.970 "nvme_iov_md": false 00:31:24.970 }, 00:31:24.970 "driver_specific": { 00:31:24.970 "lvol": { 00:31:24.970 "lvol_store_uuid": "9c29d7fa-53b8-43fa-9ff9-b3bb97eb3dd3", 00:31:24.970 "base_bdev": "Nvme0n1", 00:31:24.970 "thin_provision": true, 00:31:24.970 "num_allocated_clusters": 0, 00:31:24.970 "snapshot": false, 00:31:24.970 "clone": false, 00:31:24.970 "esnap_clone": false 00:31:24.970 } 00:31:24.970 } 00:31:24.970 } 00:31:24.970 ] 00:31:24.970 00:26:11 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:24.970 00:26:11 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:31:24.970 00:26:11 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:31:25.228 [2024-07-16 00:26:11.943278] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered 
io_device and virtual bdev for: COMP_lvs0/lv0 00:31:25.228 COMP_lvs0/lv0 00:31:25.228 00:26:11 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:25.228 00:26:11 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:25.228 00:26:11 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:25.228 00:26:11 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:25.228 00:26:11 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:25.228 00:26:11 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:25.228 00:26:11 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:25.486 00:26:12 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:25.745 [ 00:31:25.745 { 00:31:25.745 "name": "COMP_lvs0/lv0", 00:31:25.745 "aliases": [ 00:31:25.745 "54ee328e-a6e9-59cd-b636-e4abb1edca24" 00:31:25.745 ], 00:31:25.745 "product_name": "compress", 00:31:25.745 "block_size": 512, 00:31:25.745 "num_blocks": 200704, 00:31:25.745 "uuid": "54ee328e-a6e9-59cd-b636-e4abb1edca24", 00:31:25.745 "assigned_rate_limits": { 00:31:25.745 "rw_ios_per_sec": 0, 00:31:25.745 "rw_mbytes_per_sec": 0, 00:31:25.745 "r_mbytes_per_sec": 0, 00:31:25.745 "w_mbytes_per_sec": 0 00:31:25.745 }, 00:31:25.745 "claimed": false, 00:31:25.745 "zoned": false, 00:31:25.745 "supported_io_types": { 00:31:25.745 "read": true, 00:31:25.745 "write": true, 00:31:25.745 "unmap": false, 00:31:25.745 "flush": false, 00:31:25.745 "reset": false, 00:31:25.745 "nvme_admin": false, 00:31:25.745 "nvme_io": false, 00:31:25.745 "nvme_io_md": false, 00:31:25.745 "write_zeroes": true, 00:31:25.745 "zcopy": false, 00:31:25.745 "get_zone_info": false, 00:31:25.745 "zone_management": false, 00:31:25.745 "zone_append": 
false, 00:31:25.745 "compare": false, 00:31:25.745 "compare_and_write": false, 00:31:25.745 "abort": false, 00:31:25.745 "seek_hole": false, 00:31:25.745 "seek_data": false, 00:31:25.745 "copy": false, 00:31:25.745 "nvme_iov_md": false 00:31:25.745 }, 00:31:25.745 "driver_specific": { 00:31:25.745 "compress": { 00:31:25.745 "name": "COMP_lvs0/lv0", 00:31:25.745 "base_bdev_name": "affbf9dd-291e-455e-92cb-b10d17b945e2" 00:31:25.745 } 00:31:25.745 } 00:31:25.745 } 00:31:25.745 ] 00:31:25.745 00:26:12 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:25.745 00:26:12 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:25.745 Running I/O for 3 seconds... 00:31:29.027 00:31:29.027 Latency(us) 00:31:29.027 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:29.027 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:29.027 Verification LBA range: start 0x0 length 0x3100 00:31:29.027 COMP_lvs0/lv0 : 3.01 1268.59 4.96 0.00 0.00 25110.09 2208.28 21769.35 00:31:29.027 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:29.027 Verification LBA range: start 0x3100 length 0x3100 00:31:29.027 COMP_lvs0/lv0 : 3.01 1271.17 4.97 0.00 0.00 25037.42 1517.30 20971.52 00:31:29.027 =================================================================================================================== 00:31:29.027 Total : 2539.76 9.92 0.00 0.00 25073.72 1517.30 21769.35 00:31:29.027 0 00:31:29.027 00:26:15 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:29.027 00:26:15 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:29.027 00:26:15 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 
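This second run repeats the same volume setup, the only difference being the 512-byte logical block size passed to `bdev_compress_create` via `-l` (compress.sh@44 in the trace, versus compress.sh@42 in the first run). A minimal echo-only sketch of that RPC sequence, with the names and sizes taken from the log; `rpc` here just prints the `rpc.py` invocation instead of executing it, so no running SPDK target is needed:

```shell
#!/bin/sh
# Echo-only stand-in for scripts/rpc.py, so the sequence can be shown
# without a live SPDK application listening on /var/tmp/spdk.sock.
rpc() { echo "rpc.py $*"; }

# Mirrors the create_vols steps in the trace: an lvstore on Nvme0n1, a
# 100 MiB thin-provisioned lvol, then a compress vbdev backed by the
# /tmp/pmem directory. The optional argument is the compress bdev's
# logical block size.
create_vols() {
    lb_size=$1
    rpc bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
    rpc bdev_lvol_create -t -l lvs0 lv0 100
    if [ -z "$lb_size" ]; then
        rpc bdev_compress_create -b lvs0/lv0 -p /tmp/pmem
    else
        rpc bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l "$lb_size"
    fi
}

create_vols 512
```

The third run visible below drives the same sequence with `create_vols 4096` instead.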
00:31:29.284 00:26:16 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:29.284 00:26:16 compress_isal -- compress/compress.sh@78 -- # killprocess 3668957 00:31:29.284 00:26:16 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 3668957 ']' 00:31:29.284 00:26:16 compress_isal -- common/autotest_common.sh@952 -- # kill -0 3668957 00:31:29.284 00:26:16 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:29.284 00:26:16 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:29.284 00:26:16 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3668957 00:31:29.284 00:26:16 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:29.284 00:26:16 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:29.284 00:26:16 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3668957' 00:31:29.284 killing process with pid 3668957 00:31:29.284 00:26:16 compress_isal -- common/autotest_common.sh@967 -- # kill 3668957 00:31:29.284 Received shutdown signal, test time was about 3.000000 seconds 00:31:29.284 00:31:29.284 Latency(us) 00:31:29.284 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:29.284 =================================================================================================================== 00:31:29.284 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:29.284 00:26:16 compress_isal -- common/autotest_common.sh@972 -- # wait 3668957 00:31:32.558 00:26:19 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:31:32.558 00:26:19 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:32.558 00:26:19 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=3670561 00:31:32.558 00:26:19 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 
00:31:32.558 00:26:19 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:32.558 00:26:19 compress_isal -- compress/compress.sh@73 -- # waitforlisten 3670561 00:31:32.558 00:26:19 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 3670561 ']' 00:31:32.558 00:26:19 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:32.558 00:26:19 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:32.558 00:26:19 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:32.558 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:32.558 00:26:19 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:32.558 00:26:19 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:32.558 [2024-07-16 00:26:19.299387] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:31:32.558 [2024-07-16 00:26:19.299462] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3670561 ] 00:31:32.558 [2024-07-16 00:26:19.435119] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:32.816 [2024-07-16 00:26:19.556826] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:32.816 [2024-07-16 00:26:19.556831] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:33.746 00:26:20 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:33.746 00:26:20 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:33.746 00:26:20 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:31:33.746 00:26:20 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:33.746 00:26:20 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:34.309 00:26:21 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:34.309 00:26:21 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:34.309 00:26:21 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:34.309 00:26:21 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:34.309 00:26:21 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:34.309 00:26:21 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:34.309 00:26:21 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:34.309 00:26:21 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:34.567 [ 00:31:34.567 { 00:31:34.567 "name": "Nvme0n1", 00:31:34.567 "aliases": [ 00:31:34.567 "01000000-0000-0000-5cd2-e43197705251" 00:31:34.567 ], 00:31:34.567 "product_name": "NVMe disk", 00:31:34.567 "block_size": 512, 00:31:34.567 "num_blocks": 15002931888, 00:31:34.567 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:34.567 "assigned_rate_limits": { 00:31:34.567 "rw_ios_per_sec": 0, 00:31:34.567 "rw_mbytes_per_sec": 0, 00:31:34.567 "r_mbytes_per_sec": 0, 00:31:34.567 "w_mbytes_per_sec": 0 00:31:34.567 }, 00:31:34.567 "claimed": false, 00:31:34.567 "zoned": false, 00:31:34.567 "supported_io_types": { 00:31:34.567 "read": true, 00:31:34.567 "write": true, 00:31:34.567 "unmap": true, 00:31:34.567 "flush": true, 00:31:34.567 "reset": true, 00:31:34.567 "nvme_admin": true, 00:31:34.567 "nvme_io": true, 00:31:34.567 "nvme_io_md": false, 00:31:34.567 "write_zeroes": true, 00:31:34.567 "zcopy": false, 00:31:34.567 "get_zone_info": false, 00:31:34.567 "zone_management": false, 00:31:34.567 "zone_append": false, 00:31:34.567 "compare": false, 00:31:34.567 "compare_and_write": false, 00:31:34.567 "abort": true, 00:31:34.567 "seek_hole": false, 00:31:34.567 "seek_data": false, 00:31:34.567 "copy": false, 00:31:34.567 "nvme_iov_md": false 00:31:34.567 }, 00:31:34.567 "driver_specific": { 00:31:34.567 "nvme": [ 00:31:34.567 { 00:31:34.567 "pci_address": "0000:5e:00.0", 00:31:34.567 "trid": { 00:31:34.567 "trtype": "PCIe", 00:31:34.567 "traddr": "0000:5e:00.0" 00:31:34.568 }, 00:31:34.568 "ctrlr_data": { 00:31:34.568 "cntlid": 0, 00:31:34.568 "vendor_id": "0x8086", 00:31:34.568 "model_number": "INTEL SSDPF2KX076TZO", 00:31:34.568 "serial_number": "PHAC0301002G7P6CGN", 00:31:34.568 "firmware_revision": "JCV10200", 00:31:34.568 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:34.568 "oacs": { 00:31:34.568 "security": 1, 00:31:34.568 "format": 1, 00:31:34.568 "firmware": 1, 00:31:34.568 "ns_manage": 1 00:31:34.568 }, 
00:31:34.568 "multi_ctrlr": false, 00:31:34.568 "ana_reporting": false 00:31:34.568 }, 00:31:34.568 "vs": { 00:31:34.568 "nvme_version": "1.3" 00:31:34.568 }, 00:31:34.568 "ns_data": { 00:31:34.568 "id": 1, 00:31:34.568 "can_share": false 00:31:34.568 }, 00:31:34.568 "security": { 00:31:34.568 "opal": true 00:31:34.568 } 00:31:34.568 } 00:31:34.568 ], 00:31:34.568 "mp_policy": "active_passive" 00:31:34.568 } 00:31:34.568 } 00:31:34.568 ] 00:31:34.568 00:26:21 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:34.568 00:26:21 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:37.095 4517df19-14ae-4ab9-8e09-b189cb79c65c 00:31:37.095 00:26:23 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:37.351 996084c0-b470-4aa2-a40c-57cce862a71b 00:31:37.351 00:26:24 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:37.351 00:26:24 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:37.351 00:26:24 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:37.351 00:26:24 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:37.351 00:26:24 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:37.351 00:26:24 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:37.351 00:26:24 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:37.608 00:26:24 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:37.876 [ 00:31:37.876 { 00:31:37.876 "name": "996084c0-b470-4aa2-a40c-57cce862a71b", 00:31:37.876 "aliases": [ 00:31:37.876 "lvs0/lv0" 
00:31:37.876 ], 00:31:37.876 "product_name": "Logical Volume", 00:31:37.876 "block_size": 512, 00:31:37.876 "num_blocks": 204800, 00:31:37.876 "uuid": "996084c0-b470-4aa2-a40c-57cce862a71b", 00:31:37.876 "assigned_rate_limits": { 00:31:37.876 "rw_ios_per_sec": 0, 00:31:37.876 "rw_mbytes_per_sec": 0, 00:31:37.876 "r_mbytes_per_sec": 0, 00:31:37.876 "w_mbytes_per_sec": 0 00:31:37.876 }, 00:31:37.876 "claimed": false, 00:31:37.876 "zoned": false, 00:31:37.876 "supported_io_types": { 00:31:37.876 "read": true, 00:31:37.876 "write": true, 00:31:37.876 "unmap": true, 00:31:37.876 "flush": false, 00:31:37.876 "reset": true, 00:31:37.876 "nvme_admin": false, 00:31:37.876 "nvme_io": false, 00:31:37.876 "nvme_io_md": false, 00:31:37.876 "write_zeroes": true, 00:31:37.876 "zcopy": false, 00:31:37.876 "get_zone_info": false, 00:31:37.876 "zone_management": false, 00:31:37.876 "zone_append": false, 00:31:37.876 "compare": false, 00:31:37.876 "compare_and_write": false, 00:31:37.876 "abort": false, 00:31:37.876 "seek_hole": true, 00:31:37.876 "seek_data": true, 00:31:37.876 "copy": false, 00:31:37.876 "nvme_iov_md": false 00:31:37.876 }, 00:31:37.876 "driver_specific": { 00:31:37.876 "lvol": { 00:31:37.876 "lvol_store_uuid": "4517df19-14ae-4ab9-8e09-b189cb79c65c", 00:31:37.876 "base_bdev": "Nvme0n1", 00:31:37.876 "thin_provision": true, 00:31:37.876 "num_allocated_clusters": 0, 00:31:37.876 "snapshot": false, 00:31:37.876 "clone": false, 00:31:37.876 "esnap_clone": false 00:31:37.876 } 00:31:37.876 } 00:31:37.876 } 00:31:37.876 ] 00:31:37.876 00:26:24 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:37.876 00:26:24 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:31:37.876 00:26:24 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:31:38.133 [2024-07-16 00:26:24.846094] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: 
registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:38.133 COMP_lvs0/lv0 00:31:38.133 00:26:24 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:38.133 00:26:24 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:38.133 00:26:24 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:38.133 00:26:24 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:38.133 00:26:24 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:38.133 00:26:24 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:38.133 00:26:24 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:38.390 00:26:25 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:38.391 [ 00:31:38.391 { 00:31:38.391 "name": "COMP_lvs0/lv0", 00:31:38.391 "aliases": [ 00:31:38.391 "3597117f-b245-5999-b993-d4efb91681f1" 00:31:38.391 ], 00:31:38.391 "product_name": "compress", 00:31:38.391 "block_size": 4096, 00:31:38.391 "num_blocks": 25088, 00:31:38.391 "uuid": "3597117f-b245-5999-b993-d4efb91681f1", 00:31:38.391 "assigned_rate_limits": { 00:31:38.391 "rw_ios_per_sec": 0, 00:31:38.391 "rw_mbytes_per_sec": 0, 00:31:38.391 "r_mbytes_per_sec": 0, 00:31:38.391 "w_mbytes_per_sec": 0 00:31:38.391 }, 00:31:38.391 "claimed": false, 00:31:38.391 "zoned": false, 00:31:38.391 "supported_io_types": { 00:31:38.391 "read": true, 00:31:38.391 "write": true, 00:31:38.391 "unmap": false, 00:31:38.391 "flush": false, 00:31:38.391 "reset": false, 00:31:38.391 "nvme_admin": false, 00:31:38.391 "nvme_io": false, 00:31:38.391 "nvme_io_md": false, 00:31:38.391 "write_zeroes": true, 00:31:38.391 "zcopy": false, 00:31:38.391 "get_zone_info": false, 00:31:38.391 "zone_management": false, 00:31:38.391 
"zone_append": false, 00:31:38.391 "compare": false, 00:31:38.391 "compare_and_write": false, 00:31:38.391 "abort": false, 00:31:38.391 "seek_hole": false, 00:31:38.391 "seek_data": false, 00:31:38.391 "copy": false, 00:31:38.391 "nvme_iov_md": false 00:31:38.391 }, 00:31:38.391 "driver_specific": { 00:31:38.391 "compress": { 00:31:38.391 "name": "COMP_lvs0/lv0", 00:31:38.391 "base_bdev_name": "996084c0-b470-4aa2-a40c-57cce862a71b" 00:31:38.391 } 00:31:38.391 } 00:31:38.391 } 00:31:38.391 ] 00:31:38.391 00:26:25 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:38.391 00:26:25 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:38.650 Running I/O for 3 seconds... 00:31:41.935 00:31:41.935 Latency(us) 00:31:41.935 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:41.935 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:41.935 Verification LBA range: start 0x0 length 0x3100 00:31:41.935 COMP_lvs0/lv0 : 3.01 1280.27 5.00 0.00 0.00 24881.11 2293.76 21427.42 00:31:41.935 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:41.935 Verification LBA range: start 0x3100 length 0x3100 00:31:41.935 COMP_lvs0/lv0 : 3.01 1282.87 5.01 0.00 0.00 24807.90 1495.93 20971.52 00:31:41.935 =================================================================================================================== 00:31:41.935 Total : 2563.13 10.01 0.00 0.00 24844.47 1495.93 21427.42 00:31:41.935 0 00:31:41.935 00:26:28 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:41.935 00:26:28 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:41.935 00:26:28 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_lvol_delete_lvstore -l lvs0 00:31:42.193 00:26:29 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:42.193 00:26:29 compress_isal -- compress/compress.sh@78 -- # killprocess 3670561 00:31:42.193 00:26:29 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 3670561 ']' 00:31:42.193 00:26:29 compress_isal -- common/autotest_common.sh@952 -- # kill -0 3670561 00:31:42.193 00:26:29 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:42.193 00:26:29 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:42.193 00:26:29 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3670561 00:31:42.193 00:26:29 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:42.193 00:26:29 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:42.193 00:26:29 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3670561' 00:31:42.193 killing process with pid 3670561 00:31:42.193 00:26:29 compress_isal -- common/autotest_common.sh@967 -- # kill 3670561 00:31:42.193 Received shutdown signal, test time was about 3.000000 seconds 00:31:42.193 00:31:42.193 Latency(us) 00:31:42.193 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:42.193 =================================================================================================================== 00:31:42.193 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:42.193 00:26:29 compress_isal -- common/autotest_common.sh@972 -- # wait 3670561 00:31:45.474 00:26:32 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:31:45.474 00:26:32 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:45.474 00:26:32 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=3672160 00:31:45.474 00:26:32 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 
00:31:45.474 00:26:32 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:31:45.474 00:26:32 compress_isal -- compress/compress.sh@57 -- # waitforlisten 3672160 00:31:45.474 00:26:32 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 3672160 ']' 00:31:45.474 00:26:32 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:45.474 00:26:32 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:45.474 00:26:32 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:45.474 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:45.474 00:26:32 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:45.474 00:26:32 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:45.474 [2024-07-16 00:26:32.173730] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:31:45.474 [2024-07-16 00:26:32.173801] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3672160 ] 00:31:45.474 [2024-07-16 00:26:32.360363] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:45.774 [2024-07-16 00:26:32.470009] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:45.774 [2024-07-16 00:26:32.470107] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:45.774 [2024-07-16 00:26:32.470109] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:46.338 00:26:33 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:46.338 00:26:33 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:46.339 00:26:33 compress_isal -- compress/compress.sh@58 -- # create_vols 00:31:46.339 00:26:33 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:46.339 00:26:33 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:46.904 00:26:33 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:46.904 00:26:33 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:46.904 00:26:33 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:46.904 00:26:33 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:46.904 00:26:33 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:46.904 00:26:33 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:46.904 00:26:33 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:47.165 00:26:33 compress_isal -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:47.423 [ 00:31:47.423 { 00:31:47.423 "name": "Nvme0n1", 00:31:47.423 "aliases": [ 00:31:47.423 "01000000-0000-0000-5cd2-e43197705251" 00:31:47.423 ], 00:31:47.423 "product_name": "NVMe disk", 00:31:47.423 "block_size": 512, 00:31:47.423 "num_blocks": 15002931888, 00:31:47.423 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:47.423 "assigned_rate_limits": { 00:31:47.423 "rw_ios_per_sec": 0, 00:31:47.423 "rw_mbytes_per_sec": 0, 00:31:47.423 "r_mbytes_per_sec": 0, 00:31:47.423 "w_mbytes_per_sec": 0 00:31:47.423 }, 00:31:47.423 "claimed": false, 00:31:47.423 "zoned": false, 00:31:47.423 "supported_io_types": { 00:31:47.423 "read": true, 00:31:47.423 "write": true, 00:31:47.423 "unmap": true, 00:31:47.423 "flush": true, 00:31:47.423 "reset": true, 00:31:47.423 "nvme_admin": true, 00:31:47.423 "nvme_io": true, 00:31:47.423 "nvme_io_md": false, 00:31:47.423 "write_zeroes": true, 00:31:47.423 "zcopy": false, 00:31:47.423 "get_zone_info": false, 00:31:47.423 "zone_management": false, 00:31:47.423 "zone_append": false, 00:31:47.423 "compare": false, 00:31:47.423 "compare_and_write": false, 00:31:47.423 "abort": true, 00:31:47.423 "seek_hole": false, 00:31:47.423 "seek_data": false, 00:31:47.423 "copy": false, 00:31:47.423 "nvme_iov_md": false 00:31:47.423 }, 00:31:47.423 "driver_specific": { 00:31:47.423 "nvme": [ 00:31:47.423 { 00:31:47.423 "pci_address": "0000:5e:00.0", 00:31:47.423 "trid": { 00:31:47.423 "trtype": "PCIe", 00:31:47.423 "traddr": "0000:5e:00.0" 00:31:47.423 }, 00:31:47.423 "ctrlr_data": { 00:31:47.423 "cntlid": 0, 00:31:47.423 "vendor_id": "0x8086", 00:31:47.423 "model_number": "INTEL SSDPF2KX076TZO", 00:31:47.423 "serial_number": "PHAC0301002G7P6CGN", 00:31:47.423 "firmware_revision": "JCV10200", 00:31:47.423 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:47.423 "oacs": { 00:31:47.423 "security": 1, 
00:31:47.423 "format": 1, 00:31:47.423 "firmware": 1, 00:31:47.423 "ns_manage": 1 00:31:47.423 }, 00:31:47.423 "multi_ctrlr": false, 00:31:47.423 "ana_reporting": false 00:31:47.423 }, 00:31:47.423 "vs": { 00:31:47.423 "nvme_version": "1.3" 00:31:47.423 }, 00:31:47.423 "ns_data": { 00:31:47.423 "id": 1, 00:31:47.423 "can_share": false 00:31:47.423 }, 00:31:47.423 "security": { 00:31:47.423 "opal": true 00:31:47.423 } 00:31:47.423 } 00:31:47.423 ], 00:31:47.423 "mp_policy": "active_passive" 00:31:47.423 } 00:31:47.423 } 00:31:47.423 ] 00:31:47.423 00:26:34 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:47.423 00:26:34 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:49.957 5ac35e81-0595-49c9-a1b0-0fba797fc6a0 00:31:49.957 00:26:36 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:50.214 aa33c6cb-d0e3-48d1-8c66-c7380e068fa4 00:31:50.214 00:26:36 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:50.214 00:26:36 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:50.214 00:26:36 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:50.214 00:26:36 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:50.214 00:26:36 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:50.214 00:26:36 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:50.214 00:26:36 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:50.472 00:26:37 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:50.729 [ 00:31:50.729 { 00:31:50.729 
"name": "aa33c6cb-d0e3-48d1-8c66-c7380e068fa4", 00:31:50.729 "aliases": [ 00:31:50.729 "lvs0/lv0" 00:31:50.729 ], 00:31:50.729 "product_name": "Logical Volume", 00:31:50.729 "block_size": 512, 00:31:50.729 "num_blocks": 204800, 00:31:50.729 "uuid": "aa33c6cb-d0e3-48d1-8c66-c7380e068fa4", 00:31:50.729 "assigned_rate_limits": { 00:31:50.729 "rw_ios_per_sec": 0, 00:31:50.729 "rw_mbytes_per_sec": 0, 00:31:50.729 "r_mbytes_per_sec": 0, 00:31:50.729 "w_mbytes_per_sec": 0 00:31:50.729 }, 00:31:50.729 "claimed": false, 00:31:50.729 "zoned": false, 00:31:50.729 "supported_io_types": { 00:31:50.729 "read": true, 00:31:50.729 "write": true, 00:31:50.729 "unmap": true, 00:31:50.729 "flush": false, 00:31:50.729 "reset": true, 00:31:50.729 "nvme_admin": false, 00:31:50.729 "nvme_io": false, 00:31:50.729 "nvme_io_md": false, 00:31:50.729 "write_zeroes": true, 00:31:50.729 "zcopy": false, 00:31:50.729 "get_zone_info": false, 00:31:50.729 "zone_management": false, 00:31:50.729 "zone_append": false, 00:31:50.729 "compare": false, 00:31:50.729 "compare_and_write": false, 00:31:50.729 "abort": false, 00:31:50.729 "seek_hole": true, 00:31:50.729 "seek_data": true, 00:31:50.729 "copy": false, 00:31:50.729 "nvme_iov_md": false 00:31:50.729 }, 00:31:50.729 "driver_specific": { 00:31:50.729 "lvol": { 00:31:50.729 "lvol_store_uuid": "5ac35e81-0595-49c9-a1b0-0fba797fc6a0", 00:31:50.729 "base_bdev": "Nvme0n1", 00:31:50.729 "thin_provision": true, 00:31:50.729 "num_allocated_clusters": 0, 00:31:50.729 "snapshot": false, 00:31:50.729 "clone": false, 00:31:50.729 "esnap_clone": false 00:31:50.729 } 00:31:50.729 } 00:31:50.729 } 00:31:50.729 ] 00:31:50.729 00:26:37 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:50.729 00:26:37 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:50.729 00:26:37 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:50.986 
[2024-07-16 00:26:37.725993] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:50.986 COMP_lvs0/lv0 00:31:50.986 00:26:37 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:50.986 00:26:37 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:50.987 00:26:37 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:50.987 00:26:37 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:50.987 00:26:37 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:50.987 00:26:37 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:50.987 00:26:37 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:51.244 00:26:38 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:51.501 [ 00:31:51.501 { 00:31:51.501 "name": "COMP_lvs0/lv0", 00:31:51.501 "aliases": [ 00:31:51.501 "a859dfc2-d125-5294-a18f-c097ffb04f82" 00:31:51.501 ], 00:31:51.501 "product_name": "compress", 00:31:51.501 "block_size": 512, 00:31:51.501 "num_blocks": 200704, 00:31:51.501 "uuid": "a859dfc2-d125-5294-a18f-c097ffb04f82", 00:31:51.501 "assigned_rate_limits": { 00:31:51.501 "rw_ios_per_sec": 0, 00:31:51.501 "rw_mbytes_per_sec": 0, 00:31:51.501 "r_mbytes_per_sec": 0, 00:31:51.501 "w_mbytes_per_sec": 0 00:31:51.501 }, 00:31:51.501 "claimed": false, 00:31:51.501 "zoned": false, 00:31:51.501 "supported_io_types": { 00:31:51.501 "read": true, 00:31:51.501 "write": true, 00:31:51.501 "unmap": false, 00:31:51.501 "flush": false, 00:31:51.501 "reset": false, 00:31:51.501 "nvme_admin": false, 00:31:51.501 "nvme_io": false, 00:31:51.501 "nvme_io_md": false, 00:31:51.501 "write_zeroes": true, 00:31:51.501 "zcopy": false, 00:31:51.501 
"get_zone_info": false, 00:31:51.501 "zone_management": false, 00:31:51.501 "zone_append": false, 00:31:51.501 "compare": false, 00:31:51.501 "compare_and_write": false, 00:31:51.501 "abort": false, 00:31:51.501 "seek_hole": false, 00:31:51.501 "seek_data": false, 00:31:51.501 "copy": false, 00:31:51.501 "nvme_iov_md": false 00:31:51.501 }, 00:31:51.501 "driver_specific": { 00:31:51.501 "compress": { 00:31:51.501 "name": "COMP_lvs0/lv0", 00:31:51.502 "base_bdev_name": "aa33c6cb-d0e3-48d1-8c66-c7380e068fa4" 00:31:51.502 } 00:31:51.502 } 00:31:51.502 } 00:31:51.502 ] 00:31:51.502 00:26:38 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:51.502 00:26:38 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:51.502 I/O targets: 00:31:51.502 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:31:51.502 00:31:51.502 00:31:51.502 CUnit - A unit testing framework for C - Version 2.1-3 00:31:51.502 http://cunit.sourceforge.net/ 00:31:51.502 00:31:51.502 00:31:51.502 Suite: bdevio tests on: COMP_lvs0/lv0 00:31:51.502 Test: blockdev write read block ...passed 00:31:51.502 Test: blockdev write zeroes read block ...passed 00:31:51.502 Test: blockdev write zeroes read no split ...passed 00:31:51.502 Test: blockdev write zeroes read split ...passed 00:31:51.759 Test: blockdev write zeroes read split partial ...passed 00:31:51.759 Test: blockdev reset ...[2024-07-16 00:26:38.506642] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:31:51.759 passed 00:31:51.759 Test: blockdev write read 8 blocks ...passed 00:31:51.759 Test: blockdev write read size > 128k ...passed 00:31:51.759 Test: blockdev write read invalid size ...passed 00:31:51.759 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:51.759 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:51.759 Test: blockdev write read max offset 
...passed 00:31:51.759 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:51.759 Test: blockdev writev readv 8 blocks ...passed 00:31:51.759 Test: blockdev writev readv 30 x 1block ...passed 00:31:51.759 Test: blockdev writev readv block ...passed 00:31:51.759 Test: blockdev writev readv size > 128k ...passed 00:31:51.759 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:51.759 Test: blockdev comparev and writev ...passed 00:31:51.759 Test: blockdev nvme passthru rw ...passed 00:31:51.759 Test: blockdev nvme passthru vendor specific ...passed 00:31:51.759 Test: blockdev nvme admin passthru ...passed 00:31:51.759 Test: blockdev copy ...passed 00:31:51.759 00:31:51.759 Run Summary: Type Total Ran Passed Failed Inactive 00:31:51.759 suites 1 1 n/a 0 0 00:31:51.759 tests 23 23 23 0 0 00:31:51.759 asserts 130 130 130 0 n/a 00:31:51.759 00:31:51.759 Elapsed time = 0.293 seconds 00:31:51.759 0 00:31:51.759 00:26:38 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:31:51.759 00:26:38 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:52.016 00:26:38 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:52.273 00:26:39 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:31:52.273 00:26:39 compress_isal -- compress/compress.sh@62 -- # killprocess 3672160 00:31:52.273 00:26:39 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 3672160 ']' 00:31:52.273 00:26:39 compress_isal -- common/autotest_common.sh@952 -- # kill -0 3672160 00:31:52.273 00:26:39 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:52.273 00:26:39 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:52.273 00:26:39 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 
3672160 00:31:52.273 00:26:39 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:52.273 00:26:39 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:52.273 00:26:39 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3672160' 00:31:52.273 killing process with pid 3672160 00:31:52.273 00:26:39 compress_isal -- common/autotest_common.sh@967 -- # kill 3672160 00:31:52.273 00:26:39 compress_isal -- common/autotest_common.sh@972 -- # wait 3672160 00:31:55.549 00:26:42 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:31:55.549 00:26:42 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:31:55.549 00:31:55.549 real 0m47.936s 00:31:55.549 user 1m51.329s 00:31:55.549 sys 0m4.492s 00:31:55.549 00:26:42 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:55.549 00:26:42 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:55.549 ************************************ 00:31:55.549 END TEST compress_isal 00:31:55.549 ************************************ 00:31:55.549 00:26:42 -- common/autotest_common.sh@1142 -- # return 0 00:31:55.549 00:26:42 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:31:55.549 00:26:42 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:31:55.549 00:26:42 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:31:55.549 00:26:42 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:55.549 00:26:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:55.549 00:26:42 -- common/autotest_common.sh@10 -- # set +x 00:31:55.549 ************************************ 00:31:55.549 START TEST blockdev_crypto_aesni 00:31:55.549 ************************************ 00:31:55.549 00:26:42 blockdev_crypto_aesni -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 
crypto_aesni 00:31:55.549 * Looking for test storage... 00:31:55.549 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:31:55.549 00:26:42 blockdev_crypto_aesni -- 
bdev/blockdev.sh@684 -- # dek= 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=3673639 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:31:55.549 00:26:42 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 3673639 00:31:55.549 00:26:42 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 3673639 ']' 00:31:55.549 00:26:42 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:55.549 00:26:42 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:55.549 00:26:42 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:55.549 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:31:55.549 00:26:42 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:55.549 00:26:42 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:55.549 [2024-07-16 00:26:42.368040] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:31:55.549 [2024-07-16 00:26:42.368138] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3673639 ] 00:31:55.549 [2024-07-16 00:26:42.498703] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:55.807 [2024-07-16 00:26:42.596554] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:56.372 00:26:43 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:56.372 00:26:43 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:31:56.372 00:26:43 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:31:56.372 00:26:43 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:31:56.372 00:26:43 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:31:56.372 00:26:43 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:56.372 00:26:43 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:56.372 [2024-07-16 00:26:43.306813] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:56.372 [2024-07-16 00:26:43.314848] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:56.372 [2024-07-16 00:26:43.322874] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:56.630 [2024-07-16 00:26:43.397401] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: 
*NOTICE*: Found crypto devices: 97 00:31:59.160 true 00:31:59.160 true 00:31:59.160 true 00:31:59.160 true 00:31:59.160 Malloc0 00:31:59.160 Malloc1 00:31:59.160 Malloc2 00:31:59.160 Malloc3 00:31:59.160 [2024-07-16 00:26:45.800368] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:59.160 crypto_ram 00:31:59.160 [2024-07-16 00:26:45.808364] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:59.160 crypto_ram2 00:31:59.160 [2024-07-16 00:26:45.816384] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:59.160 crypto_ram3 00:31:59.160 [2024-07-16 00:26:45.824407] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:59.160 crypto_ram4 00:31:59.160 00:26:45 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:59.160 00:26:45 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:31:59.160 00:26:45 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:59.160 00:26:45 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:59.160 00:26:45 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:59.160 00:26:45 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:31:59.160 00:26:45 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:31:59.160 00:26:45 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:59.160 00:26:45 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:59.160 00:26:45 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:59.160 00:26:45 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:31:59.160 00:26:45 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:59.160 00:26:45 
blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:59.160 00:26:45 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:59.160 00:26:45 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:31:59.160 00:26:45 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:59.160 00:26:45 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:59.160 00:26:45 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:59.160 00:26:45 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:31:59.160 00:26:45 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:31:59.160 00:26:45 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:31:59.160 00:26:45 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:59.160 00:26:45 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:59.160 00:26:45 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:59.160 00:26:46 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:31:59.160 00:26:46 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:31:59.160 00:26:46 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "39c26cf9-44f1-537d-8d9d-6e4e74a49194"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "39c26cf9-44f1-537d-8d9d-6e4e74a49194",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": 
true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "f7f3bc9c-1916-5687-a22f-4ae1d3021a51"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f7f3bc9c-1916-5687-a22f-4ae1d3021a51",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' 
"ff70128b-c2b0-5d2b-8a0a-f3ee55a8d758"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ff70128b-c2b0-5d2b-8a0a-f3ee55a8d758",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "1d478c03-18cf-58c2-8185-605f403e1339"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "1d478c03-18cf-58c2-8185-605f403e1339",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": 
false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:31:59.160 00:26:46 blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:31:59.160 00:26:46 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:31:59.160 00:26:46 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:31:59.160 00:26:46 blockdev_crypto_aesni -- bdev/blockdev.sh@754 -- # killprocess 3673639 00:31:59.160 00:26:46 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 3673639 ']' 00:31:59.160 00:26:46 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 3673639 00:31:59.160 00:26:46 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:31:59.160 00:26:46 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:59.160 00:26:46 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3673639 00:31:59.160 00:26:46 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:59.160 00:26:46 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:59.161 00:26:46 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3673639' 00:31:59.161 killing process with pid 3673639 00:31:59.161 00:26:46 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 3673639 00:31:59.161 00:26:46 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 3673639 00:31:59.726 00:26:46 
blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:31:59.726 00:26:46 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:59.726 00:26:46 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:31:59.726 00:26:46 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:59.726 00:26:46 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:59.984 ************************************ 00:31:59.984 START TEST bdev_hello_world 00:31:59.984 ************************************ 00:31:59.984 00:26:46 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:59.984 [2024-07-16 00:26:46.775546] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:31:59.984 [2024-07-16 00:26:46.775605] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3674192 ] 00:31:59.984 [2024-07-16 00:26:46.903916] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:00.242 [2024-07-16 00:26:47.002115] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:00.242 [2024-07-16 00:26:47.023413] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:00.242 [2024-07-16 00:26:47.031440] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:00.242 [2024-07-16 00:26:47.039468] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:00.242 [2024-07-16 00:26:47.142955] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:02.764 [2024-07-16 00:26:49.372797] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:02.764 [2024-07-16 00:26:49.372865] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:02.764 [2024-07-16 00:26:49.372880] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:02.764 [2024-07-16 00:26:49.380819] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:02.764 [2024-07-16 00:26:49.380839] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:02.764 [2024-07-16 00:26:49.380851] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:02.764 [2024-07-16 00:26:49.388840] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found 
key "test_dek_aesni_cbc_3" 00:32:02.764 [2024-07-16 00:26:49.388859] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:02.764 [2024-07-16 00:26:49.388870] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:02.764 [2024-07-16 00:26:49.396861] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:02.764 [2024-07-16 00:26:49.396879] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:02.764 [2024-07-16 00:26:49.396891] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:02.764 [2024-07-16 00:26:49.474376] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:32:02.764 [2024-07-16 00:26:49.474420] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:32:02.764 [2024-07-16 00:26:49.474439] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:32:02.764 [2024-07-16 00:26:49.475710] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:32:02.764 [2024-07-16 00:26:49.475781] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:32:02.764 [2024-07-16 00:26:49.475798] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:32:02.765 [2024-07-16 00:26:49.475843] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:32:02.765 00:32:02.765 [2024-07-16 00:26:49.475862] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:32:03.022 00:32:03.022 real 0m3.182s 00:32:03.022 user 0m2.781s 00:32:03.022 sys 0m0.361s 00:32:03.022 00:26:49 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:03.022 00:26:49 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:32:03.022 ************************************ 00:32:03.022 END TEST bdev_hello_world 00:32:03.022 ************************************ 00:32:03.022 00:26:49 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:03.022 00:26:49 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:32:03.022 00:26:49 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:03.022 00:26:49 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:03.022 00:26:49 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:03.279 ************************************ 00:32:03.279 START TEST bdev_bounds 00:32:03.279 ************************************ 00:32:03.279 00:26:49 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:32:03.279 00:26:49 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=3674563 00:32:03.280 00:26:49 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:32:03.280 00:26:49 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:03.280 00:26:49 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 3674563' 00:32:03.280 Process bdevio pid: 3674563 00:32:03.280 00:26:49 
blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 3674563 00:32:03.280 00:26:49 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 3674563 ']' 00:32:03.280 00:26:49 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:03.280 00:26:49 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:03.280 00:26:49 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:03.280 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:03.280 00:26:49 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:03.280 00:26:49 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:03.280 [2024-07-16 00:26:50.040499] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:32:03.280 [2024-07-16 00:26:50.040573] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3674563 ] 00:32:03.280 [2024-07-16 00:26:50.177930] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:03.537 [2024-07-16 00:26:50.292046] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:03.537 [2024-07-16 00:26:50.292088] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:03.537 [2024-07-16 00:26:50.292091] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:03.537 [2024-07-16 00:26:50.313639] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:03.537 [2024-07-16 00:26:50.321663] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:03.537 [2024-07-16 00:26:50.329688] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:03.537 [2024-07-16 00:26:50.436315] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:06.135 [2024-07-16 00:26:52.656843] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:06.135 [2024-07-16 00:26:52.656936] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:06.135 [2024-07-16 00:26:52.656951] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:06.136 [2024-07-16 00:26:52.664863] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:06.136 [2024-07-16 00:26:52.664882] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:06.136 [2024-07-16 
00:26:52.664894] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:06.136 [2024-07-16 00:26:52.672884] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:06.136 [2024-07-16 00:26:52.672904] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:06.136 [2024-07-16 00:26:52.672921] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:06.136 [2024-07-16 00:26:52.680906] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:06.136 [2024-07-16 00:26:52.680923] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:06.136 [2024-07-16 00:26:52.680941] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:06.136 00:26:52 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:06.136 00:26:52 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:32:06.136 00:26:52 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:32:06.136 I/O targets: 00:32:06.136 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:32:06.136 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:32:06.136 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:32:06.136 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:32:06.136 00:32:06.136 00:32:06.136 CUnit - A unit testing framework for C - Version 2.1-3 00:32:06.136 http://cunit.sourceforge.net/ 00:32:06.136 00:32:06.136 00:32:06.136 Suite: bdevio tests on: crypto_ram4 00:32:06.136 Test: blockdev write read block ...passed 00:32:06.136 Test: blockdev write zeroes read block ...passed 00:32:06.136 Test: blockdev write zeroes read no split ...passed 00:32:06.136 Test: blockdev 
write zeroes read split ...passed 00:32:06.136 Test: blockdev write zeroes read split partial ...passed 00:32:06.136 Test: blockdev reset ...passed 00:32:06.136 Test: blockdev write read 8 blocks ...passed 00:32:06.136 Test: blockdev write read size > 128k ...passed 00:32:06.136 Test: blockdev write read invalid size ...passed 00:32:06.136 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:06.136 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:06.136 Test: blockdev write read max offset ...passed 00:32:06.136 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:06.136 Test: blockdev writev readv 8 blocks ...passed 00:32:06.136 Test: blockdev writev readv 30 x 1block ...passed 00:32:06.136 Test: blockdev writev readv block ...passed 00:32:06.136 Test: blockdev writev readv size > 128k ...passed 00:32:06.136 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:06.136 Test: blockdev comparev and writev ...passed 00:32:06.136 Test: blockdev nvme passthru rw ...passed 00:32:06.136 Test: blockdev nvme passthru vendor specific ...passed 00:32:06.136 Test: blockdev nvme admin passthru ...passed 00:32:06.136 Test: blockdev copy ...passed 00:32:06.136 Suite: bdevio tests on: crypto_ram3 00:32:06.136 Test: blockdev write read block ...passed 00:32:06.136 Test: blockdev write zeroes read block ...passed 00:32:06.136 Test: blockdev write zeroes read no split ...passed 00:32:06.136 Test: blockdev write zeroes read split ...passed 00:32:06.136 Test: blockdev write zeroes read split partial ...passed 00:32:06.136 Test: blockdev reset ...passed 00:32:06.136 Test: blockdev write read 8 blocks ...passed 00:32:06.136 Test: blockdev write read size > 128k ...passed 00:32:06.136 Test: blockdev write read invalid size ...passed 00:32:06.136 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:06.136 Test: blockdev write read offset + nbytes > size of blockdev 
...passed 00:32:06.136 Test: blockdev write read max offset ...passed 00:32:06.136 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:06.136 Test: blockdev writev readv 8 blocks ...passed 00:32:06.136 Test: blockdev writev readv 30 x 1block ...passed 00:32:06.136 Test: blockdev writev readv block ...passed 00:32:06.136 Test: blockdev writev readv size > 128k ...passed 00:32:06.136 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:06.136 Test: blockdev comparev and writev ...passed 00:32:06.136 Test: blockdev nvme passthru rw ...passed 00:32:06.136 Test: blockdev nvme passthru vendor specific ...passed 00:32:06.136 Test: blockdev nvme admin passthru ...passed 00:32:06.136 Test: blockdev copy ...passed 00:32:06.136 Suite: bdevio tests on: crypto_ram2 00:32:06.136 Test: blockdev write read block ...passed 00:32:06.136 Test: blockdev write zeroes read block ...passed 00:32:06.136 Test: blockdev write zeroes read no split ...passed 00:32:06.393 Test: blockdev write zeroes read split ...passed 00:32:06.393 Test: blockdev write zeroes read split partial ...passed 00:32:06.393 Test: blockdev reset ...passed 00:32:06.393 Test: blockdev write read 8 blocks ...passed 00:32:06.393 Test: blockdev write read size > 128k ...passed 00:32:06.393 Test: blockdev write read invalid size ...passed 00:32:06.393 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:06.393 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:06.393 Test: blockdev write read max offset ...passed 00:32:06.393 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:06.393 Test: blockdev writev readv 8 blocks ...passed 00:32:06.393 Test: blockdev writev readv 30 x 1block ...passed 00:32:06.393 Test: blockdev writev readv block ...passed 00:32:06.649 Test: blockdev writev readv size > 128k ...passed 00:32:06.649 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:06.649 Test: 
blockdev comparev and writev ...passed 00:32:06.649 Test: blockdev nvme passthru rw ...passed 00:32:06.649 Test: blockdev nvme passthru vendor specific ...passed 00:32:06.649 Test: blockdev nvme admin passthru ...passed 00:32:06.649 Test: blockdev copy ...passed 00:32:06.649 Suite: bdevio tests on: crypto_ram 00:32:06.649 Test: blockdev write read block ...passed 00:32:06.649 Test: blockdev write zeroes read block ...passed 00:32:06.649 Test: blockdev write zeroes read no split ...passed 00:32:06.649 Test: blockdev write zeroes read split ...passed 00:32:06.907 Test: blockdev write zeroes read split partial ...passed 00:32:06.907 Test: blockdev reset ...passed 00:32:06.907 Test: blockdev write read 8 blocks ...passed 00:32:06.907 Test: blockdev write read size > 128k ...passed 00:32:06.907 Test: blockdev write read invalid size ...passed 00:32:06.907 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:06.907 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:06.907 Test: blockdev write read max offset ...passed 00:32:06.907 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:06.907 Test: blockdev writev readv 8 blocks ...passed 00:32:06.907 Test: blockdev writev readv 30 x 1block ...passed 00:32:06.907 Test: blockdev writev readv block ...passed 00:32:06.907 Test: blockdev writev readv size > 128k ...passed 00:32:06.907 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:06.907 Test: blockdev comparev and writev ...passed 00:32:06.907 Test: blockdev nvme passthru rw ...passed 00:32:06.907 Test: blockdev nvme passthru vendor specific ...passed 00:32:06.907 Test: blockdev nvme admin passthru ...passed 00:32:06.907 Test: blockdev copy ...passed 00:32:06.907 00:32:06.907 Run Summary: Type Total Ran Passed Failed Inactive 00:32:06.907 suites 4 4 n/a 0 0 00:32:06.907 tests 92 92 92 0 0 00:32:06.907 asserts 520 520 520 0 n/a 00:32:06.907 00:32:06.907 Elapsed time = 1.638 
seconds 00:32:06.907 0 00:32:06.907 00:26:53 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 3674563 00:32:06.907 00:26:53 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 3674563 ']' 00:32:06.907 00:26:53 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 3674563 00:32:06.907 00:26:53 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:32:06.907 00:26:53 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:06.907 00:26:53 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3674563 00:32:06.907 00:26:53 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:06.907 00:26:53 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:06.907 00:26:53 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3674563' 00:32:06.907 killing process with pid 3674563 00:32:06.907 00:26:53 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 3674563 00:32:06.907 00:26:53 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 3674563 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:32:07.473 00:32:07.473 real 0m4.204s 00:32:07.473 user 0m11.220s 00:32:07.473 sys 0m0.589s 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:07.473 ************************************ 00:32:07.473 END TEST bdev_bounds 00:32:07.473 ************************************ 00:32:07.473 00:26:54 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:07.473 00:26:54 
blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:32:07.473 00:26:54 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:32:07.473 00:26:54 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:07.473 00:26:54 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:07.473 ************************************ 00:32:07.473 START TEST bdev_nbd 00:32:07.473 ************************************ 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' 
'/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=3675125 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 3675125 /var/tmp/spdk-nbd.sock 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 3675125 ']' 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:32:07.473 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:07.473 00:26:54 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:07.473 [2024-07-16 00:26:54.334586] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:32:07.473 [2024-07-16 00:26:54.334650] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:07.731 [2024-07-16 00:26:54.466494] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:07.731 [2024-07-16 00:26:54.570969] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:07.731 [2024-07-16 00:26:54.592338] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:07.731 [2024-07-16 00:26:54.600360] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:07.731 [2024-07-16 00:26:54.608378] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:08.002 [2024-07-16 00:26:54.715477] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:10.530 [2024-07-16 00:26:56.946878] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:10.530 [2024-07-16 00:26:56.946955] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:10.530 [2024-07-16 00:26:56.946972] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:10.530 [2024-07-16 00:26:56.954896] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: 
Found key "test_dek_aesni_cbc_2" 00:32:10.530 [2024-07-16 00:26:56.954916] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:10.530 [2024-07-16 00:26:56.954933] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:10.530 [2024-07-16 00:26:56.962917] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:10.530 [2024-07-16 00:26:56.962942] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:10.530 [2024-07-16 00:26:56.962953] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:10.530 [2024-07-16 00:26:56.970942] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:10.530 [2024-07-16 00:26:56.970960] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:10.530 [2024-07-16 00:26:56.970972] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:10.530 00:26:57 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:10.530 1+0 records in 00:32:10.530 1+0 records out 00:32:10.530 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000286346 s, 14.3 MB/s 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:10.530 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:32:10.789 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:32:10.789 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:32:10.789 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:32:10.789 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:10.789 00:26:57 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:10.789 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:10.789 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:10.789 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:10.789 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:10.789 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:10.789 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:10.789 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:10.789 1+0 records in 00:32:10.789 1+0 records out 00:32:10.789 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256616 s, 16.0 MB/s 00:32:10.789 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:10.789 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:10.789 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:10.789 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:10.789 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:10.789 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:10.789 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:10.789 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:11.048 1+0 records in 00:32:11.048 1+0 records out 00:32:11.048 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000343694 s, 11.9 MB/s 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:11.048 00:26:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 
count=1 iflag=direct 00:32:11.307 1+0 records in 00:32:11.307 1+0 records out 00:32:11.307 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000321322 s, 12.7 MB/s 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:11.307 00:26:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:11.566 00:26:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:32:11.566 { 00:32:11.566 "nbd_device": "/dev/nbd0", 00:32:11.566 "bdev_name": "crypto_ram" 00:32:11.566 }, 00:32:11.566 { 00:32:11.566 "nbd_device": "/dev/nbd1", 00:32:11.566 "bdev_name": "crypto_ram2" 00:32:11.566 }, 00:32:11.566 { 00:32:11.566 "nbd_device": "/dev/nbd2", 00:32:11.566 "bdev_name": "crypto_ram3" 00:32:11.566 }, 00:32:11.566 { 00:32:11.566 "nbd_device": "/dev/nbd3", 00:32:11.566 "bdev_name": "crypto_ram4" 00:32:11.566 } 00:32:11.566 ]' 00:32:11.566 00:26:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:32:11.566 00:26:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:32:11.566 { 
00:32:11.566 "nbd_device": "/dev/nbd0", 00:32:11.566 "bdev_name": "crypto_ram" 00:32:11.566 }, 00:32:11.566 { 00:32:11.566 "nbd_device": "/dev/nbd1", 00:32:11.566 "bdev_name": "crypto_ram2" 00:32:11.566 }, 00:32:11.566 { 00:32:11.566 "nbd_device": "/dev/nbd2", 00:32:11.566 "bdev_name": "crypto_ram3" 00:32:11.566 }, 00:32:11.566 { 00:32:11.566 "nbd_device": "/dev/nbd3", 00:32:11.566 "bdev_name": "crypto_ram4" 00:32:11.566 } 00:32:11.566 ]' 00:32:11.566 00:26:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:32:11.566 00:26:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:32:11.566 00:26:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:11.566 00:26:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:32:11.566 00:26:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:11.566 00:26:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:11.566 00:26:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:11.566 00:26:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:12.133 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:12.133 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:12.133 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:12.133 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:12.133 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:12.133 00:26:59 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:12.133 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:12.133 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:12.133 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:12.133 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:12.391 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:12.391 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:12.391 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:12.391 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:12.391 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:12.391 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:12.391 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:12.391 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:12.391 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:12.391 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:32:12.649 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:32:12.649 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:32:12.649 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 
00:32:12.649 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:12.649 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:12.649 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:32:12.649 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:12.649 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:12.649 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:12.649 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:32:12.907 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:32:12.907 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:32:12.907 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:32:12.907 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:12.907 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:12.907 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:32:12.907 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:12.907 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:12.907 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:12.907 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:12.907 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_get_disks 00:32:13.166 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:13.166 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:13.166 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:13.166 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:13.166 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:13.166 00:26:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 
-- # local nbd_list 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:13.166 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:32:13.424 /dev/nbd0 00:32:13.424 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:32:13.424 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:32:13.424 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:13.424 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:13.424 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:13.424 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:13.424 
00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:13.424 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:13.424 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:13.424 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:13.424 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:13.424 1+0 records in 00:32:13.424 1+0 records out 00:32:13.424 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000280923 s, 14.6 MB/s 00:32:13.424 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:13.424 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:13.425 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:13.425 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:13.425 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:13.425 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:13.425 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:13.425 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:32:13.683 /dev/nbd1 00:32:13.683 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:32:13.683 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- 
# waitfornbd nbd1 00:32:13.683 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:13.683 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:13.683 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:13.683 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:13.683 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:13.683 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:13.683 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:13.683 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:13.683 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:13.683 1+0 records in 00:32:13.683 1+0 records out 00:32:13.683 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0003149 s, 13.0 MB/s 00:32:13.683 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:13.683 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:13.683 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:13.683 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:13.683 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:13.683 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:13.683 00:27:00 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:13.683 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:32:13.941 /dev/nbd10 00:32:13.941 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:32:13.941 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:32:13.941 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:32:13.941 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:13.941 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:13.941 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:13.941 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:32:13.941 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:13.941 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:13.941 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:13.941 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:13.941 1+0 records in 00:32:13.941 1+0 records out 00:32:13.941 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000375566 s, 10.9 MB/s 00:32:13.941 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:13.941 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:13.941 00:27:00 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:13.941 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:13.941 00:27:00 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:13.941 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:13.941 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:14.200 00:27:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:32:14.200 /dev/nbd11 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 
00:32:14.458 1+0 records in 00:32:14.458 1+0 records out 00:32:14.458 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245709 s, 16.7 MB/s 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:14.458 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:32:14.716 { 00:32:14.716 "nbd_device": "/dev/nbd0", 00:32:14.716 "bdev_name": "crypto_ram" 00:32:14.716 }, 00:32:14.716 { 00:32:14.716 "nbd_device": "/dev/nbd1", 00:32:14.716 "bdev_name": "crypto_ram2" 00:32:14.716 }, 00:32:14.716 { 00:32:14.716 "nbd_device": "/dev/nbd10", 00:32:14.716 "bdev_name": "crypto_ram3" 00:32:14.716 }, 00:32:14.716 { 00:32:14.716 "nbd_device": "/dev/nbd11", 00:32:14.716 "bdev_name": "crypto_ram4" 00:32:14.716 } 00:32:14.716 ]' 00:32:14.716 00:27:01 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:32:14.716 { 00:32:14.716 "nbd_device": "/dev/nbd0", 00:32:14.716 "bdev_name": "crypto_ram" 00:32:14.716 }, 00:32:14.716 { 00:32:14.716 "nbd_device": "/dev/nbd1", 00:32:14.716 "bdev_name": "crypto_ram2" 00:32:14.716 }, 00:32:14.716 { 00:32:14.716 "nbd_device": "/dev/nbd10", 00:32:14.716 "bdev_name": "crypto_ram3" 00:32:14.716 }, 00:32:14.716 { 00:32:14.716 "nbd_device": "/dev/nbd11", 00:32:14.716 "bdev_name": "crypto_ram4" 00:32:14.716 } 00:32:14.716 ]' 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:32:14.716 /dev/nbd1 00:32:14.716 /dev/nbd10 00:32:14.716 /dev/nbd11' 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:32:14.716 /dev/nbd1 00:32:14.716 /dev/nbd10 00:32:14.716 /dev/nbd11' 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:32:14.716 00:27:01 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:32:14.716 256+0 records in 00:32:14.716 256+0 records out 00:32:14.716 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114729 s, 91.4 MB/s 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:32:14.716 256+0 records in 00:32:14.716 256+0 records out 00:32:14.716 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0474131 s, 22.1 MB/s 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:32:14.716 256+0 records in 00:32:14.716 256+0 records out 00:32:14.716 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0662905 s, 15.8 MB/s 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:14.716 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:32:14.975 256+0 records in 00:32:14.975 256+0 records out 00:32:14.975 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.060642 s, 17.3 MB/s 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:32:14.975 256+0 records in 00:32:14.975 256+0 records out 00:32:14.975 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0430702 s, 24.3 MB/s 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:14.975 00:27:01 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:14.975 00:27:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:15.234 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:15.234 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:15.234 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:15.234 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:15.234 00:27:02 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:15.234 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:15.234 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:15.234 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:15.234 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:15.234 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:15.491 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:15.491 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:15.491 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:15.491 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:15.491 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:15.491 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:15.491 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:15.491 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:15.491 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:15.491 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:32:15.749 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:32:15.749 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:32:15.749 00:27:02 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:32:15.749 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:15.749 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:15.749 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:32:15.749 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:15.749 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:15.749 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:15.749 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:32:16.007 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:32:16.007 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:32:16.007 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:32:16.007 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:16.007 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:16.007 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:32:16.007 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:16.007 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:16.007 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:16.007 00:27:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:16.007 00:27:02 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:16.266 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:16.266 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:16.266 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:16.266 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:16.266 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:16.266 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:16.266 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:16.266 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:16.266 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:16.266 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:32:16.266 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:32:16.266 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:32:16.266 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:16.266 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:16.266 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:16.266 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:32:16.266 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:32:16.266 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:32:16.523 malloc_lvol_verify 00:32:16.523 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:32:16.780 893dafbd-5f27-45ff-b432-42eef5889d6d 00:32:16.780 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:32:17.038 addf6604-a181-4e6c-82f6-ed3b9f979fc3 00:32:17.038 00:27:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:32:17.295 /dev/nbd0 00:32:17.295 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:32:17.295 mke2fs 1.46.5 (30-Dec-2021) 00:32:17.295 Discarding device blocks: 0/4096 done 00:32:17.295 Creating filesystem with 4096 1k blocks and 1024 inodes 00:32:17.295 00:32:17.295 Allocating group tables: 0/1 done 00:32:17.295 Writing inode tables: 0/1 done 00:32:17.295 Creating journal (1024 blocks): done 00:32:17.295 Writing superblocks and filesystem accounting information: 0/1 done 00:32:17.295 00:32:17.295 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:32:17.295 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:32:17.295 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:17.295 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:32:17.295 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 
00:32:17.295 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:17.295 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:17.295 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 3675125 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 3675125 ']' 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 3675125 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3675125 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3675125' 00:32:17.554 killing process with pid 3675125 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 3675125 00:32:17.554 00:27:04 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 3675125 00:32:18.121 00:27:04 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:32:18.121 00:32:18.121 real 0m10.617s 00:32:18.121 user 0m14.196s 00:32:18.121 sys 0m4.322s 00:32:18.121 00:27:04 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:18.121 00:27:04 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:18.121 ************************************ 00:32:18.121 END TEST bdev_nbd 00:32:18.121 ************************************ 00:32:18.121 00:27:04 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:18.121 00:27:04 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:32:18.121 00:27:04 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:32:18.121 00:27:04 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:32:18.121 00:27:04 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:32:18.121 00:27:04 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:18.121 00:27:04 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:18.121 00:27:04 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 
00:32:18.121 ************************************ 00:32:18.121 START TEST bdev_fio 00:32:18.121 ************************************ 00:32:18.121 00:27:04 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:32:18.121 00:27:04 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:32:18.121 00:27:04 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:18.121 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:18.121 00:27:04 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:32:18.121 00:27:04 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:32:18.122 00:27:04 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:32:18.122 00:27:04 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:32:18.122 00:27:04 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:32:18.122 00:27:04 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:18.122 00:27:04 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:32:18.122 00:27:04 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:32:18.122 00:27:04 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:32:18.122 00:27:04 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:18.122 00:27:04 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:18.122 00:27:04 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:32:18.122 00:27:04 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:18.122 00:27:04 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:18.122 00:27:04 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:18.122 00:27:04 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:32:18.122 00:27:04 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:32:18.122 00:27:04 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:32:18.122 00:27:04 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:32:18.122 00:27:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:32:18.122 00:27:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:32:18.122 00:27:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:18.122 00:27:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:32:18.122 00:27:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:32:18.122 00:27:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:18.122 00:27:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:32:18.122 00:27:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:32:18.122 00:27:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 
00:32:18.122 00:27:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:32:18.122 00:27:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:32:18.122 00:27:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:18.122 00:27:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:32:18.122 00:27:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:32:18.122 00:27:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:32:18.122 00:27:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:18.122 00:27:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:32:18.122 00:27:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:18.122 00:27:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:18.381 ************************************ 00:32:18.381 START TEST bdev_fio_rw_verify 00:32:18.381 ************************************ 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 
00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:18.381 00:27:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:18.639 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:18.639 
job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:18.639 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:18.639 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:18.639 fio-3.35 00:32:18.639 Starting 4 threads 00:32:33.542 00:32:33.542 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=3677671: Tue Jul 16 00:27:18 2024 00:32:33.542 read: IOPS=21.1k, BW=82.5MiB/s (86.5MB/s)(825MiB/10001msec) 00:32:33.542 slat (usec): min=16, max=499, avg=63.63, stdev=34.60 00:32:33.542 clat (usec): min=16, max=1683, avg=338.39, stdev=214.41 00:32:33.542 lat (usec): min=47, max=1819, avg=402.02, stdev=234.55 00:32:33.542 clat percentiles (usec): 00:32:33.542 | 50.000th=[ 289], 99.000th=[ 1074], 99.900th=[ 1237], 99.990th=[ 1369], 00:32:33.542 | 99.999th=[ 1647] 00:32:33.542 write: IOPS=23.3k, BW=90.9MiB/s (95.3MB/s)(885MiB/9736msec); 0 zone resets 00:32:33.542 slat (usec): min=23, max=274, avg=76.70, stdev=34.05 00:32:33.542 clat (usec): min=34, max=1839, avg=412.10, stdev=252.12 00:32:33.542 lat (usec): min=75, max=1983, avg=488.80, stdev=271.46 00:32:33.542 clat percentiles (usec): 00:32:33.542 | 50.000th=[ 367], 99.000th=[ 1319], 99.900th=[ 1565], 99.990th=[ 1663], 00:32:33.542 | 99.999th=[ 1778] 00:32:33.542 bw ( KiB/s): min=65296, max=123664, per=97.57%, avg=90783.47, stdev=4257.99, samples=76 00:32:33.542 iops : min=16324, max=30916, avg=22695.84, stdev=1064.49, samples=76 00:32:33.542 lat (usec) : 20=0.01%, 50=0.01%, 100=5.88%, 250=28.51%, 500=42.22% 00:32:33.542 lat (usec) : 750=15.80%, 1000=5.01% 00:32:33.542 lat (msec) : 2=2.57% 00:32:33.542 cpu : usr=99.61%, sys=0.00%, ctx=62, majf=0, minf=301 00:32:33.542 IO depths : 1=10.0%, 2=25.6%, 4=51.3%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:33.542 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 
32=0.0%, 64=0.0%, >=64=0.0% 00:32:33.542 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:33.542 issued rwts: total=211185,226463,0,0 short=0,0,0,0 dropped=0,0,0,0 00:32:33.542 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:33.542 00:32:33.542 Run status group 0 (all jobs): 00:32:33.542 READ: bw=82.5MiB/s (86.5MB/s), 82.5MiB/s-82.5MiB/s (86.5MB/s-86.5MB/s), io=825MiB (865MB), run=10001-10001msec 00:32:33.542 WRITE: bw=90.9MiB/s (95.3MB/s), 90.9MiB/s-90.9MiB/s (95.3MB/s-95.3MB/s), io=885MiB (928MB), run=9736-9736msec 00:32:33.542 00:32:33.542 real 0m13.569s 00:32:33.542 user 0m46.370s 00:32:33.542 sys 0m0.524s 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:32:33.542 ************************************ 00:32:33.542 END TEST bdev_fio_rw_verify 00:32:33.542 ************************************ 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:32:33.542 00:27:18 
blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "39c26cf9-44f1-537d-8d9d-6e4e74a49194"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "39c26cf9-44f1-537d-8d9d-6e4e74a49194",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": 
false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "f7f3bc9c-1916-5687-a22f-4ae1d3021a51"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f7f3bc9c-1916-5687-a22f-4ae1d3021a51",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": 
"crypto_ram3",' ' "aliases": [' ' "ff70128b-c2b0-5d2b-8a0a-f3ee55a8d758"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ff70128b-c2b0-5d2b-8a0a-f3ee55a8d758",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "1d478c03-18cf-58c2-8185-605f403e1339"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "1d478c03-18cf-58c2-8185-605f403e1339",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:32:33.542 crypto_ram2 00:32:33.542 crypto_ram3 00:32:33.542 crypto_ram4 ]] 00:32:33.542 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "39c26cf9-44f1-537d-8d9d-6e4e74a49194"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "39c26cf9-44f1-537d-8d9d-6e4e74a49194",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' 
{' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "f7f3bc9c-1916-5687-a22f-4ae1d3021a51"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f7f3bc9c-1916-5687-a22f-4ae1d3021a51",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "ff70128b-c2b0-5d2b-8a0a-f3ee55a8d758"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ff70128b-c2b0-5d2b-8a0a-f3ee55a8d758",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' 
' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "1d478c03-18cf-58c2-8185-605f403e1339"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "1d478c03-18cf-58c2-8185-605f403e1339",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": 
"crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 
--verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:33.543 ************************************ 00:32:33.543 START TEST bdev_fio_trim 00:32:33.543 ************************************ 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local 
plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:33.543 00:27:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:33.543 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:33.543 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:33.543 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:33.543 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:33.543 fio-3.35 00:32:33.543 Starting 4 threads 00:32:45.735 00:32:45.735 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=3679526: Tue Jul 16 00:27:32 2024 00:32:45.735 write: IOPS=36.7k, BW=143MiB/s (150MB/s)(1434MiB/10001msec); 0 zone resets 00:32:45.735 slat (usec): min=11, max=1335, avg=61.82, stdev=32.82 00:32:45.735 clat (usec): min=33, max=1930, avg=274.53, stdev=171.02 00:32:45.735 lat (usec): min=50, max=2094, avg=336.35, stdev=191.57 00:32:45.735 clat percentiles (usec): 00:32:45.735 | 50.000th=[ 231], 99.000th=[ 857], 99.900th=[ 1029], 99.990th=[ 1156], 00:32:45.735 | 99.999th=[ 1598] 00:32:45.735 bw ( KiB/s): min=121272, max=202893, per=100.00%, avg=147041.95, stdev=7274.82, samples=76 00:32:45.735 iops : min=30318, max=50723, avg=36760.47, stdev=1818.70, samples=76 00:32:45.735 trim: IOPS=36.7k, BW=143MiB/s (150MB/s)(1434MiB/10001msec); 0 zone resets 00:32:45.735 slat 
(usec): min=4, max=140, avg=17.89, stdev= 7.81 00:32:45.735 clat (usec): min=5, max=1787, avg=258.56, stdev=120.71 00:32:45.735 lat (usec): min=25, max=1802, avg=276.45, stdev=123.85 00:32:45.735 clat percentiles (usec): 00:32:45.735 | 50.000th=[ 237], 99.000th=[ 603], 99.900th=[ 725], 99.990th=[ 816], 00:32:45.735 | 99.999th=[ 1139] 00:32:45.735 bw ( KiB/s): min=121280, max=202917, per=100.00%, avg=147042.79, stdev=7275.30, samples=76 00:32:45.735 iops : min=30320, max=50729, avg=36760.58, stdev=1818.89, samples=76 00:32:45.735 lat (usec) : 10=0.01%, 50=0.57%, 100=7.68%, 250=46.52%, 500=37.86% 00:32:45.735 lat (usec) : 750=6.32%, 1000=0.97% 00:32:45.735 lat (msec) : 2=0.08% 00:32:45.735 cpu : usr=99.61%, sys=0.00%, ctx=60, majf=0, minf=107 00:32:45.735 IO depths : 1=7.7%, 2=26.4%, 4=52.8%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:45.735 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:45.735 complete : 0=0.0%, 4=88.3%, 8=11.7%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:45.735 issued rwts: total=0,367103,367104,0 short=0,0,0,0 dropped=0,0,0,0 00:32:45.735 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:45.735 00:32:45.735 Run status group 0 (all jobs): 00:32:45.735 WRITE: bw=143MiB/s (150MB/s), 143MiB/s-143MiB/s (150MB/s-150MB/s), io=1434MiB (1504MB), run=10001-10001msec 00:32:45.735 TRIM: bw=143MiB/s (150MB/s), 143MiB/s-143MiB/s (150MB/s-150MB/s), io=1434MiB (1504MB), run=10001-10001msec 00:32:45.735 00:32:45.735 real 0m13.559s 00:32:45.735 user 0m45.940s 00:32:45.735 sys 0m0.535s 00:32:45.735 00:27:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:45.735 00:27:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:32:45.735 ************************************ 00:32:45.735 END TEST bdev_fio_trim 00:32:45.735 ************************************ 00:32:45.735 00:27:32 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@1142 -- # return 0
00:32:45.735 00:27:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f
00:32:45.735 00:27:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
00:32:45.735 00:27:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd
00:32:45.735 /var/jenkins/workspace/crypto-phy-autotest/spdk
00:32:45.735 00:27:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT
00:32:45.735
00:32:45.735 real	0m27.485s
00:32:45.735 user	1m32.498s
00:32:45.735 sys	0m1.252s
00:32:45.735 00:27:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable
00:32:45.735 00:27:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x
00:32:45.735 ************************************
00:32:45.735 END TEST bdev_fio
00:32:45.735 ************************************
00:32:45.735 00:27:32 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:32:45.735 00:27:32 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT
00:32:45.735 00:27:32 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:32:45.735 00:27:32 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:32:45.735 00:27:32 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:32:45.735 00:27:32 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:32:45.735 ************************************
00:32:45.735 START TEST bdev_verify
00:32:45.735 ************************************
00:32:45.735 00:27:32 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:32:45.735 [2024-07-16 00:27:32.599321] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization...
00:32:45.735 [2024-07-16 00:27:32.599383] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3680944 ]
00:32:45.993 [2024-07-16 00:27:32.729524] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:32:45.993 [2024-07-16 00:27:32.832198] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:32:45.993 [2024-07-16 00:27:32.832214] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:32:45.993 [2024-07-16 00:27:32.853834] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:32:45.993 [2024-07-16 00:27:32.861864] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:32:45.993 [2024-07-16 00:27:32.869892] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:32:46.251 [2024-07-16 00:27:32.968802] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:32:48.783 [2024-07-16 00:27:35.182197] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:32:48.783 [2024-07-16 00:27:35.182283] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:32:48.783 [2024-07-16 00:27:35.182304] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:48.783 [2024-07-16 00:27:35.190220] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:32:48.783 [2024-07-16 00:27:35.190250] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:32:48.783 [2024-07-16 00:27:35.190268] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:48.783 [2024-07-16 00:27:35.198244] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:32:48.783 [2024-07-16 00:27:35.198268] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:32:48.783 [2024-07-16 00:27:35.198286] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:48.783 [2024-07-16 00:27:35.206267] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:32:48.783 [2024-07-16 00:27:35.206290] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:32:48.783 [2024-07-16 00:27:35.206308] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:48.783 Running I/O for 5 seconds...
00:32:54.046
00:32:54.046 Latency(us)
00:32:54.046 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:32:54.046 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:32:54.046 	 Verification LBA range: start 0x0 length 0x1000
00:32:54.046 	 crypto_ram : 5.07 479.99 1.87 0.00 0.00 266127.08 5214.39 160477.72
00:32:54.046 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:32:54.046 	 Verification LBA range: start 0x1000 length 0x1000
00:32:54.046 	 crypto_ram : 5.08 385.97 1.51 0.00 0.00 329877.76 2436.23 199685.34
00:32:54.046 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:32:54.046 	 Verification LBA range: start 0x0 length 0x1000
00:32:54.046 	 crypto_ram2 : 5.07 479.89 1.87 0.00 0.00 265405.73 5299.87 146800.64
00:32:54.046 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:32:54.046 	 Verification LBA range: start 0x1000 length 0x1000
00:32:54.046 	 crypto_ram2 : 5.08 388.95 1.52 0.00 0.00 326527.31 3034.60 182361.04
00:32:54.046 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:32:54.046 	 Verification LBA range: start 0x0 length 0x1000
00:32:54.046 	 crypto_ram3 : 5.06 3721.36 14.54 0.00 0.00 34100.41 5328.36 25872.47
00:32:54.046 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:32:54.046 	 Verification LBA range: start 0x1000 length 0x1000
00:32:54.046 	 crypto_ram3 : 5.06 3007.47 11.75 0.00 0.00 42125.99 3960.65 30317.52
00:32:54.046 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:32:54.046 	 Verification LBA range: start 0x0 length 0x1000
00:32:54.046 	 crypto_ram4 : 5.06 3720.55 14.53 0.00 0.00 34027.21 5527.82 25416.57
00:32:54.046 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:32:54.046 	 Verification LBA range: start 0x1000 length 0x1000
00:32:54.046 	 crypto_ram4 : 5.06 3008.03 11.75 0.00 0.00 42013.29 4160.11 30089.57
00:32:54.046 ===================================================================================================================
00:32:54.046 Total : 15192.22 59.34 0.00 0.00 66942.57 2436.23 199685.34
00:32:54.046
00:32:54.046 real	0m8.288s
00:32:54.046 user	0m15.716s
00:32:54.046 sys	0m0.374s
00:32:54.046 00:27:40 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:32:54.046 00:27:40 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:32:54.046 ************************************
00:32:54.046 END TEST bdev_verify
00:32:54.046 ************************************
00:32:54.046 00:27:40 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:32:54.046 00:27:40 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:32:54.046 00:27:40 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:32:54.046 00:27:40 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:32:54.046 00:27:40 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:32:54.046 ************************************
00:32:54.046 START TEST bdev_verify_big_io
00:32:54.046 ************************************
00:32:54.046 00:27:40 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:32:54.046 [2024-07-16 00:27:40.985506] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization...
00:32:54.046 [2024-07-16 00:27:40.985574] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3682008 ]
00:32:54.304 [2024-07-16 00:27:41.120001] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:32:54.304 [2024-07-16 00:27:41.223599] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:32:54.305 [2024-07-16 00:27:41.223604] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:32:54.305 [2024-07-16 00:27:41.245004] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:32:54.305 [2024-07-16 00:27:41.253035] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:32:54.563 [2024-07-16 00:27:41.261062] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:32:54.563 [2024-07-16 00:27:41.365762] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:32:57.091 [2024-07-16 00:27:43.589247] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:32:57.091 [2024-07-16 00:27:43.589346] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:32:57.091 [2024-07-16 00:27:43.589368] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:57.091 [2024-07-16 00:27:43.597267] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:32:57.091 [2024-07-16 00:27:43.597300] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:32:57.091 [2024-07-16 00:27:43.597319] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:57.091 [2024-07-16 00:27:43.605283] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:32:57.091 [2024-07-16 00:27:43.605307] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:32:57.091 [2024-07-16 00:27:43.605325] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:57.091 [2024-07-16 00:27:43.613309] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:32:57.091 [2024-07-16 00:27:43.613332] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:32:57.091 [2024-07-16 00:27:43.613350] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:57.091 Running I/O for 5 seconds...
00:32:58.068 [2024-07-16 00:27:44.628345] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:58.068 [2024-07-16 00:27:44.628952] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:58.068 [2024-07-16 00:27:44.629180] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:58.068 [2024-07-16 00:27:44.629261] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:58.068 [2024-07-16 00:27:44.629325] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:58.068 [2024-07-16 00:27:44.629706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.068 [2024-07-16 00:27:44.631192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.070 [2024-07-16 00:27:44.692145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.070 [2024-07-16 00:27:44.692212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.070 [2024-07-16 00:27:44.692269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.070 [2024-07-16 00:27:44.692323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.070 [2024-07-16 00:27:44.693765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.070 [2024-07-16 00:27:44.693827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.070 [2024-07-16 00:27:44.693879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.070 [2024-07-16 00:27:44.693940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.070 [2024-07-16 00:27:44.694497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.070 [2024-07-16 00:27:44.694555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.070 [2024-07-16 00:27:44.694627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.070 [2024-07-16 00:27:44.694680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.070 [2024-07-16 00:27:44.696737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.070 [2024-07-16 00:27:44.696799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.070 [2024-07-16 00:27:44.696856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.070 [2024-07-16 00:27:44.696908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.070 [2024-07-16 00:27:44.697450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.070 [2024-07-16 00:27:44.697516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.070 [2024-07-16 00:27:44.697592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.697647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.699064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.699126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.699179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.699231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.699750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.699809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.071 [2024-07-16 00:27:44.699870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.699932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.701897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.701972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.702026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.702078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.702604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.702675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.702729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.702786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.704218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.704290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.704343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.071 [2024-07-16 00:27:44.704395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.704969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.705029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.705081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.705140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.707035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.707107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.707159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.707211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.707745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.707803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.707862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.707916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.071 [2024-07-16 00:27:44.709402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.709484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.709537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.709589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.710139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.710197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.710250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.710303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.711910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.713974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.715956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.717862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.718359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.071 [2024-07-16 00:27:44.720406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.722444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.724489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.728156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.729980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.731542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.733524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.735899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.737733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.738242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.739253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.742133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.743923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.071 [2024-07-16 00:27:44.745707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.747513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.748512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.750223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.751998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.753837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.757198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.759001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.759500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.759995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.762376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.764197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.765903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.071 [2024-07-16 00:27:44.767943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.769803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.770861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.772653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.774454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.776287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.778086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.779884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.781679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.785025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.786835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.788633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.789957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.071 [2024-07-16 00:27:44.792242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.794064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.794757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.795252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.798541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.800432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.802343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.804293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.805235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.805854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.807636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.809437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.812659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.071 [2024-07-16 00:27:44.814481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.071 [2024-07-16 00:27:44.815963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.816457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.818697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.820501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.822286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.823605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.826107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.826608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.828528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.830467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.832903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.834541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.072 [2024-07-16 00:27:44.836320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.838106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.841724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.843773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.845807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.847841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.850059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.851849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.853686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.854187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.857571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.859265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.861303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.072 [2024-07-16 00:27:44.863337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.865806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.866310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.867095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.868861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.872087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.873908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.875720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.876972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.878872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.880647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.882437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.884227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.072 [2024-07-16 00:27:44.887562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.888333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.888826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.890876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.893313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.895369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.897010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.898781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.900798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.902664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.904476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.906287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.072 [2024-07-16 00:27:44.908751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.072 [2024-07-16 00:27:44.910800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.353 [2024-07-16 00:27:45.140042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.353 [2024-07-16 00:27:45.140101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.353 [2024-07-16 00:27:45.140168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.353 [2024-07-16 00:27:45.141778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.353 [2024-07-16 00:27:45.141840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.141892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.141951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.142288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.142467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.142525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.142578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.142631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.144112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.144174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.354 [2024-07-16 00:27:45.144226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.144283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.144691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.144870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.144933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.144988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.145042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.146624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.146687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.146739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.146791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.147135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.147315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.354 [2024-07-16 00:27:45.147384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.147437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.147490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.149019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.149082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.149144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.149197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.149608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.149788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.149845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.149900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.149961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.151544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.354 [2024-07-16 00:27:45.151607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.151659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.151712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.152055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.152237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.152300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.152354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.152407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.153951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.154013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.154065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.154122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.154601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.354 [2024-07-16 00:27:45.154779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.154835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.154889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.154950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.156570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.156637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.156689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.156741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.157085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.157266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.157332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.157393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.157447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.354 [2024-07-16 00:27:45.158971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.159041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.159093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.159145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.159585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.159761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.159817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.159870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.159923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.161511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.161579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.161632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.161684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.354 [2024-07-16 00:27:45.162027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.162205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.162261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.162323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.162378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.163906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.163985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.164039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.164096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.164488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.164667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.164724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.164777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.354 [2024-07-16 00:27:45.164829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.166447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.166514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.166566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.166619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.166963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.167139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.167199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.167260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.167315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.168872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.168962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.354 [2024-07-16 00:27:45.169019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.354 [2024-07-16 00:27:45.169072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.169475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.169652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.169709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.169762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.169814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.171429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.171495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.171551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.171604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.171945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.172128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.172184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.355 [2024-07-16 00:27:45.172236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.172295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.173872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.173941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.174004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.174061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.174426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.174610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.174667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.174720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.174772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.176334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.176403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.355 [2024-07-16 00:27:45.176461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.176513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.176848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.177039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.177101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.177153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.177211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.178801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.178864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.178934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.178988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.179333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.179516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.355 [2024-07-16 00:27:45.179573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.179625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.179678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.181257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.181324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.181396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.181448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.181781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.181967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.182025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.182083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.182135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.183757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.355 [2024-07-16 00:27:45.183819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.185857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.186277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.186459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.186518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.186593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.186646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.188190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.190245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.191354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.193147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.193539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.355 [2024-07-16 00:27:45.193714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.355 [2024-07-16 00:27:45.195742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.617 (previous message repeated approximately 270 more times between 00:27:45.195 and 00:27:45.555)
00:32:58.617 [2024-07-16 00:27:45.559078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.617 [2024-07-16 00:27:45.561008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.617 [2024-07-16 00:27:45.563049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.617 [2024-07-16 00:27:45.563124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.617 [2024-07-16 00:27:45.563500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.878 [2024-07-16 00:27:45.565653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.567639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.569524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.571285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.575055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.575125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.575179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.575231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.879 [2024-07-16 00:27:45.575616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.577095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.577161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.577213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.577280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.578709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.578791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.578857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.578911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.579367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.579548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.579605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.579659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.879 [2024-07-16 00:27:45.579712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.581149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.581215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.581270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.581322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.581716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.581900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.581964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.582018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.582070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.583619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.583685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.583740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.879 [2024-07-16 00:27:45.583829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.584395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.584581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.584638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.584691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.584745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.586271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.586338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.586407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.586460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.586851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.587045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.587107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.879 [2024-07-16 00:27:45.587160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.587212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.588755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.588827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.588882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.588942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.589509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.589689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.589746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.589798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.589851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.591364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.591434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.879 [2024-07-16 00:27:45.591487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.591545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.591883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.592069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.592131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.592184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.592236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.593728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.593791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.593844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.593905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.594405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.594587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.879 [2024-07-16 00:27:45.594663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.594718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.594771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.596341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.596404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.596456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.596516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.596851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.597043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.597104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.597158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.597216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.598630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.879 [2024-07-16 00:27:45.598693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.598745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.598797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.599239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.599422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.599480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.599547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.599612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.601173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.601252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.879 [2024-07-16 00:27:45.601304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.601356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.601749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.880 [2024-07-16 00:27:45.601941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.602002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.602055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.602107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.603500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.603569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.603621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.603674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.604065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.604247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.604305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.604359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.604411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.880 [2024-07-16 00:27:45.605988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.606052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.606104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.606156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.606553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.606731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.606792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.606853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.606906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.608432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.608501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.608553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.608605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.880 [2024-07-16 00:27:45.609023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.609204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.609260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.609315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.609369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.611023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.611095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.611148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.611201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.611580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.611759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.611816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.611868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.880 [2024-07-16 00:27:45.611921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.613410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.613473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.613527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.613587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.613924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.614110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.614167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.614233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.614288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.615977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.616040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.616092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.880 [2024-07-16 00:27:45.616145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.616510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.616687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.616754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.616807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.616859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.618310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.618373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.618426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.618478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.618855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.619046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.880 [2024-07-16 00:27:45.619103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.880 [2024-07-16 00:27:45.619169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... same error line repeated continuously from 00:27:45.619169 through 00:27:45.853048; duplicate occurrences omitted ...]
00:32:59.143 [2024-07-16 00:27:45.853048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.143 [2024-07-16 00:27:45.855091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.855595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.856096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.856442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.858599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.860648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.862252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.864224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.866165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.866997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.868768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.870570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.870912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.143 [2024-07-16 00:27:45.872135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.873921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.875717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.877749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.881275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.883069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.885080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.886180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.886558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.888469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.890491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.891250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.891750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.143 [2024-07-16 00:27:45.895317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.896834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.898630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.900425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.900767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.901388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.901891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.903891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.905837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.909289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.911331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.912883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.913400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.143 [2024-07-16 00:27:45.913885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.915796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.917612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.919657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.920735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.923095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.923602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.925192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.926960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.927332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.929513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.930904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.932680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.143 [2024-07-16 00:27:45.934459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.937946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.939914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.941903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.943838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.944187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.946079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.947868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.949910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.950413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.954037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.955596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.957649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.143 [2024-07-16 00:27:45.959697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.960048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.961982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.962484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.963136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.964910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.968240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.970049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.972075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.973190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.973674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.974887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.976657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.143 [2024-07-16 00:27:45.978447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.143 [2024-07-16 00:27:45.980472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:45.984082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:45.984783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:45.985287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:45.986967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:45.987346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:45.989279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:45.991325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:45.992828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:45.994599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:45.996643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:45.998666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.144 [2024-07-16 00:27:46.000676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.002702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.003049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.005112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.007153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.009194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.011020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.014707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.016754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.018063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.019967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.020310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.022466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.144 [2024-07-16 00:27:46.023862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.024361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.025374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.027986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.029770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.031571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.033571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.034031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.034651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.036348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.038143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.039988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.043325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.144 [2024-07-16 00:27:46.045376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.045877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.046380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.046726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.048824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.050857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.052398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.054324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.056279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.056788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.057290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.057787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.058242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.144 [2024-07-16 00:27:46.059596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.061392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.062245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.064216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.068005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.070024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.071980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.072041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.072428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.074311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.076084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.076716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.077218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.144 [2024-07-16 00:27:46.079480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.079561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.079615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.079682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.080158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.080781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.080850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.080904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.080977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.083023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.083087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.083141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.083194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.144 [2024-07-16 00:27:46.083756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.083948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.084018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.084074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.084139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.085836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.085900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.085960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.086013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.086545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.086748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.086805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.086871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.144 [2024-07-16 00:27:46.086934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.088727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.088811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.144 [2024-07-16 00:27:46.088870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.145 [2024-07-16 00:27:46.088944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.145 [2024-07-16 00:27:46.089370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.145 [2024-07-16 00:27:46.089556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.145 [2024-07-16 00:27:46.089614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.145 [2024-07-16 00:27:46.089668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.145 [2024-07-16 00:27:46.089720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.145 [2024-07-16 00:27:46.091673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.145 [2024-07-16 00:27:46.091743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.145 [2024-07-16 00:27:46.091796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.145 [2024-07-16 00:27:46.091850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.092403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.092599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.092675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.092751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.092816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.094526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.094592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.094645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.094701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.095204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.095393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.095449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.405 [2024-07-16 00:27:46.095517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.095573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.097258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.097326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.097384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.097448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.097962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.098151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.098209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.098264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.098319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.100253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.100319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.405 [2024-07-16 00:27:46.100372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.100424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.100834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.101028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.101086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.101139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.101191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.103412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.103477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.103535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.103588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.104041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.104224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.405 [2024-07-16 00:27:46.104285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.104338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.104407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.106270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.106335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.106387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.106440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.106932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.107121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.107178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.107230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.107283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.108971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.405 [2024-07-16 00:27:46.109036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.109092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.109144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.109483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.109666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.109724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.109777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.109832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.111731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.111809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.111874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.111935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.112287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.405 [2024-07-16 00:27:46.112477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.112535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.405 [2024-07-16 00:27:46.112589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.112641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.114378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.114454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.114508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.114574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.115032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.115212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.115273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.115327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.115386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.406 [2024-07-16 00:27:46.117090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.117155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.117208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.117260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.117721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.117905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.117972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.118026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.118079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.119949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.120013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.120078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.120133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.406 [2024-07-16 00:27:46.120606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.120788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.120857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.120911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.120972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.122808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.122873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.122932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.122985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.123434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.123615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.123685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.123739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.406 [2024-07-16 00:27:46.123804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.125379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.125443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.125496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.125549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.125932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.126114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.126177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.126230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.126282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.127979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.128042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.128100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.406 [2024-07-16 00:27:46.128155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.128651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.128833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.128901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.128989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.129057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.130969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.131031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.131083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.131141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.131611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.131798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.131854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.406 [2024-07-16 00:27:46.131906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.131980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.133740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.133807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.133861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.133919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.134264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.134446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.134503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.134556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.134611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.136284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.136347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.406 [2024-07-16 00:27:46.136399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.136452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.136789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.136981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.137045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.137099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.137153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.138944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.139007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.139069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.139123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.139601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.139779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.406 [2024-07-16 00:27:46.139835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.139892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.139952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.141873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.141942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.141996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.142048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.142525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.142702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.142766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.142818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.142870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.406 [2024-07-16 00:27:46.144484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.407 [2024-07-16 00:27:46.144547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.144611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.144664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.145008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.145188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.145244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.145299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.145352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.147364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.147430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.147498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.147553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.147912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.407 [2024-07-16 00:27:46.148101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.148166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.148220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.148274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.150007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.150069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.150133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.150190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.150528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.150711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.150769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.150834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.150889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.407 [2024-07-16 00:27:46.152371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.152435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.152487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.152539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.152878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.153069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.153139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.153193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.153246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.154870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.154940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.155002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.155068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.407 [2024-07-16 00:27:46.155469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.155646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.155702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.155765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.155818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.217598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.217681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.219429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.228992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.230794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.232822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.234857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.407 [2024-07-16 00:27:46.238553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.671 [2024-07-16 00:27:46.564222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.671 [2024-07-16 00:27:46.564280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.671 [2024-07-16 00:27:46.564308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.671 [2024-07-16 00:27:46.564639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.671 [2024-07-16 00:27:46.564825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.671 [2024-07-16 00:27:46.565577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.671 [2024-07-16 00:27:46.565639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.671 [2024-07-16 00:27:46.567657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.671 [2024-07-16 00:27:46.569509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.671 [2024-07-16 00:27:46.571319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.671 [2024-07-16 00:27:46.571379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.671 [2024-07-16 00:27:46.571453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.671 [2024-07-16 00:27:46.571757] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.571946] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.573060] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.573127] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.574557] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.576365] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.576438] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.576499] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.576561] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.576945] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.577135] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.577202] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.577276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.577349] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.579214] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.579286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.579347] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.579409] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.579786] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.579984] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.580050] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.580118] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.580183] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.581716] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.581796] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.581859] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.581934] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.582499] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.582689] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.582755] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.582817] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.582880] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.584562] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.584634] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.584715] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.584776] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.585125] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.585320] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.585386] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.585475] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.585538] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.587146] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.587218] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.587288] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.587349] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.587727] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.587921] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.587993] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.588067] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.588130] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.589682] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.589753] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.589816] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.589878] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.590447] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.590636] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.590704] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.671 [2024-07-16 00:27:46.590769] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.590830] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.592350] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.592423] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.592484] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.592545] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.592885] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.593086] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.593152] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.593213] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.593273] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.594870] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.594963] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.595030] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.595092] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.595442] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.595629] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.595694] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.595762] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.595841] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.597420] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.597492] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.597553] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.597613] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.598047] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.598236] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.598307] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.598371] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.598431] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.600222] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.600300] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.600362] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.600424] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.600766] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.600962] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.601027] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.601087] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.601148] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.602660] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.602732] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.602793] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.602856] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.603208] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.603396] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.603460] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.603520] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.603581] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.605302] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.672 [2024-07-16 00:27:46.605381] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:59.931 [2024-07-16 00:27:46.770365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.771054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.772820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.782915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.784967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.787008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.800359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.800870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.801367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.811610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.813571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.815501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.827583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.829623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.830888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.840572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.841197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.841690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.851406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.853477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.854932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.865805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.867178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.868571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.873949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.874452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.874949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.880568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.881087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:59.931 [2024-07-16 00:27:46.881608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.191 [2024-07-16 00:27:46.887154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.191 [2024-07-16 00:27:46.887194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.191 [2024-07-16 00:27:46.887685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.191 [2024-07-16 00:27:46.888186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.191 [2024-07-16 00:27:46.890668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.191 [2024-07-16 00:27:46.891179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.191 [2024-07-16 00:27:46.891675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.191 [2024-07-16 00:27:46.892176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.191 [2024-07-16 00:27:46.893341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.191 [2024-07-16 00:27:46.893848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.191 [2024-07-16 00:27:46.894349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.191 [2024-07-16 00:27:46.894840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.191 [2024-07-16 00:27:46.897286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.191 [2024-07-16 00:27:46.897803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.898308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.898800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.899841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.900349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.900844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.901345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.903630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.904140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.904636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.905137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.906269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.906777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.907278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.907771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.909912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.910429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.910936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.911426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.912442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.912950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.913446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.913962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.916141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.916644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.917144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.917637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.918813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.919332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.920876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.921528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.925096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.926223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.927737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.929269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.930353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.930854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.932728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.934645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.936641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.937400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.938855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.940110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.942496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.943343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.944732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.945741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.949333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.951385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.952555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.954249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.955708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.955774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.957539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.958532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.960482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.962505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.962572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.964604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.965123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.192 [2024-07-16 00:27:46.966807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.968703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.968770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.970553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.971848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.971910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.973672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.976142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.978066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.978129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.979981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.983402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.983479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.192 [2024-07-16 00:27:46.984582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.985704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.987944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.988022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.990059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.991924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.993475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.995217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.997234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.997295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.997944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:46.998920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:47.000738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.192 [2024-07-16 00:27:47.000804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:47.003988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:47.005882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:47.005955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:47.007993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:47.010511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:47.011476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:47.011536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:47.012574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:47.016096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:47.016166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:47.017893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.192 [2024-07-16 00:27:47.019645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.193 [2024-07-16 00:27:47.022193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.022263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.024128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.024840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.026406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.028199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.030226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.030289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.030839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.032615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.034369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.034427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.037871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.193 [2024-07-16 00:27:47.038943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.039004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.040779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.043340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.044484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.044544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.046331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.048640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.048708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.050305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.051503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.053798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.053866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.193 [2024-07-16 00:27:47.055878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.057043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.058579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.060152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.060785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.060842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.061341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.062763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.064549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.064607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.067903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.067976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.069760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.193 [2024-07-16 00:27:47.069824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.072199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.073053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.073112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.074264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.077783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.079235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.079296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.081059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.081088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.081512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.083657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.083724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.193 [2024-07-16 00:27:47.085691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.085775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.085802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.086296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.089130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.089198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.091056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.091121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.091427] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:00.193 [2024-07-16 00:27:47.091626] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:00.193 [2024-07-16 00:27:47.093499] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:00.193 [2024-07-16 00:27:47.093574] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:00.193 [2024-07-16 00:27:47.093644] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:00.193 [2024-07-16 00:27:47.095135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.095199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.095251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.095304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.095764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.095963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.096021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.096074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.096127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.097696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.097766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.097835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.097889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.193 [2024-07-16 00:27:47.098236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.098425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.098482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.098534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.098590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.100246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.100314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.100375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.100433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.100771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.100969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.101029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.101082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.193 [2024-07-16 00:27:47.101135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.102728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.102798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.102853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.102910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.103257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.103447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.103504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.193 [2024-07-16 00:27:47.103556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.103609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.105178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.105242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.105295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.194 [2024-07-16 00:27:47.105356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.105694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.105882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.105960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.106016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.106070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.107655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.107718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.107777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.107830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.108185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.108378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.108434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.194 [2024-07-16 00:27:47.108487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.108540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.110076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.110140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.110192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.110248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.110587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.110776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.110840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.110898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.110959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.112490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.112559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.194 [2024-07-16 00:27:47.112616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.112668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.113061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.113251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.113309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.113362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.113415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.114972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.115037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.115095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.115151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.115487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.115676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.194 [2024-07-16 00:27:47.115732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.115786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.115844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.117382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.117452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.117504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.117557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.117900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.118108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.118168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.118221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.118274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.194 [2024-07-16 00:27:47.119690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.194 [2024-07-16 00:27:47.119754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:00.457 [... the *ERROR* line above repeats continuously, with timestamps from 00:27:47.119807 through 00:27:47.337145 ...]
00:33:00.457 [2024-07-16 00:27:47.340169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.457 [2024-07-16 00:27:47.340913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.457 [2024-07-16 00:27:47.341413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.457 [2024-07-16 00:27:47.341911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.457 [2024-07-16 00:27:47.342261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.457 [2024-07-16 00:27:47.343624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.457 [2024-07-16 00:27:47.344574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.457 [2024-07-16 00:27:47.345076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.457 [2024-07-16 00:27:47.345570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.457 [2024-07-16 00:27:47.347787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.457 [2024-07-16 00:27:47.348303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.457 [2024-07-16 00:27:47.350077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.457 [2024-07-16 00:27:47.350687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.457 [2024-07-16 00:27:47.351041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.457 [2024-07-16 00:27:47.351654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.457 [2024-07-16 00:27:47.352168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.457 [2024-07-16 00:27:47.353705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.457 [2024-07-16 00:27:47.355882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.457 [2024-07-16 00:27:47.356635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.358113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.359693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.360241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.360435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.360952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.361755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.363182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.458 [2024-07-16 00:27:47.365304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.367215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.368369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.369377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.369793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.370421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.372490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.373346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.374717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.377262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.379252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.379749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.380251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.458 [2024-07-16 00:27:47.380669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.381902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.383853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.384364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.384862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.387143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.387647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.388192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.389872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.390283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.391346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.391843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.392982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.458 [2024-07-16 00:27:47.394118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.397304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.399178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.401174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.403070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.458 [2024-07-16 00:27:47.403424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.405096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.407125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.408528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.409338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.412948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.414574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.415933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.718 [2024-07-16 00:27:47.417169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.417523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.419222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.419828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.420333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.420983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.424221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.426129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.428089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.429237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.429649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.431582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.432719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.718 [2024-07-16 00:27:47.434157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.435761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.439499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.440024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.441622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.443168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.443670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.445841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.447899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.449645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.451422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.453909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.455718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.718 [2024-07-16 00:27:47.457769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.457831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.458379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.458563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.460335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.462355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.462414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.464443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.464512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.466277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.466336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.466718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.468869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.718 [2024-07-16 00:27:47.469935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.469994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.471765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.474417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.476378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.476446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.477232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.477648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.479592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.479665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.481694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.482737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.485410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.718 [2024-07-16 00:27:47.485479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.486516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.488518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.489079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.489264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.491042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.492817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.492876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.494409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.496210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.498246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.498306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.498722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.718 [2024-07-16 00:27:47.499958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.501923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.501998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.502779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.505309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.507104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.507165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.508944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.718 [2024-07-16 00:27:47.509292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.510685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.510752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.511674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.513739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.719 [2024-07-16 00:27:47.517337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.517411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.518454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.520231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.520614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.520800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.522826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.524083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.524141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.525951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.527742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.529551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.529610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.719 [2024-07-16 00:27:47.529959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.531162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.532959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.533020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.534798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.537920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.538447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.538506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.540278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.540655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.542828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.542894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.719 [2024-07-16 00:27:47.543974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.719 [2024-07-16 00:27:47.545743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.690969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.980 [2024-07-16 00:27:47.691023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.691867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.692268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.692448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.692505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.694316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.694373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.698675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.698753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.700520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.700586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.701133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.701313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.980 [2024-07-16 00:27:47.702653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.702713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.702767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.708107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.709884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.709949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.710009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.710351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.712032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.712099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.712160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.713001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.717257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.980 [2024-07-16 00:27:47.717327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.718636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.718693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.719115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.719297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.719354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.721158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.721221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.728327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.728396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.728449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.729982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.730364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.980 [2024-07-16 00:27:47.730544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.732485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.732552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.732605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.737456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.737521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.737591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.739440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.739883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.741176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.741241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.742269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.743448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.980 [2024-07-16 00:27:47.747492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.749374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.751321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.753339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.753735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.755652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.760729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.762165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.763568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.765396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.765854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.766039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.767022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.980 [2024-07-16 00:27:47.768266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.770034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.777092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.778568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.779324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.980 [2024-07-16 00:27:47.779819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.780299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.782391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.783478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.784626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.785139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.790910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.791950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.981 [2024-07-16 00:27:47.793152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.794964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.795479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.796106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.796754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.798345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.799747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.805472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.805981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.806485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.807804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.808322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.810469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.981 [2024-07-16 00:27:47.810980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.811479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.812477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.818067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.819206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.820303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.820796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.821248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.821875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.822387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.822882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.823397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.826469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.981 [2024-07-16 00:27:47.826981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.827491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.827992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.828462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.829084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.829590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.830096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.830593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.834101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.834620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.835137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.835635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.836132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.981 [2024-07-16 00:27:47.836750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.837273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.837774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.838285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.841371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.841879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.842384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.842884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.843310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.843935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.844441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.844947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.845447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.981 [2024-07-16 00:27:47.848177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.848686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.849189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.849684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.850148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.850768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.851280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.853131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.855177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.858613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.859775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.860854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.862622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.981 [2024-07-16 00:27:47.863051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.865098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.866279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.868059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.869861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.875728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.877543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.878056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.878553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.878999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.881145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.882125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.883969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.981 [2024-07-16 00:27:47.888803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.890274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.891843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.892342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.892827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.893018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.895069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.895742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.897267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.900349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.901069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.902849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:00.981 [2024-07-16 00:27:47.904310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:00.981 [2024-07-16 00:27:47.904657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[identical *ERROR* lines from accel_dpdk_cryptodev.c:468 repeated continuously, timestamps 00:27:47.906400 through 00:27:48.240544]
00:33:01.502 [2024-07-16 00:27:48.240595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.240648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.243464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.243529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.243582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.243653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.244183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.244805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.244888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.244959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.245032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.248118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.248184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:01.502 [2024-07-16 00:27:48.248239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.248293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.248714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.248901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.248966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.249019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.249088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.251852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.251938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.251994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.252047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.252488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.252677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:01.502 [2024-07-16 00:27:48.252747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.252803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.252857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.255845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.255910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.255971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.256030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.256554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.256750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.256807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.256874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.256937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.259669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:01.502 [2024-07-16 00:27:48.259735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.259788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.259844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.260285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.260473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.260529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.260582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.260653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.263512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.263579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.263632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.263701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.264238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:01.502 [2024-07-16 00:27:48.264420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.502 [2024-07-16 00:27:48.264477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.264531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.264590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.267452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.267522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.267599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.267664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.268215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.268397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.268455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.268508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.268561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:01.503 [2024-07-16 00:27:48.271283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.271359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.271415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.271913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.272368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.272553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.272610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.272663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.272728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.275556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.276072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.276131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.276624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:01.503 [2024-07-16 00:27:48.277060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.277683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.277750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.277805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.278322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.281175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.281246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.281743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.281799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.282252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.282436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.282508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.283110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:01.503 [2024-07-16 00:27:48.283172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.287036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.289089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.289149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.289209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.289632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.289815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.291270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.291332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.291390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.296798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.296868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.296921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:01.503 [2024-07-16 00:27:48.298253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.298677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.299303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.299369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.299423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.299914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.303917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.303989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.304936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.304996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.305374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.305556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.305622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:01.503 [2024-07-16 00:27:48.307414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.307471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.310027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.310541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.310603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.310657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.311205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.311392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.313441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.313516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.313583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.319360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.319431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:01.503 [2024-07-16 00:27:48.319485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.321313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.321772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.322740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.322805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.322861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.323360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.326697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.326762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.328768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.328830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.329178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.329368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:01.503 [2024-07-16 00:27:48.329427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.331217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.331275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.335055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.336569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.336632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.336687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.337037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.337220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.338805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.503 [2024-07-16 00:27:48.338864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.504 [2024-07-16 00:27:48.338918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.504 [2024-07-16 00:27:48.344367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:01.504 [2024-07-16 00:27:48.344438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.504 [2024-07-16 00:27:48.346211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.504 [2024-07-16 00:27:48.346268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.504 [2024-07-16 00:27:48.346642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.504 [2024-07-16 00:27:48.348802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.504 [2024-07-16 00:27:48.348869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.504 [2024-07-16 00:27:48.348924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.504 [2024-07-16 00:27:48.349426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.504 [2024-07-16 00:27:48.356544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.504 [2024-07-16 00:27:48.356614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.504 [2024-07-16 00:27:48.356666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.504 [2024-07-16 00:27:48.358460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:01.504 [2024-07-16 00:27:48.358805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:01.504 [2024-07-16 00:27:48.358992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.359050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.359545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.359599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.365232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.365298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.365354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.367124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.367514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.367693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.369728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.369787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.369840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.373739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.375307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.377348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.379397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.379743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.381628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.381693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.382184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.383313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.389476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.391520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.392302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.392791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.393143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.395212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.402487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.403727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.404226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.405627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.406036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.406102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.407882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.409891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.410978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.416880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.418665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.420497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.422535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.422965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.424855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.426661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.428705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.429212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.436289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.438126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.440001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.442042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.442577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.443201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.444991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.446853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.504 [2024-07-16 00:27:48.448934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.455011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.456269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.458104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.459913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.460268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.461689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.463483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.464615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.466639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.472516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.473744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.475530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.477331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.477678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.478311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.478812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.480747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.482654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.489504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.490007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.490769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.492522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.492939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.495108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.496164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.497940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.499716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.505564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.507602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.508851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.510629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.511025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.513210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.513717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.514215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.516258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.523293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.525238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.525733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.526453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.526862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.528783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.530827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.531896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.533655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.539444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.541263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.543303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.544384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.544795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.546724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.548752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.549358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.549845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.556888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.558686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.560705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.561204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.561743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.563861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.565778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.567810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.569355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.575241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.577036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.578823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.580858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.581287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.583467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.763 [2024-07-16 00:27:48.585448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.587474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.593625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.594897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.596895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.598826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.599191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.599376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.599873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.600382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.601019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.605713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.607520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.608803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.609304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.609695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.611585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.613380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.615410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.616495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.621612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.622884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.624314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.625101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.625551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.627006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.628810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.630825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.631540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.637026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.639076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.640138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.641570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.641918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.643370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.643870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.644799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.646588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.652165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.654098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.655953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.657084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.657516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.659326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.660311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.660808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.662425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.667478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.669411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.669914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.671774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.672279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.672898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.673409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.673907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.674406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.677600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.678112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.678612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.678672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.679130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.679753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.680265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.680765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.681268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.684582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.684654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.685168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.685228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.685745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.686377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.686880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.687382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.687875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.691387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.691890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.691957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.692450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.692991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.693183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.693680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.694192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.694259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.697627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.697698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.698202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.698692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.699182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.699800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.700314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.700378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.700867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.703854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.704367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.704863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.704956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.705447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.706081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.706156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.764 [2024-07-16 00:27:48.706648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.765 [2024-07-16 00:27:48.707166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.765 [2024-07-16 00:27:48.710455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.765 [2024-07-16 00:27:48.710997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.765 [2024-07-16 00:27:48.711085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.765 [2024-07-16 00:27:48.711588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.765 [2024-07-16 00:27:48.712111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:01.765 [2024-07-16 00:27:48.712305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.713932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.715971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.716039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.721325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.721400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.723394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.724594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.724993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.726913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.727704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.727764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.728267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.731749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.732701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.734724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.734784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.735279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.737387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.737461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.024 [2024-07-16 00:27:48.739123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.739636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.744724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.745566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.745628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.746128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.746585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.746770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.747934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.749474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.749536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.753562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.753633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.754133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.754630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.755030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.755725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.757795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.757866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.758372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.761517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.762847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.762910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:02.025 [2024-07-16 00:27:48.764142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.764641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.765273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.765341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.766951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.767947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.770504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.771958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.773747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.773806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.774205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.774397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.774896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.025 [2024-07-16 00:27:48.776656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.776719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.781370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.781880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.782384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.782446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.782791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.784938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.786991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.787052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.788576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.792497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.792567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.025 [2024-07-16 00:27:48.792619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.792672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.793029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.794934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.795001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.797011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.798893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.802197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.802262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.802328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.802382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.802865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.803055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.025 [2024-07-16 00:27:48.804608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.804668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.804735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.808122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.808188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.808244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.808298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.808732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.808916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.810940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.811000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.811053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.814613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.025 [2024-07-16 00:27:48.814681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.814733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.814786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.815141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.815325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.815381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.815442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.815495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.818829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.818896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.818967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.819022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.819512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.025 [2024-07-16 00:27:48.819694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.819752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.819806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.819859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.821350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.821414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.821472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.821525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.821970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.822161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.822217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.822270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.025 [2024-07-16 00:27:48.822323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.026 [2024-07-16 00:27:48.825858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.825924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.825986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.826039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.826384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.826569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.826625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.826685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.826738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.828239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.828304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.828356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.828413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.026 [2024-07-16 00:27:48.828798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.828996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.829054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.829115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.829168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.831294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.831359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.831411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.831464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.831852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.832051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.832117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.832175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.026 [2024-07-16 00:27:48.832228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.833761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.833824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.833876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.833937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.834320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.834505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.834575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.834628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.834680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.837402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.837467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.837520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.026 [2024-07-16 00:27:48.837580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.837922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.838113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.838170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.838222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.838275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.839877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.839950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.840003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.840068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.840412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.840599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.840673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.026 [2024-07-16 00:27:48.840739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.840793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.842490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.842561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.842614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.842673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.843024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.843215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.843272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.843324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.843377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.845127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.845191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.026 [2024-07-16 00:27:48.845253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.845307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.845667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.846307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.846374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.846428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.846486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.848106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.848170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.848222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.848274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.848741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.848934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.026 [2024-07-16 00:27:48.848993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.849045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.849098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.850690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.850755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.850807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.850879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.851395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.851582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.851639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.851693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.851749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.026 [2024-07-16 00:27:48.853290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.026 [2024-07-16 00:27:48.853353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.288 [previous *ERROR* message repeated for timestamps 2024-07-16 00:27:48.853406 through 2024-07-16 00:27:49.078549] 
00:33:02.288 [2024-07-16 00:27:49.080604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.288 [2024-07-16 00:27:49.081892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.083885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.085265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.086106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.087967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.088496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.089124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.089847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.091345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.092568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.096381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.098307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.099511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.288 [2024-07-16 00:27:49.100847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.101259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.101877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.102387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.104268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.105078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.109018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.109616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.111243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.111740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.112282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.114453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.116115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.288 [2024-07-16 00:27:49.117161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.118515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.121969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.122528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.124569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.125085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.125566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.127510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.129438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.131100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.132881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.136474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.138340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.288 [2024-07-16 00:27:49.140016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.140076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.140442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.142495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.143005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.143515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.145297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.148570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.148640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.150541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.150597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.150948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.151569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.288 [2024-07-16 00:27:49.152082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.154095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.156115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.159282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.161128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.161202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.163047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.163551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.164194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.165971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.167761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.169800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.173260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:02.288 [2024-07-16 00:27:49.173330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.174590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.175087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.175509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.175696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.177606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.179454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:02.288 [2024-07-16 00:27:49.179512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:03.221 
00:33:03.221 Latency(us)
00:33:03.221 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:33:03.221 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:03.221 Verification LBA range: start 0x0 length 0x100
00:33:03.221 crypto_ram : 5.84 43.80 2.74 0.00 0.00 2837943.87 74312.13 2553054.61
00:33:03.221 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:03.221 Verification LBA range: start 0x100 length 0x100
00:33:03.221 crypto_ram : 6.07 39.05 2.44 0.00 0.00 3108438.24 124005.51 3238732.13
00:33:03.221 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:03.221 Verification LBA range: start 0x0 length 0x100
00:33:03.221 crypto_ram2 : 5.85 43.79 2.74 0.00 0.00 2729211.10 73856.22 2553054.61
00:33:03.221 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:03.221 Verification LBA range: start 0x100 length 0x100
00:33:03.221 crypto_ram2 : 6.09 41.41 2.59 0.00 0.00 2864669.47 116711.07 3238732.13
00:33:03.221 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:03.221 Verification LBA range: start 0x0 length 0x100
00:33:03.221 crypto_ram3 : 5.61 278.66 17.42 0.00 0.00 407431.37 4644.51 561672.01
00:33:03.221 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:03.221 Verification LBA range: start 0x100 length 0x100
00:33:03.221 crypto_ram3 : 5.74 221.81 13.86 0.00 0.00 504132.69 17666.23 616380.33
00:33:03.221 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:03.221 Verification LBA range: start 0x0 length 0x100
00:33:03.221 crypto_ram4 : 5.71 295.06 18.44 0.00 0.00 372518.11 2194.03 532494.25
00:33:03.221 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:03.221 Verification LBA range: start 0x100 length 0x100
00:33:03.221 crypto_ram4 : 5.87 238.70 14.92 0.00 0.00 453897.01 31685.23 576260.90
00:33:03.221 ===================================================================================================================
00:33:03.221 Total : 1202.29 75.14 0.00 0.00 783605.43 2194.03 3238732.13
00:33:03.479 
00:33:03.479 real 0m9.342s
00:33:03.479 user 0m17.680s
00:33:03.479 sys 0m0.479s
00:33:03.479 00:27:50 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:03.479 00:27:50 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:33:03.479 ************************************
00:33:03.479 END TEST bdev_verify_big_io
00:33:03.479 ************************************
00:33:03.479 00:27:50 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:33:03.479 00:27:50 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:27:50 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:27:50 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:27:50 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:33:03.479 ************************************
00:33:03.479 START TEST bdev_write_zeroes
00:33:03.479 ************************************
00:33:03.479 00:27:50 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:33:03.479 [2024-07-16 00:27:50.421495] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization...
00:33:03.479 [2024-07-16 00:27:50.421575] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3683238 ]
00:33:03.736 [2024-07-16 00:27:50.567558] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:33:03.736 [2024-07-16 00:27:50.669879] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:33:03.995 [2024-07-16 00:27:50.691175] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:33:03.995 [2024-07-16 00:27:50.699217] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:33:03.995 [2024-07-16 00:27:50.707222] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:33:03.995 [2024-07-16 00:27:50.811772] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:33:06.521 [2024-07-16 00:27:53.048069] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:33:06.521 [2024-07-16 00:27:53.048133] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:33:06.521 [2024-07-16 00:27:53.048157] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:06.521 [2024-07-16 00:27:53.056092] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:33:06.521 [2024-07-16 00:27:53.056123] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:33:06.521 [2024-07-16 00:27:53.056142] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:06.521 [2024-07-16 00:27:53.064109] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:33:06.521 [2024-07-16 00:27:53.064133] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:33:06.521 [2024-07-16 00:27:53.064151] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:06.521 [2024-07-16 00:27:53.072128] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:33:06.521 [2024-07-16 00:27:53.072159] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:33:06.521 [2024-07-16 00:27:53.072177] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:06.521 Running I/O for 1 seconds...
00:33:07.455 
00:33:07.455 Latency(us)
00:33:07.455 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:33:07.455 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:07.455 crypto_ram : 1.03 1956.39 7.64 0.00 0.00 64865.83 5442.34 77959.35
00:33:07.455 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:07.455 crypto_ram2 : 1.03 1962.13 7.66 0.00 0.00 64326.65 5413.84 72488.51
00:33:07.455 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:07.455 crypto_ram3 : 1.02 15043.69 58.76 0.00 0.00 8374.41 2478.97 10827.69
00:33:07.455 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:07.455 crypto_ram4 : 1.02 15080.95 58.91 0.00 0.00 8328.09 2478.97 8719.14
00:33:07.455 ===================================================================================================================
00:33:07.455 Total : 34043.16 132.98 0.00 0.00 14853.14 2478.97 77959.35
00:33:07.713 
00:33:07.713 real 0m4.262s
00:33:07.713 user 0m3.817s
00:33:07.713 sys 0m0.395s
00:33:07.713 00:27:54 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:07.713 00:27:54 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:33:07.713 ************************************
00:33:07.713 END TEST bdev_write_zeroes
00:33:07.713 ************************************
00:33:07.713 00:27:54 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:33:07.975 00:27:54 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:27:54 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:27:54 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:27:54 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:33:07.975 ************************************
00:33:07.975 START TEST bdev_json_nonenclosed
00:33:07.975 ************************************
00:33:07.975 00:27:54 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:33:07.975 [2024-07-16 00:27:54.771674] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization...
00:33:07.975 [2024-07-16 00:27:54.771734] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3683780 ]
00:33:07.975 [2024-07-16 00:27:54.901733] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:33:08.233 [2024-07-16 00:27:55.003263] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:33:08.233 [2024-07-16 00:27:55.003340] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:33:08.233 [2024-07-16 00:27:55.003362] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:33:08.233 [2024-07-16 00:27:55.003376] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:33:08.233 
00:33:08.233 real 0m0.396s
00:33:08.233 user 0m0.229s
00:33:08.233 sys 0m0.164s
00:33:08.233 00:27:55 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:33:08.233 00:27:55 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:08.233 00:27:55 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:33:08.233 ************************************
00:33:08.233 END TEST bdev_json_nonenclosed
00:33:08.233 ************************************
00:33:08.233 00:27:55 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234
00:33:08.233 00:27:55 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # true
00:33:08.233 00:27:55 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:27:55 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:27:55 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:27:55 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:33:08.491 ************************************
00:33:08.491 START TEST bdev_json_nonarray
00:33:08.491 ************************************
00:33:08.491 00:27:55 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:33:08.491 [2024-07-16 00:27:55.255949] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization...
00:33:08.491 [2024-07-16 00:27:55.256010] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3683811 ]
00:33:08.491 [2024-07-16 00:27:55.384442] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:33:08.749 [2024-07-16 00:27:55.483703] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:33:08.749 [2024-07-16 00:27:55.483776] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:33:08.749 [2024-07-16 00:27:55.483798] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:33:08.749 [2024-07-16 00:27:55.483810] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:33:08.749 
00:33:08.749 real 0m0.388s
00:33:08.749 user 0m0.246s
00:33:08.749 sys 0m0.139s
00:33:08.749 00:27:55 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234
00:33:08.749 00:27:55 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:08.749 00:27:55 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:33:08.749 ************************************
00:33:08.749 END TEST bdev_json_nonarray
00:33:08.749 ************************************
00:33:08.749 00:27:55 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234
00:33:08.749 00:27:55 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # true
00:33:08.749 00:27:55 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]]
00:33:08.749 00:27:55 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]]
00:33:08.749 00:27:55 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]]
00:33:08.749 00:27:55 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT
00:33:08.749 00:27:55 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup
00:33:08.749 00:27:55 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile
00:33:08.750 00:27:55 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:33:08.750 00:27:55 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]]
00:33:08.750 00:27:55 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]]
00:33:08.750 00:27:55 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]]
00:27:55 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]]
00:33:08.750 
00:33:08.750 real 1m13.460s
00:33:08.750 user 2m43.137s
00:33:08.750 sys 0m9.424s
00:33:08.750 00:27:55 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:08.750 00:27:55 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:33:08.750 ************************************
00:33:08.750 END TEST blockdev_crypto_aesni
00:33:08.750 ************************************
00:33:08.750 00:27:55 -- common/autotest_common.sh@1142 -- # return 0
00:33:08.750 00:27:55 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw
00:27:55 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']'
00:27:55 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:27:55 -- common/autotest_common.sh@10 -- # set +x
00:33:09.008 ************************************
00:33:09.008 START TEST blockdev_crypto_sw
00:33:09.008 ************************************
00:33:09.008 00:27:55 blockdev_crypto_sw -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw
00:33:09.008 * Looking for test storage...
00:33:09.008 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev
00:33:09.008 00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh
00:27:55 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # :
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']'
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device=
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek=
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx=
00:33:09.008 00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc=
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']'
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]]
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]]
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=3684035
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 3684035
00:27:55 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc
00:27:55 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 3684035 ']'
00:27:55 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:27:55 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100
00:27:55 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:33:09.008 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:33:09.008 00:27:55 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable
00:33:09.008 00:27:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:33:09.008 [2024-07-16 00:27:55.904859] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization...
00:33:09.008 [2024-07-16 00:27:55.904917] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3684035 ]
00:33:09.267 [2024-07-16 00:27:56.017404] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:33:09.267 [2024-07-16 00:27:56.119945] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:33:10.201 00:27:56 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:33:10.201 00:27:56 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0
00:33:10.201 00:27:56 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in
00:33:10.201 00:27:56 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf
00:33:10.201 00:27:56 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd
00:33:10.201 00:27:56 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable
00:33:10.201 00:27:56 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:33:10.201 Malloc0
00:33:10.201 Malloc1
00:33:10.201 true
00:33:10.201 true
00:33:10.201 true
00:33:10.201 [2024-07-16 00:27:57.116848] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw"
00:33:10.201 crypto_ram
00:33:10.201 [2024-07-16 00:27:57.124881] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2"
00:33:10.201 crypto_ram2
00:33:10.201 [2024-07-16 00:27:57.132904] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3"
00:33:10.201 crypto_ram3
00:33:10.201 [
00:33:10.201 {
00:33:10.201 "name": "Malloc1",
00:33:10.201 "aliases": [
00:33:10.201 "7ab9a03f-5193-431c-b454-174e94569ede"
00:33:10.201 ],
00:33:10.201 "product_name": "Malloc disk",
00:33:10.201 "block_size": 4096,
00:33:10.201 "num_blocks": 4096,
00:33:10.201 "uuid": "7ab9a03f-5193-431c-b454-174e94569ede",
00:33:10.201 "assigned_rate_limits": { 00:33:10.201 "rw_ios_per_sec": 0, 00:33:10.201 "rw_mbytes_per_sec": 0, 00:33:10.201 "r_mbytes_per_sec": 0, 00:33:10.201 "w_mbytes_per_sec": 0 00:33:10.201 }, 00:33:10.201 "claimed": true, 00:33:10.201 "claim_type": "exclusive_write", 00:33:10.201 "zoned": false, 00:33:10.201 "supported_io_types": { 00:33:10.201 "read": true, 00:33:10.201 "write": true, 00:33:10.201 "unmap": true, 00:33:10.201 "flush": true, 00:33:10.201 "reset": true, 00:33:10.459 "nvme_admin": false, 00:33:10.459 "nvme_io": false, 00:33:10.459 "nvme_io_md": false, 00:33:10.459 "write_zeroes": true, 00:33:10.459 "zcopy": true, 00:33:10.459 "get_zone_info": false, 00:33:10.459 "zone_management": false, 00:33:10.459 "zone_append": false, 00:33:10.459 "compare": false, 00:33:10.459 "compare_and_write": false, 00:33:10.459 "abort": true, 00:33:10.459 "seek_hole": false, 00:33:10.459 "seek_data": false, 00:33:10.459 "copy": true, 00:33:10.459 "nvme_iov_md": false 00:33:10.459 }, 00:33:10.459 "memory_domains": [ 00:33:10.459 { 00:33:10.459 "dma_device_id": "system", 00:33:10.459 "dma_device_type": 1 00:33:10.459 }, 00:33:10.459 { 00:33:10.459 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:10.459 "dma_device_type": 2 00:33:10.459 } 00:33:10.459 ], 00:33:10.459 "driver_specific": {} 00:33:10.459 } 00:33:10.459 ] 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:10.459 00:27:57 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:10.459 00:27:57 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:33:10.459 00:27:57 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:33:10.459 00:27:57 
blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:10.459 00:27:57 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:10.459 00:27:57 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:10.459 00:27:57 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:33:10.459 00:27:57 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:10.459 00:27:57 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:10.459 00:27:57 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:33:10.459 00:27:57 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:33:10.459 00:27:57 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "63f4c5e1-646f-581d-bc7d-718f87fccc57"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' 
"uuid": "63f4c5e1-646f-581d-bc7d-718f87fccc57",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "938e767e-f13e-5ed8-9745-ff14711f31e5"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "938e767e-f13e-5ed8-9745-ff14711f31e5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' 
"dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:33:10.459 00:27:57 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:33:10.459 00:27:57 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:33:10.459 00:27:57 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:33:10.459 00:27:57 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 3684035 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 3684035 ']' 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 3684035 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:10.459 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3684035 00:33:10.718 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:10.718 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:10.718 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3684035' 00:33:10.718 killing process with pid 3684035 00:33:10.718 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 3684035 00:33:10.718 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 3684035 00:33:11.032 00:27:57 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:11.032 
00:27:57 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:11.032 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:33:11.032 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:11.032 00:27:57 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:11.032 ************************************ 00:33:11.032 START TEST bdev_hello_world 00:33:11.032 ************************************ 00:33:11.032 00:27:57 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:11.032 [2024-07-16 00:27:57.907737] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:33:11.032 [2024-07-16 00:27:57.907797] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3684241 ] 00:33:11.291 [2024-07-16 00:27:58.039057] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:11.291 [2024-07-16 00:27:58.135445] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:11.550 [2024-07-16 00:27:58.318866] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:11.550 [2024-07-16 00:27:58.318944] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:11.550 [2024-07-16 00:27:58.318961] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:11.550 [2024-07-16 00:27:58.326885] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:11.550 [2024-07-16 00:27:58.326904] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:11.550 [2024-07-16 00:27:58.326916] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:11.550 [2024-07-16 00:27:58.334905] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:11.550 [2024-07-16 00:27:58.334923] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:11.550 [2024-07-16 00:27:58.334941] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:11.550 [2024-07-16 00:27:58.375351] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:33:11.550 [2024-07-16 00:27:58.375387] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:33:11.550 [2024-07-16 00:27:58.375406] hello_bdev.c: 
244:hello_start: *NOTICE*: Opening io channel 00:33:11.550 [2024-07-16 00:27:58.377389] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:33:11.550 [2024-07-16 00:27:58.377457] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:33:11.550 [2024-07-16 00:27:58.377472] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:33:11.550 [2024-07-16 00:27:58.377506] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:33:11.550 00:33:11.550 [2024-07-16 00:27:58.377524] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:33:11.809 00:33:11.809 real 0m0.732s 00:33:11.809 user 0m0.481s 00:33:11.809 sys 0m0.225s 00:33:11.809 00:27:58 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:11.809 00:27:58 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:33:11.809 ************************************ 00:33:11.809 END TEST bdev_hello_world 00:33:11.809 ************************************ 00:33:11.809 00:27:58 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:11.809 00:27:58 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:33:11.809 00:27:58 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:11.809 00:27:58 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:11.809 00:27:58 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:11.809 ************************************ 00:33:11.809 START TEST bdev_bounds 00:33:11.809 ************************************ 00:33:11.809 00:27:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:33:11.809 00:27:58 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=3684428 00:33:11.809 00:27:58 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT 
SIGTERM EXIT 00:33:11.809 00:27:58 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:11.809 00:27:58 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 3684428' 00:33:11.809 Process bdevio pid: 3684428 00:33:11.809 00:27:58 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 3684428 00:33:11.809 00:27:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 3684428 ']' 00:33:11.809 00:27:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:11.809 00:27:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:11.809 00:27:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:11.809 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:11.809 00:27:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:11.809 00:27:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:11.809 [2024-07-16 00:27:58.723264] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:33:11.809 [2024-07-16 00:27:58.723331] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3684428 ] 00:33:12.068 [2024-07-16 00:27:58.851907] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:33:12.068 [2024-07-16 00:27:58.961221] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:12.068 [2024-07-16 00:27:58.961322] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:33:12.068 [2024-07-16 00:27:58.961325] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:12.327 [2024-07-16 00:27:59.134934] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:12.327 [2024-07-16 00:27:59.135004] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:12.327 [2024-07-16 00:27:59.135019] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:12.327 [2024-07-16 00:27:59.142952] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:12.327 [2024-07-16 00:27:59.142971] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:12.327 [2024-07-16 00:27:59.142983] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:12.327 [2024-07-16 00:27:59.150972] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:12.327 [2024-07-16 00:27:59.150989] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:12.327 [2024-07-16 00:27:59.151001] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:12.894 00:27:59 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 
-- # (( i == 0 )) 00:33:12.894 00:27:59 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:33:12.894 00:27:59 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:33:12.894 I/O targets: 00:33:12.894 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:33:12.894 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:33:12.894 00:33:12.894 00:33:12.894 CUnit - A unit testing framework for C - Version 2.1-3 00:33:12.894 http://cunit.sourceforge.net/ 00:33:12.894 00:33:12.894 00:33:12.894 Suite: bdevio tests on: crypto_ram3 00:33:12.894 Test: blockdev write read block ...passed 00:33:12.894 Test: blockdev write zeroes read block ...passed 00:33:12.894 Test: blockdev write zeroes read no split ...passed 00:33:12.894 Test: blockdev write zeroes read split ...passed 00:33:12.894 Test: blockdev write zeroes read split partial ...passed 00:33:12.894 Test: blockdev reset ...passed 00:33:12.894 Test: blockdev write read 8 blocks ...passed 00:33:12.894 Test: blockdev write read size > 128k ...passed 00:33:12.894 Test: blockdev write read invalid size ...passed 00:33:12.894 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:12.894 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:12.894 Test: blockdev write read max offset ...passed 00:33:12.894 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:12.894 Test: blockdev writev readv 8 blocks ...passed 00:33:12.894 Test: blockdev writev readv 30 x 1block ...passed 00:33:12.894 Test: blockdev writev readv block ...passed 00:33:12.894 Test: blockdev writev readv size > 128k ...passed 00:33:12.894 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:12.894 Test: blockdev comparev and writev ...passed 00:33:12.894 Test: blockdev nvme passthru rw ...passed 00:33:12.894 Test: blockdev nvme passthru vendor specific 
...passed 00:33:12.894 Test: blockdev nvme admin passthru ...passed 00:33:12.894 Test: blockdev copy ...passed 00:33:12.894 Suite: bdevio tests on: crypto_ram 00:33:12.894 Test: blockdev write read block ...passed 00:33:12.894 Test: blockdev write zeroes read block ...passed 00:33:12.894 Test: blockdev write zeroes read no split ...passed 00:33:12.894 Test: blockdev write zeroes read split ...passed 00:33:12.894 Test: blockdev write zeroes read split partial ...passed 00:33:12.894 Test: blockdev reset ...passed 00:33:12.894 Test: blockdev write read 8 blocks ...passed 00:33:12.894 Test: blockdev write read size > 128k ...passed 00:33:12.894 Test: blockdev write read invalid size ...passed 00:33:12.894 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:12.894 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:12.894 Test: blockdev write read max offset ...passed 00:33:12.894 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:12.894 Test: blockdev writev readv 8 blocks ...passed 00:33:12.894 Test: blockdev writev readv 30 x 1block ...passed 00:33:12.894 Test: blockdev writev readv block ...passed 00:33:12.894 Test: blockdev writev readv size > 128k ...passed 00:33:12.894 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:12.894 Test: blockdev comparev and writev ...passed 00:33:12.894 Test: blockdev nvme passthru rw ...passed 00:33:12.894 Test: blockdev nvme passthru vendor specific ...passed 00:33:12.894 Test: blockdev nvme admin passthru ...passed 00:33:12.894 Test: blockdev copy ...passed 00:33:12.894 00:33:12.894 Run Summary: Type Total Ran Passed Failed Inactive 00:33:12.894 suites 2 2 n/a 0 0 00:33:12.894 tests 46 46 46 0 0 00:33:12.894 asserts 260 260 260 0 n/a 00:33:12.894 00:33:12.894 Elapsed time = 0.197 seconds 00:33:12.894 0 00:33:12.894 00:27:59 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 3684428 00:33:12.894 00:27:59 
blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 3684428 ']' 00:33:12.894 00:27:59 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 3684428 00:33:12.894 00:27:59 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:33:12.894 00:27:59 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:12.894 00:27:59 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3684428 00:33:13.153 00:27:59 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:13.153 00:27:59 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:13.153 00:27:59 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3684428' 00:33:13.153 killing process with pid 3684428 00:33:13.153 00:27:59 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 3684428 00:33:13.153 00:27:59 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 -- # wait 3684428 00:33:13.153 00:28:00 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:33:13.153 00:33:13.153 real 0m1.425s 00:33:13.153 user 0m3.528s 00:33:13.153 sys 0m0.412s 00:33:13.153 00:28:00 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:13.153 00:28:00 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:13.153 ************************************ 00:33:13.153 END TEST bdev_bounds 00:33:13.153 ************************************ 00:33:13.412 00:28:00 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:13.412 00:28:00 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:33:13.412 00:28:00 
blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:33:13.412 00:28:00 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:13.412 00:28:00 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:13.412 ************************************ 00:33:13.412 START TEST bdev_nbd 00:33:13.412 ************************************ 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- 
bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=3684636 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 3684636 /var/tmp/spdk-nbd.sock 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 3684636 ']' 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:33:13.412 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:13.412 00:28:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:13.412 [2024-07-16 00:28:00.234768] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:33:13.412 [2024-07-16 00:28:00.234834] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:13.671 [2024-07-16 00:28:00.363004] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:13.671 [2024-07-16 00:28:00.464898] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:13.929 [2024-07-16 00:28:00.637315] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:13.929 [2024-07-16 00:28:00.637380] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:13.929 [2024-07-16 00:28:00.637395] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:13.929 [2024-07-16 00:28:00.645333] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:13.929 [2024-07-16 00:28:00.645352] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:13.929 [2024-07-16 00:28:00.645364] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:13.929 [2024-07-16 00:28:00.653354] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:13.929 [2024-07-16 00:28:00.653372] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:13.929 [2024-07-16 00:28:00.653384] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:14.497 1+0 records in 00:33:14.497 1+0 records out 00:33:14.497 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000282468 s, 14.5 MB/s 00:33:14.497 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:14.756 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:14.756 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:14.756 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:14.756 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:14.756 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:14.756 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:33:14.756 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
nbd_device=/dev/nbd1 00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:15.015 1+0 records in 00:33:15.015 1+0 records out 00:33:15.015 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00031109 s, 13.2 MB/s 00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 
00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:33:15.015 00:28:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:15.582 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:33:15.582 { 00:33:15.582 "nbd_device": "/dev/nbd0", 00:33:15.582 "bdev_name": "crypto_ram" 00:33:15.582 }, 00:33:15.582 { 00:33:15.582 "nbd_device": "/dev/nbd1", 00:33:15.582 "bdev_name": "crypto_ram3" 00:33:15.582 } 00:33:15.582 ]' 00:33:15.582 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:33:15.582 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:33:15.582 { 00:33:15.582 "nbd_device": "/dev/nbd0", 00:33:15.582 "bdev_name": "crypto_ram" 00:33:15.582 }, 00:33:15.582 { 00:33:15.582 "nbd_device": "/dev/nbd1", 00:33:15.582 "bdev_name": "crypto_ram3" 00:33:15.582 } 00:33:15.582 ]' 00:33:15.582 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:33:15.582 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:33:15.582 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:15.582 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:15.582 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:15.582 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:15.582 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:15.582 00:28:02 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:15.840 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:15.840 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:15.840 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:15.840 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:15.840 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:15.840 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:15.840 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:15.840 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:15.840 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:15.840 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:16.098 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:16.098 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:16.098 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:16.098 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:16.098 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:16.098 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:16.098 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:16.098 00:28:02 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@45 -- # return 0 00:33:16.098 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:16.098 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:16.098 00:28:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:16.356 00:28:03 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:16.356 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:33:16.614 /dev/nbd0 00:33:16.614 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:33:16.614 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:33:16.614 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:16.614 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:16.614 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 
00:33:16.614 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:16.614 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:16.614 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:16.614 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:16.614 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:16.614 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:16.614 1+0 records in 00:33:16.614 1+0 records out 00:33:16.614 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000267702 s, 15.3 MB/s 00:33:16.614 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:16.614 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:16.614 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:16.614 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:16.614 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:16.614 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:16.614 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:16.614 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:33:16.873 /dev/nbd1 00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 
00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:16.873 1+0 records in 00:33:16.873 1+0 records out 00:33:16.873 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000340539 s, 12.0 MB/s 00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:16.873 00:28:03 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:16.873 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:17.131 00:28:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:33:17.131 { 00:33:17.131 "nbd_device": "/dev/nbd0", 00:33:17.131 "bdev_name": "crypto_ram" 00:33:17.131 }, 00:33:17.131 { 00:33:17.131 "nbd_device": "/dev/nbd1", 00:33:17.131 "bdev_name": "crypto_ram3" 00:33:17.131 } 00:33:17.131 ]' 00:33:17.131 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:33:17.131 { 00:33:17.131 "nbd_device": "/dev/nbd0", 00:33:17.131 "bdev_name": "crypto_ram" 00:33:17.131 }, 00:33:17.131 { 00:33:17.131 "nbd_device": "/dev/nbd1", 00:33:17.131 "bdev_name": "crypto_ram3" 00:33:17.131 } 00:33:17.131 ]' 00:33:17.131 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:17.131 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:33:17.131 /dev/nbd1' 00:33:17.131 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:33:17.131 /dev/nbd1' 00:33:17.131 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:17.131 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:33:17.131 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:33:17.131 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:33:17.131 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:33:17.131 00:28:04 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:33:17.131 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:17.131 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:17.131 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:33:17.131 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:17.131 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:33:17.131 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:33:17.131 256+0 records in 00:33:17.131 256+0 records out 00:33:17.131 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0100821 s, 104 MB/s 00:33:17.131 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:17.131 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:33:17.390 256+0 records in 00:33:17.390 256+0 records out 00:33:17.390 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0303918 s, 34.5 MB/s 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:33:17.390 256+0 records in 00:33:17.390 256+0 records out 00:33:17.390 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0446786 s, 23.5 MB/s 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify 
'/dev/nbd0 /dev/nbd1' verify 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:17.390 
00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:17.390 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:17.648 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:17.648 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:17.648 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:17.648 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:17.648 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:17.648 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:17.648 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:17.648 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:17.648 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:17.648 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:17.907 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:17.907 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:17.907 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:17.907 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:17.907 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:17.907 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:17.907 00:28:04 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:17.907 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:17.907 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:17.907 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:17.907 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:18.165 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:18.165 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:18.165 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:18.165 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:18.165 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:18.165 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:18.165 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:18.165 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:18.165 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:18.165 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:33:18.165 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:33:18.165 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:33:18.165 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:33:18.165 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:18.165 00:28:04 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:18.165 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:33:18.165 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:33:18.165 00:28:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:33:18.424 malloc_lvol_verify 00:33:18.424 00:28:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:33:18.990 acef9150-5d47-4fa8-8b6c-4fa4264a0b8e 00:33:18.990 00:28:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:33:19.248 9e051875-6707-440b-8bef-441c0e359a91 00:33:19.248 00:28:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:33:19.815 /dev/nbd0 00:33:19.815 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:33:19.815 mke2fs 1.46.5 (30-Dec-2021) 00:33:19.815 Discarding device blocks: 0/4096 done 00:33:19.815 Creating filesystem with 4096 1k blocks and 1024 inodes 00:33:19.815 00:33:19.815 Allocating group tables: 0/1 done 00:33:19.815 Writing inode tables: 0/1 done 00:33:19.815 Creating journal (1024 blocks): done 00:33:19.815 Writing superblocks and filesystem accounting information: 0/1 done 00:33:19.815 00:33:19.815 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:33:19.815 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks 
/var/tmp/spdk-nbd.sock /dev/nbd0 00:33:19.815 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:19.815 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:33:19.815 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:19.815 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:19.815 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:19.815 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:19.815 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:19.815 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:19.815 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:19.815 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:19.815 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:19.815 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:20.074 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:20.074 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:20.074 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:33:20.074 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:33:20.074 00:28:06 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 3684636 00:33:20.074 00:28:06 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 3684636 ']' 00:33:20.074 00:28:06 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill 
-0 3684636 00:33:20.074 00:28:06 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:33:20.074 00:28:06 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:20.074 00:28:06 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3684636 00:33:20.074 00:28:06 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:20.074 00:28:06 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:20.074 00:28:06 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3684636' 00:33:20.074 killing process with pid 3684636 00:33:20.074 00:28:06 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 3684636 00:33:20.074 00:28:06 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 3684636 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:33:20.334 00:33:20.334 real 0m6.869s 00:33:20.334 user 0m10.016s 00:33:20.334 sys 0m2.734s 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:20.334 ************************************ 00:33:20.334 END TEST bdev_nbd 00:33:20.334 ************************************ 00:33:20.334 00:28:07 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:20.334 00:28:07 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:33:20.334 00:28:07 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:33:20.334 00:28:07 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:33:20.334 00:28:07 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:33:20.334 00:28:07 blockdev_crypto_sw -- 
common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:20.334 00:28:07 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:20.334 00:28:07 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:20.334 ************************************ 00:33:20.334 START TEST bdev_fio 00:33:20.334 ************************************ 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:20.334 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- 
common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 
00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:20.334 ************************************ 00:33:20.334 START TEST bdev_fio_rw_verify 00:33:20.334 ************************************ 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:20.334 00:28:07 
blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:20.334 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:20.604 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:20.604 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:20.604 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:20.604 00:28:07 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:20.862 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:20.862 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:20.862 fio-3.35 00:33:20.862 Starting 2 threads 00:33:33.076 00:33:33.076 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=3685768: Tue Jul 16 00:28:18 2024 00:33:33.076 read: IOPS=21.7k, BW=84.7MiB/s (88.8MB/s)(847MiB/10001msec) 00:33:33.076 slat (nsec): min=14339, max=72833, avg=20156.26, stdev=3584.58 00:33:33.076 clat (usec): min=7, max=1823, avg=147.05, stdev=59.46 00:33:33.076 lat (usec): min=25, max=1843, avg=167.20, stdev=60.87 
00:33:33.076 clat percentiles (usec): 00:33:33.076 | 50.000th=[ 145], 99.000th=[ 281], 99.900th=[ 302], 99.990th=[ 359], 00:33:33.076 | 99.999th=[ 1795] 00:33:33.077 write: IOPS=26.0k, BW=102MiB/s (107MB/s)(964MiB/9481msec); 0 zone resets 00:33:33.077 slat (usec): min=14, max=271, avg=34.05, stdev= 4.41 00:33:33.077 clat (usec): min=25, max=894, avg=197.79, stdev=90.26 00:33:33.077 lat (usec): min=51, max=1008, avg=231.83, stdev=91.85 00:33:33.077 clat percentiles (usec): 00:33:33.077 | 50.000th=[ 192], 99.000th=[ 392], 99.900th=[ 412], 99.990th=[ 611], 00:33:33.077 | 99.999th=[ 848] 00:33:33.077 bw ( KiB/s): min=92256, max=105248, per=94.88%, avg=98771.79, stdev=1674.98, samples=38 00:33:33.077 iops : min=23064, max=26312, avg=24692.95, stdev=418.74, samples=38 00:33:33.077 lat (usec) : 10=0.01%, 20=0.01%, 50=4.26%, 100=14.84%, 250=62.76% 00:33:33.077 lat (usec) : 500=18.11%, 750=0.01%, 1000=0.01% 00:33:33.077 lat (msec) : 2=0.01% 00:33:33.077 cpu : usr=99.60%, sys=0.00%, ctx=19, majf=0, minf=457 00:33:33.077 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:33.077 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:33.077 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:33.077 issued rwts: total=216808,246739,0,0 short=0,0,0,0 dropped=0,0,0,0 00:33:33.077 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:33.077 00:33:33.077 Run status group 0 (all jobs): 00:33:33.077 READ: bw=84.7MiB/s (88.8MB/s), 84.7MiB/s-84.7MiB/s (88.8MB/s-88.8MB/s), io=847MiB (888MB), run=10001-10001msec 00:33:33.077 WRITE: bw=102MiB/s (107MB/s), 102MiB/s-102MiB/s (107MB/s-107MB/s), io=964MiB (1011MB), run=9481-9481msec 00:33:33.077 00:33:33.077 real 0m11.161s 00:33:33.077 user 0m23.765s 00:33:33.077 sys 0m0.360s 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:33.077 00:28:18 
blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:33:33.077 ************************************ 00:33:33.077 END TEST bdev_fio_rw_verify 00:33:33.077 ************************************ 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:33.077 
00:28:18 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "63f4c5e1-646f-581d-bc7d-718f87fccc57"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "63f4c5e1-646f-581d-bc7d-718f87fccc57",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "938e767e-f13e-5ed8-9745-ff14711f31e5"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": 
"938e767e-f13e-5ed8-9745-ff14711f31e5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:33:33.077 crypto_ram3 ]] 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "63f4c5e1-646f-581d-bc7d-718f87fccc57"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "63f4c5e1-646f-581d-bc7d-718f87fccc57",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": 
true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "938e767e-f13e-5ed8-9745-ff14711f31e5"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "938e767e-f13e-5ed8-9745-ff14711f31e5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' 
' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:33.077 ************************************ 00:33:33.077 START TEST bdev_fio_trim 00:33:33.077 ************************************ 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:33.077 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:33.078 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep 
libasan 00:33:33.078 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:33.078 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:33.078 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:33.078 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:33.078 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:33.078 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:33.078 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:33.078 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:33.078 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:33.078 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:33.078 00:28:18 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:33.078 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:33.078 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 
4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:33.078 fio-3.35 00:33:33.078 Starting 2 threads 00:33:43.088 00:33:43.088 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=3687349: Tue Jul 16 00:28:29 2024 00:33:43.088 write: IOPS=38.7k, BW=151MiB/s (159MB/s)(1513MiB/10001msec); 0 zone resets 00:33:43.088 slat (usec): min=14, max=482, avg=22.65, stdev= 4.37 00:33:43.088 clat (usec): min=37, max=2050, avg=169.75, stdev=93.45 00:33:43.088 lat (usec): min=52, max=2077, avg=192.40, stdev=96.79 00:33:43.088 clat percentiles (usec): 00:33:43.088 | 50.000th=[ 137], 99.000th=[ 351], 99.900th=[ 371], 99.990th=[ 570], 00:33:43.088 | 99.999th=[ 1975] 00:33:43.088 bw ( KiB/s): min=151112, max=156008, per=100.00%, avg=155061.89, stdev=528.35, samples=38 00:33:43.089 iops : min=37778, max=39002, avg=38765.58, stdev=132.11, samples=38 00:33:43.089 trim: IOPS=38.7k, BW=151MiB/s (159MB/s)(1513MiB/10001msec); 0 zone resets 00:33:43.089 slat (usec): min=5, max=1681, avg=10.26, stdev= 3.52 00:33:43.089 clat (usec): min=48, max=1842, avg=113.28, stdev=34.30 00:33:43.089 lat (usec): min=57, max=1853, avg=123.54, stdev=34.55 00:33:43.089 clat percentiles (usec): 00:33:43.089 | 50.000th=[ 115], 99.000th=[ 186], 99.900th=[ 198], 99.990th=[ 343], 00:33:43.089 | 99.999th=[ 603] 00:33:43.089 bw ( KiB/s): min=151136, max=156008, per=100.00%, avg=155063.16, stdev=526.47, samples=38 00:33:43.089 iops : min=37784, max=39002, avg=38765.89, stdev=131.65, samples=38 00:33:43.089 lat (usec) : 50=3.34%, 100=31.83%, 250=50.64%, 500=14.19%, 750=0.01% 00:33:43.089 lat (usec) : 1000=0.01% 00:33:43.089 lat (msec) : 2=0.01%, 4=0.01% 00:33:43.089 cpu : usr=99.61%, sys=0.00%, ctx=21, majf=0, minf=335 00:33:43.089 IO depths : 1=7.4%, 2=17.4%, 4=60.1%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:43.089 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:43.089 complete : 0=0.0%, 4=86.9%, 8=13.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:43.089 issued rwts: total=0,387439,387440,0 
short=0,0,0,0 dropped=0,0,0,0 00:33:43.089 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:43.089 00:33:43.089 Run status group 0 (all jobs): 00:33:43.089 WRITE: bw=151MiB/s (159MB/s), 151MiB/s-151MiB/s (159MB/s-159MB/s), io=1513MiB (1587MB), run=10001-10001msec 00:33:43.089 TRIM: bw=151MiB/s (159MB/s), 151MiB/s-151MiB/s (159MB/s-159MB/s), io=1513MiB (1587MB), run=10001-10001msec 00:33:43.089 00:33:43.089 real 0m11.106s 00:33:43.089 user 0m23.606s 00:33:43.089 sys 0m0.351s 00:33:43.089 00:28:29 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:43.089 00:28:29 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:33:43.089 ************************************ 00:33:43.089 END TEST bdev_fio_trim 00:33:43.089 ************************************ 00:33:43.089 00:28:29 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:33:43.089 00:28:29 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:33:43.089 00:28:29 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:43.089 00:28:29 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:33:43.089 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:43.089 00:28:29 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:33:43.089 00:33:43.089 real 0m22.630s 00:33:43.089 user 0m47.566s 00:33:43.089 sys 0m0.903s 00:33:43.089 00:28:29 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:43.089 00:28:29 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:43.089 ************************************ 00:33:43.089 END TEST bdev_fio 00:33:43.089 ************************************ 00:33:43.089 00:28:29 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:43.089 00:28:29 blockdev_crypto_sw -- 
bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:43.089 00:28:29 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:43.089 00:28:29 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:33:43.089 00:28:29 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:43.089 00:28:29 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:43.089 ************************************ 00:33:43.089 START TEST bdev_verify 00:33:43.089 ************************************ 00:33:43.089 00:28:29 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:43.089 [2024-07-16 00:28:29.884132] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:33:43.089 [2024-07-16 00:28:29.884196] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3688678 ] 00:33:43.089 [2024-07-16 00:28:30.013568] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:43.348 [2024-07-16 00:28:30.119613] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:43.348 [2024-07-16 00:28:30.119618] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:43.348 [2024-07-16 00:28:30.297272] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:43.348 [2024-07-16 00:28:30.297346] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:43.348 [2024-07-16 00:28:30.297361] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:43.607 [2024-07-16 00:28:30.305293] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:43.607 [2024-07-16 00:28:30.305313] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:43.607 [2024-07-16 00:28:30.305325] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:43.607 [2024-07-16 00:28:30.313317] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:43.607 [2024-07-16 00:28:30.313337] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:43.607 [2024-07-16 00:28:30.313349] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:43.607 Running I/O for 5 seconds... 
00:33:48.881 
00:33:48.881 Latency(us)
00:33:48.881 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:33:48.881 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:33:48.881 Verification LBA range: start 0x0 length 0x800
00:33:48.881 crypto_ram : 5.01 5905.03 23.07 0.00 0.00 21596.20 1681.14 23820.91
00:33:48.881 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:33:48.881 Verification LBA range: start 0x800 length 0x800
00:33:48.881 crypto_ram : 5.03 4814.31 18.81 0.00 0.00 26477.83 2478.97 29861.62
00:33:48.881 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:33:48.881 Verification LBA range: start 0x0 length 0x800
00:33:48.881 crypto_ram3 : 5.03 2977.17 11.63 0.00 0.00 42762.05 2165.54 29177.77
00:33:48.881 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:33:48.881 Verification LBA range: start 0x800 length 0x800
00:33:48.881 crypto_ram3 : 5.03 2415.68 9.44 0.00 0.00 52668.26 2208.28 34648.60
00:33:48.881 ===================================================================================================================
00:33:48.881 Total : 16112.19 62.94 0.00 0.00 31645.06 1681.14 34648.60
00:33:48.881 
00:33:48.881 real 0m5.814s
00:33:48.881 user 0m10.910s
00:33:48.881 sys 0m0.245s
00:33:48.881 00:28:35 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:48.881 00:28:35 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:33:48.881 ************************************
00:33:48.881 END TEST bdev_verify
00:33:48.881 ************************************
00:33:48.881 00:28:35 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0
00:33:48.881 00:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:48.881 00:28:35 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:33:48.881 00:28:35 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:48.881 00:28:35 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:48.881 ************************************ 00:33:48.881 START TEST bdev_verify_big_io 00:33:48.881 ************************************ 00:33:48.881 00:28:35 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:48.881 [2024-07-16 00:28:35.780138] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:33:48.881 [2024-07-16 00:28:35.780200] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3689399 ] 00:33:49.140 [2024-07-16 00:28:35.906182] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:49.140 [2024-07-16 00:28:36.004256] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:49.140 [2024-07-16 00:28:36.004261] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:49.399 [2024-07-16 00:28:36.176410] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:49.399 [2024-07-16 00:28:36.176483] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:49.399 [2024-07-16 00:28:36.176498] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:49.399 [2024-07-16 00:28:36.184429] 
vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2"
00:33:49.399 [2024-07-16 00:28:36.184448] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:33:49.399 [2024-07-16 00:28:36.184460] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:49.399 [2024-07-16 00:28:36.192452] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3"
00:33:49.399 [2024-07-16 00:28:36.192470] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2
00:33:49.399 [2024-07-16 00:28:36.192482] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:49.399 Running I/O for 5 seconds...
00:33:54.671 
00:33:54.671 Latency(us)
00:33:54.671 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:33:54.671 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:54.671 Verification LBA range: start 0x0 length 0x80
00:33:54.671 crypto_ram : 5.14 422.96 26.44 0.00 0.00 294843.36 6211.67 403017.91
00:33:54.671 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:54.671 Verification LBA range: start 0x80 length 0x80
00:33:54.671 crypto_ram : 5.25 341.64 21.35 0.00 0.00 363355.32 7294.44 496022.04
00:33:54.671 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:54.671 Verification LBA range: start 0x0 length 0x80
00:33:54.671 crypto_ram3 : 5.29 241.95 15.12 0.00 0.00 497426.22 5670.29 435842.89
00:33:54.671 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:54.671 Verification LBA range: start 0x80 length 0x80
00:33:54.671 crypto_ram3 : 5.35 191.32 11.96 0.00 0.00 618340.81 7123.48 514258.14
00:33:54.672 ===================================================================================================================
00:33:54.672 Total : 1197.87 74.87 0.00 0.00 408577.56 5670.29 514258.14
00:33:54.930 
00:33:54.930 real 0m6.141s
00:33:54.930 user 0m11.579s
00:33:54.930 sys 0m0.236s
00:33:54.930 00:28:41 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:54.930 00:28:41 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:33:54.930 ************************************
00:33:54.930 END TEST bdev_verify_big_io
00:33:54.930 ************************************
00:33:55.188 00:28:41 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0
00:33:55.188 00:28:41 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:33:55.188 00:28:41 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:33:55.188 00:28:41 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable
00:33:55.188 00:28:41 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:33:55.188 ************************************
00:33:55.188 START TEST bdev_write_zeroes
00:33:55.188 ************************************
00:33:55.188 00:28:41 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:33:55.188 [2024-07-16 00:28:42.014168] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization...
00:33:55.188 [2024-07-16 00:28:42.014231] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3690274 ] 00:33:55.446 [2024-07-16 00:28:42.146075] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:55.446 [2024-07-16 00:28:42.243487] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:55.703 [2024-07-16 00:28:42.413580] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:55.703 [2024-07-16 00:28:42.413651] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:55.703 [2024-07-16 00:28:42.413666] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:55.703 [2024-07-16 00:28:42.421599] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:55.703 [2024-07-16 00:28:42.421618] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:55.703 [2024-07-16 00:28:42.421630] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:55.703 [2024-07-16 00:28:42.429620] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:55.703 [2024-07-16 00:28:42.429638] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:55.703 [2024-07-16 00:28:42.429650] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:55.703 Running I/O for 1 seconds... 
00:33:56.640 
00:33:56.640 Latency(us)
00:33:56.640 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:33:56.640 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:56.640 crypto_ram : 1.01 26402.33 103.13 0.00 0.00 4836.58 1289.35 6496.61
00:33:56.640 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:56.640 crypto_ram3 : 1.01 13174.45 51.46 0.00 0.00 9644.71 6097.70 9801.91
00:33:56.640 ===================================================================================================================
00:33:56.640 Total : 39576.77 154.60 0.00 0.00 6439.29 1289.35 9801.91
00:33:56.899 
00:33:56.899 real 0m1.771s
00:33:56.899 user 0m1.544s
00:33:56.899 sys 0m0.208s
00:33:56.899 00:28:43 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:56.899 00:28:43 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:33:56.899 ************************************
00:33:56.899 END TEST bdev_write_zeroes
00:33:56.899 ************************************
00:33:56.899 00:28:43 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0
00:33:56.899 00:28:43 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:33:56.899 00:28:43 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:33:56.899 00:28:43 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable
00:33:56.899 00:28:43 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:33:56.899 ************************************
00:33:56.899 START TEST bdev_json_nonenclosed
00:33:56.899 ************************************
00:33:56.899 00:28:43 blockdev_crypto_sw.bdev_json_nonenclosed --
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:57.158 [2024-07-16 00:28:43.868174] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:33:57.158 [2024-07-16 00:28:43.868236] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3690475 ] 00:33:57.158 [2024-07-16 00:28:43.994916] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:57.158 [2024-07-16 00:28:44.095222] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:57.158 [2024-07-16 00:28:44.095292] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:33:57.158 [2024-07-16 00:28:44.095313] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:57.158 [2024-07-16 00:28:44.095326] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:57.418 00:33:57.418 real 0m0.395s 00:33:57.418 user 0m0.246s 00:33:57.418 sys 0m0.147s 00:33:57.418 00:28:44 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:33:57.418 00:28:44 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:57.418 00:28:44 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:33:57.418 ************************************ 00:33:57.418 END TEST bdev_json_nonenclosed 00:33:57.418 ************************************ 00:33:57.418 00:28:44 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:33:57.418 00:28:44 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # true 00:33:57.418 00:28:44 
blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:57.418 00:28:44 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:57.418 00:28:44 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:57.418 00:28:44 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:57.418 ************************************ 00:33:57.418 START TEST bdev_json_nonarray 00:33:57.418 ************************************ 00:33:57.418 00:28:44 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:57.418 [2024-07-16 00:28:44.344163] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:33:57.418 [2024-07-16 00:28:44.344225] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3690505 ] 00:33:57.677 [2024-07-16 00:28:44.475211] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:57.677 [2024-07-16 00:28:44.572487] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:57.677 [2024-07-16 00:28:44.572561] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:33:57.677 [2024-07-16 00:28:44.572583] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:57.677 [2024-07-16 00:28:44.572596] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:57.936 00:33:57.936 real 0m0.392s 00:33:57.936 user 0m0.230s 00:33:57.936 sys 0m0.158s 00:33:57.936 00:28:44 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:33:57.936 00:28:44 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:57.936 00:28:44 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:33:57.936 ************************************ 00:33:57.936 END TEST bdev_json_nonarray 00:33:57.936 ************************************ 00:33:57.936 00:28:44 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:33:57.936 00:28:44 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # true 00:33:57.936 00:28:44 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:33:57.936 00:28:44 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:33:57.936 00:28:44 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:33:57.936 00:28:44 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:33:57.936 00:28:44 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:33:57.936 00:28:44 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:57.936 00:28:44 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:57.936 ************************************ 00:33:57.936 START TEST bdev_crypto_enomem 00:33:57.936 ************************************ 00:33:57.936 00:28:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem 00:33:57.936 00:28:44 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local 
base_dev=base0 00:33:57.936 00:28:44 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:33:57.936 00:28:44 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:33:57.936 00:28:44 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:33:57.936 00:28:44 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=3690688 00:33:57.936 00:28:44 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:33:57.936 00:28:44 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:33:57.936 00:28:44 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # waitforlisten 3690688 00:33:57.936 00:28:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 3690688 ']' 00:33:57.936 00:28:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:57.936 00:28:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:57.936 00:28:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:57.936 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:57.936 00:28:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:57.936 00:28:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:57.936 [2024-07-16 00:28:44.818549] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:33:57.936 [2024-07-16 00:28:44.818615] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3690688 ] 00:33:58.196 [2024-07-16 00:28:44.952074] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:58.196 [2024-07-16 00:28:45.069455] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:59.132 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:59.132 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0 00:33:59.132 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:33:59.132 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:59.132 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:59.132 true 00:33:59.132 base0 00:33:59.132 true 00:33:59.132 [2024-07-16 00:28:46.055753] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:59.132 crypt0 00:33:59.132 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:59.132 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:33:59.132 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0 00:33:59.132 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:59.132 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i 00:33:59.132 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:59.132 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000
00:33:59.132 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine
00:33:59.132 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable
00:33:59.132 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:33:59.132 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:33:59.132 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000
00:33:59.132 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable
00:33:59.132 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:33:59.132 [
00:33:59.132 {
00:33:59.132 "name": "crypt0",
00:33:59.132 "aliases": [
00:33:59.132 "df581526-0fb0-5ff0-8d84-a8348572fbf5"
00:33:59.132 ],
00:33:59.132 "product_name": "crypto",
00:33:59.132 "block_size": 512,
00:33:59.132 "num_blocks": 2097152,
00:33:59.132 "uuid": "df581526-0fb0-5ff0-8d84-a8348572fbf5",
00:33:59.391 "assigned_rate_limits": {
00:33:59.391 "rw_ios_per_sec": 0,
00:33:59.391 "rw_mbytes_per_sec": 0,
00:33:59.391 "r_mbytes_per_sec": 0,
00:33:59.391 "w_mbytes_per_sec": 0
00:33:59.391 },
00:33:59.391 "claimed": false,
00:33:59.391 "zoned": false,
00:33:59.391 "supported_io_types": {
00:33:59.391 "read": true,
00:33:59.391 "write": true,
00:33:59.391 "unmap": false,
00:33:59.391 "flush": false,
00:33:59.391 "reset": true,
00:33:59.391 "nvme_admin": false,
00:33:59.391 "nvme_io": false,
00:33:59.391 "nvme_io_md": false,
00:33:59.391 "write_zeroes": true,
00:33:59.391 "zcopy": false,
00:33:59.391 "get_zone_info": false,
00:33:59.391 "zone_management": false,
00:33:59.391 "zone_append": false,
00:33:59.391 "compare": false,
00:33:59.391 "compare_and_write": false,
00:33:59.391 "abort": false,
00:33:59.391 "seek_hole": false,
00:33:59.391 "seek_data": false,
00:33:59.391 "copy": false,
00:33:59.391 "nvme_iov_md": false
00:33:59.391 },
00:33:59.391 "memory_domains": [
00:33:59.391 {
00:33:59.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:33:59.391 "dma_device_type": 2
00:33:59.391 }
00:33:59.391 ],
00:33:59.391 "driver_specific": {
00:33:59.391 "crypto": {
00:33:59.391 "base_bdev_name": "EE_base0",
00:33:59.391 "name": "crypt0",
00:33:59.391 "key_name": "test_dek_sw"
00:33:59.391 }
00:33:59.391 }
00:33:59.391 }
00:33:59.391 ]
00:33:59.391 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:33:59.391 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0
00:33:59.391 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=3690853
00:33:59.391 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1
00:33:59.391 00:28:46 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests
00:33:59.391 Running I/O for 5 seconds...
00:34:00.327 00:28:47 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem
00:34:00.327 00:28:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable
00:34:00.327 00:28:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:34:00.327 00:28:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:34:00.327 00:28:47 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 3690853
00:34:04.580 
00:34:04.580 Latency(us)
00:34:04.580 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:04.580 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096)
00:34:04.580 crypt0 : 5.00 28039.52 109.53 0.00 0.00 1136.19 527.14 2364.99
00:34:04.580 ===================================================================================================================
00:34:04.580 Total : 28039.52 109.53 0.00 0.00 1136.19 527.14 2364.99
00:34:04.580 0
00:34:04.580 00:28:51 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0
00:34:04.580 00:28:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable
00:34:04.580 00:28:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:34:04.580 00:28:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:34:04.580 00:28:51 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 3690688
00:34:04.580 00:28:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 3690688 ']'
00:34:04.580 00:28:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 3690688
00:34:04.580 00:28:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname
00:34:04.580 00:28:51
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:34:04.581 00:28:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3690688
00:34:04.581 00:28:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1
00:34:04.581 00:28:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']'
00:34:04.581 00:28:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3690688'
00:34:04.581 killing process with pid 3690688
00:34:04.581 00:28:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 3690688
00:34:04.581 Received shutdown signal, test time was about 5.000000 seconds
00:34:04.581 
00:34:04.581 Latency(us)
00:34:04.581 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:04.581 ===================================================================================================================
00:34:04.581 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:34:04.581 00:28:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 3690688
00:34:04.840 00:28:51 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT
00:34:04.840 
00:34:04.840 real 0m6.913s
00:34:04.840 user 0m7.468s
00:34:04.840 sys 0m0.466s
00:34:04.840 00:28:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:04.840 00:28:51 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:34:04.840 ************************************
00:34:04.840 END TEST bdev_crypto_enomem
00:34:04.840 ************************************
00:34:04.840 00:28:51 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0
00:34:04.840 00:28:51 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap -
SIGINT SIGTERM EXIT 00:34:04.840 00:28:51 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:34:04.840 00:28:51 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:34:04.840 00:28:51 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:04.840 00:28:51 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:34:04.840 00:28:51 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:34:04.840 00:28:51 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:34:04.840 00:28:51 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:34:04.840 00:34:04.840 real 0m56.005s 00:34:04.840 user 1m36.107s 00:34:04.840 sys 0m6.975s 00:34:04.840 00:28:51 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:04.840 00:28:51 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:04.840 ************************************ 00:34:04.840 END TEST blockdev_crypto_sw 00:34:04.840 ************************************ 00:34:04.840 00:28:51 -- common/autotest_common.sh@1142 -- # return 0 00:34:04.840 00:28:51 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:34:04.840 00:28:51 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:04.840 00:28:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:04.840 00:28:51 -- common/autotest_common.sh@10 -- # set +x 00:34:05.098 ************************************ 00:34:05.098 START TEST blockdev_crypto_qat 00:34:05.098 ************************************ 00:34:05.098 00:28:51 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:34:05.098 * Looking for test storage... 
00:34:05.098 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # 
env_ctx= 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:34:05.098 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:34:05.099 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=3691631 00:34:05.099 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:34:05.099 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:34:05.099 00:28:51 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 3691631 00:34:05.099 00:28:51 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 3691631 ']' 00:34:05.099 00:28:51 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:05.099 00:28:51 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:05.099 00:28:51 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:05.099 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:05.099 00:28:51 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:05.099 00:28:51 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:05.099 [2024-07-16 00:28:52.002563] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:34:05.099 [2024-07-16 00:28:52.002624] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3691631 ] 00:34:05.356 [2024-07-16 00:28:52.116373] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:05.356 [2024-07-16 00:28:52.222421] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:05.922 00:28:52 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:05.923 00:28:52 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0 00:34:05.923 00:28:52 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:34:05.923 00:28:52 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:34:05.923 00:28:52 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:34:05.923 00:28:52 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:05.923 00:28:52 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:06.181 [2024-07-16 00:28:52.884533] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:06.181 [2024-07-16 00:28:52.892569] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:06.181 [2024-07-16 00:28:52.900583] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:06.181 [2024-07-16 00:28:52.978842] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:08.712 true 00:34:08.712 true 00:34:08.712 true 00:34:08.712 true 00:34:08.712 Malloc0 00:34:08.712 Malloc1 00:34:08.712 Malloc2 00:34:08.712 Malloc3 00:34:08.712 [2024-07-16 00:28:55.354897] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 
00:34:08.712 crypto_ram 00:34:08.712 [2024-07-16 00:28:55.362910] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:08.712 crypto_ram1 00:34:08.712 [2024-07-16 00:28:55.370939] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:08.712 crypto_ram2 00:34:08.712 [2024-07-16 00:28:55.378962] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:08.712 crypto_ram3 00:34:08.712 [ 00:34:08.712 { 00:34:08.712 "name": "Malloc1", 00:34:08.712 "aliases": [ 00:34:08.712 "48e4094a-64f5-48d1-b0f8-730c335e8205" 00:34:08.712 ], 00:34:08.712 "product_name": "Malloc disk", 00:34:08.712 "block_size": 512, 00:34:08.712 "num_blocks": 65536, 00:34:08.712 "uuid": "48e4094a-64f5-48d1-b0f8-730c335e8205", 00:34:08.712 "assigned_rate_limits": { 00:34:08.712 "rw_ios_per_sec": 0, 00:34:08.712 "rw_mbytes_per_sec": 0, 00:34:08.712 "r_mbytes_per_sec": 0, 00:34:08.712 "w_mbytes_per_sec": 0 00:34:08.712 }, 00:34:08.712 "claimed": true, 00:34:08.712 "claim_type": "exclusive_write", 00:34:08.712 "zoned": false, 00:34:08.712 "supported_io_types": { 00:34:08.712 "read": true, 00:34:08.712 "write": true, 00:34:08.712 "unmap": true, 00:34:08.712 "flush": true, 00:34:08.712 "reset": true, 00:34:08.712 "nvme_admin": false, 00:34:08.712 "nvme_io": false, 00:34:08.712 "nvme_io_md": false, 00:34:08.712 "write_zeroes": true, 00:34:08.712 "zcopy": true, 00:34:08.712 "get_zone_info": false, 00:34:08.712 "zone_management": false, 00:34:08.712 "zone_append": false, 00:34:08.712 "compare": false, 00:34:08.712 "compare_and_write": false, 00:34:08.712 "abort": true, 00:34:08.712 "seek_hole": false, 00:34:08.712 "seek_data": false, 00:34:08.712 "copy": true, 00:34:08.712 "nvme_iov_md": false 00:34:08.712 }, 00:34:08.712 "memory_domains": [ 00:34:08.712 { 00:34:08.712 "dma_device_id": "system", 00:34:08.712 "dma_device_type": 1 00:34:08.712 }, 00:34:08.712 { 00:34:08.712 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:34:08.712 "dma_device_type": 2 00:34:08.712 } 00:34:08.712 ], 00:34:08.712 "driver_specific": {} 00:34:08.712 } 00:34:08.712 ] 00:34:08.712 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:08.712 00:28:55 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:34:08.712 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:08.712 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:08.712 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:08.712 00:28:55 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:34:08.712 00:28:55 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:34:08.712 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:08.712 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:08.712 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:08.712 00:28:55 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:34:08.712 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:08.712 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:08.712 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:08.712 00:28:55 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:34:08.712 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:08.712 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:08.712 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:08.712 00:28:55 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:34:08.712 00:28:55 blockdev_crypto_qat -- 
bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:34:08.713 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:08.713 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:08.713 00:28:55 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:34:08.713 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:08.713 00:28:55 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:34:08.713 00:28:55 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:34:08.713 00:28:55 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "5e8c22ad-dfcb-54ec-ba61-073a0b06423d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "5e8c22ad-dfcb-54ec-ba61-073a0b06423d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' 
"aliases": [' ' "85dcdbff-b2a1-5d6d-adf1-311414532d0b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "85dcdbff-b2a1-5d6d-adf1-311414532d0b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "35c9e89b-751c-58d0-be56-c23467f77519"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "35c9e89b-751c-58d0-be56-c23467f77519",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' 
"abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "d184b44e-f8ed-5d29-a520-5a48df216945"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "d184b44e-f8ed-5d29-a520-5a48df216945",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:34:08.972 00:28:55 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:34:08.972 00:28:55 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:34:08.972 
00:28:55 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:34:08.972 00:28:55 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 3691631 00:34:08.972 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 3691631 ']' 00:34:08.972 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 3691631 00:34:08.972 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:34:08.972 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:08.972 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3691631 00:34:08.972 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:08.972 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:08.972 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3691631' 00:34:08.972 killing process with pid 3691631 00:34:08.972 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 3691631 00:34:08.972 00:28:55 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 3691631 00:34:09.539 00:28:56 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:09.539 00:28:56 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:34:09.539 00:28:56 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:34:09.539 00:28:56 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:09.539 00:28:56 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:09.539 ************************************ 00:34:09.539 START TEST bdev_hello_world 00:34:09.539 
************************************ 00:34:09.539 00:28:56 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:34:09.539 [2024-07-16 00:28:56.388364] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:34:09.539 [2024-07-16 00:28:56.388427] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3692174 ] 00:34:09.798 [2024-07-16 00:28:56.517240] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:09.798 [2024-07-16 00:28:56.613687] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:09.798 [2024-07-16 00:28:56.634978] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:09.798 [2024-07-16 00:28:56.642991] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:09.798 [2024-07-16 00:28:56.651017] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:10.056 [2024-07-16 00:28:56.756647] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:12.586 [2024-07-16 00:28:58.967819] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:12.586 [2024-07-16 00:28:58.967890] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:12.586 [2024-07-16 00:28:58.967905] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:12.586 [2024-07-16 00:28:58.975837] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: 
*NOTICE*: Found key "test_dek_qat_xts" 00:34:12.586 [2024-07-16 00:28:58.975856] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:12.586 [2024-07-16 00:28:58.975868] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:12.586 [2024-07-16 00:28:58.983857] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:12.586 [2024-07-16 00:28:58.983876] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:12.586 [2024-07-16 00:28:58.983888] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:12.586 [2024-07-16 00:28:58.991876] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:12.586 [2024-07-16 00:28:58.991893] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:12.586 [2024-07-16 00:28:58.991905] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:12.586 [2024-07-16 00:28:59.064529] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:34:12.586 [2024-07-16 00:28:59.064571] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:34:12.586 [2024-07-16 00:28:59.064590] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:34:12.586 [2024-07-16 00:28:59.065854] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:34:12.586 [2024-07-16 00:28:59.065924] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:34:12.586 [2024-07-16 00:28:59.065951] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:34:12.586 [2024-07-16 00:28:59.065995] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:34:12.586 00:34:12.586 [2024-07-16 00:28:59.066013] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:34:12.586 00:34:12.586 real 0m3.097s 00:34:12.586 user 0m2.706s 00:34:12.586 sys 0m0.353s 00:34:12.586 00:28:59 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:12.586 00:28:59 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:34:12.586 ************************************ 00:34:12.586 END TEST bdev_hello_world 00:34:12.586 ************************************ 00:34:12.586 00:28:59 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:12.586 00:28:59 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:34:12.586 00:28:59 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:12.586 00:28:59 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:12.586 00:28:59 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:12.586 ************************************ 00:34:12.586 START TEST bdev_bounds 00:34:12.586 ************************************ 00:34:12.586 00:28:59 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:34:12.586 00:28:59 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=3692544 00:34:12.586 00:28:59 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:34:12.586 00:28:59 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:34:12.586 00:28:59 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 3692544' 00:34:12.586 Process bdevio pid: 3692544 00:34:12.586 00:28:59 blockdev_crypto_qat.bdev_bounds -- 
bdev/blockdev.sh@293 -- # waitforlisten 3692544 00:34:12.586 00:28:59 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 3692544 ']' 00:34:12.586 00:28:59 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:12.586 00:28:59 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:12.586 00:28:59 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:12.586 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:12.586 00:28:59 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:12.586 00:28:59 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:12.845 [2024-07-16 00:28:59.623682] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:34:12.845 [2024-07-16 00:28:59.623819] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3692544 ] 00:34:13.103 [2024-07-16 00:28:59.813058] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:34:13.103 [2024-07-16 00:28:59.916847] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:13.103 [2024-07-16 00:28:59.916961] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:34:13.103 [2024-07-16 00:28:59.916964] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:13.103 [2024-07-16 00:28:59.938550] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:13.103 [2024-07-16 00:28:59.946577] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:13.103 [2024-07-16 00:28:59.954598] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:13.361 [2024-07-16 00:29:00.068519] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:15.893 [2024-07-16 00:29:02.278039] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:15.893 [2024-07-16 00:29:02.278117] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:15.893 [2024-07-16 00:29:02.278133] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:15.893 [2024-07-16 00:29:02.286059] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:15.893 [2024-07-16 00:29:02.286079] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:15.893 [2024-07-16 00:29:02.286091] 
vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:15.893 [2024-07-16 00:29:02.294079] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:15.893 [2024-07-16 00:29:02.294096] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:15.893 [2024-07-16 00:29:02.294108] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:15.893 [2024-07-16 00:29:02.302106] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:15.893 [2024-07-16 00:29:02.302123] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:15.893 [2024-07-16 00:29:02.302135] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:15.893 00:29:02 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:15.893 00:29:02 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:34:15.893 00:29:02 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:34:15.893 I/O targets: 00:34:15.893 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:34:15.893 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:34:15.893 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:34:15.893 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:34:15.893 00:34:15.893 00:34:15.893 CUnit - A unit testing framework for C - Version 2.1-3 00:34:15.893 http://cunit.sourceforge.net/ 00:34:15.893 00:34:15.893 00:34:15.893 Suite: bdevio tests on: crypto_ram3 00:34:15.893 Test: blockdev write read block ...passed 00:34:15.893 Test: blockdev write zeroes read block ...passed 00:34:15.893 Test: blockdev write zeroes read no split ...passed 00:34:15.893 Test: blockdev write zeroes read split 
...passed 00:34:15.893 Test: blockdev write zeroes read split partial ...passed 00:34:15.893 Test: blockdev reset ...passed 00:34:15.893 Test: blockdev write read 8 blocks ...passed 00:34:15.893 Test: blockdev write read size > 128k ...passed 00:34:15.893 Test: blockdev write read invalid size ...passed 00:34:15.893 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:15.893 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:15.893 Test: blockdev write read max offset ...passed 00:34:15.893 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:15.893 Test: blockdev writev readv 8 blocks ...passed 00:34:15.893 Test: blockdev writev readv 30 x 1block ...passed 00:34:15.893 Test: blockdev writev readv block ...passed 00:34:15.893 Test: blockdev writev readv size > 128k ...passed 00:34:15.893 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:15.893 Test: blockdev comparev and writev ...passed 00:34:15.893 Test: blockdev nvme passthru rw ...passed 00:34:15.893 Test: blockdev nvme passthru vendor specific ...passed 00:34:15.893 Test: blockdev nvme admin passthru ...passed 00:34:15.893 Test: blockdev copy ...passed 00:34:15.893 Suite: bdevio tests on: crypto_ram2 00:34:15.893 Test: blockdev write read block ...passed 00:34:15.893 Test: blockdev write zeroes read block ...passed 00:34:15.893 Test: blockdev write zeroes read no split ...passed 00:34:15.893 Test: blockdev write zeroes read split ...passed 00:34:15.893 Test: blockdev write zeroes read split partial ...passed 00:34:15.893 Test: blockdev reset ...passed 00:34:15.893 Test: blockdev write read 8 blocks ...passed 00:34:15.893 Test: blockdev write read size > 128k ...passed 00:34:15.893 Test: blockdev write read invalid size ...passed 00:34:15.893 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:15.893 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:15.893 Test: 
blockdev write read max offset ...passed 00:34:15.893 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:15.893 Test: blockdev writev readv 8 blocks ...passed 00:34:15.893 Test: blockdev writev readv 30 x 1block ...passed 00:34:15.893 Test: blockdev writev readv block ...passed 00:34:15.893 Test: blockdev writev readv size > 128k ...passed 00:34:15.893 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:15.893 Test: blockdev comparev and writev ...passed 00:34:15.893 Test: blockdev nvme passthru rw ...passed 00:34:15.893 Test: blockdev nvme passthru vendor specific ...passed 00:34:15.893 Test: blockdev nvme admin passthru ...passed 00:34:15.893 Test: blockdev copy ...passed 00:34:15.893 Suite: bdevio tests on: crypto_ram1 00:34:15.893 Test: blockdev write read block ...passed 00:34:15.893 Test: blockdev write zeroes read block ...passed 00:34:15.893 Test: blockdev write zeroes read no split ...passed 00:34:15.893 Test: blockdev write zeroes read split ...passed 00:34:16.153 Test: blockdev write zeroes read split partial ...passed 00:34:16.153 Test: blockdev reset ...passed 00:34:16.153 Test: blockdev write read 8 blocks ...passed 00:34:16.153 Test: blockdev write read size > 128k ...passed 00:34:16.153 Test: blockdev write read invalid size ...passed 00:34:16.153 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:16.153 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:16.153 Test: blockdev write read max offset ...passed 00:34:16.153 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:16.153 Test: blockdev writev readv 8 blocks ...passed 00:34:16.153 Test: blockdev writev readv 30 x 1block ...passed 00:34:16.153 Test: blockdev writev readv block ...passed 00:34:16.153 Test: blockdev writev readv size > 128k ...passed 00:34:16.153 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:16.153 Test: blockdev comparev and writev 
...passed 00:34:16.153 Test: blockdev nvme passthru rw ...passed 00:34:16.153 Test: blockdev nvme passthru vendor specific ...passed 00:34:16.153 Test: blockdev nvme admin passthru ...passed 00:34:16.153 Test: blockdev copy ...passed 00:34:16.153 Suite: bdevio tests on: crypto_ram 00:34:16.153 Test: blockdev write read block ...passed 00:34:16.153 Test: blockdev write zeroes read block ...passed 00:34:16.153 Test: blockdev write zeroes read no split ...passed 00:34:16.412 Test: blockdev write zeroes read split ...passed 00:34:16.412 Test: blockdev write zeroes read split partial ...passed 00:34:16.412 Test: blockdev reset ...passed 00:34:16.412 Test: blockdev write read 8 blocks ...passed 00:34:16.412 Test: blockdev write read size > 128k ...passed 00:34:16.412 Test: blockdev write read invalid size ...passed 00:34:16.412 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:16.412 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:16.412 Test: blockdev write read max offset ...passed 00:34:16.412 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:16.412 Test: blockdev writev readv 8 blocks ...passed 00:34:16.412 Test: blockdev writev readv 30 x 1block ...passed 00:34:16.412 Test: blockdev writev readv block ...passed 00:34:16.412 Test: blockdev writev readv size > 128k ...passed 00:34:16.412 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:16.412 Test: blockdev comparev and writev ...passed 00:34:16.412 Test: blockdev nvme passthru rw ...passed 00:34:16.412 Test: blockdev nvme passthru vendor specific ...passed 00:34:16.412 Test: blockdev nvme admin passthru ...passed 00:34:16.412 Test: blockdev copy ...passed 00:34:16.412 00:34:16.412 Run Summary: Type Total Ran Passed Failed Inactive 00:34:16.412 suites 4 4 n/a 0 0 00:34:16.412 tests 92 92 92 0 0 00:34:16.412 asserts 520 520 520 0 n/a 00:34:16.412 00:34:16.412 Elapsed time = 1.568 seconds 00:34:16.412 0 00:34:16.412 
00:29:03 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 3692544 00:34:16.412 00:29:03 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 3692544 ']' 00:34:16.412 00:29:03 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 3692544 00:34:16.412 00:29:03 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:34:16.412 00:29:03 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:16.412 00:29:03 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3692544 00:34:16.671 00:29:03 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:16.671 00:29:03 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:16.671 00:29:03 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3692544' 00:34:16.671 killing process with pid 3692544 00:34:16.671 00:29:03 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 3692544 00:34:16.671 00:29:03 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 3692544 00:34:16.930 00:29:03 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:34:16.930 00:34:16.930 real 0m4.274s 00:34:16.930 user 0m11.217s 00:34:16.930 sys 0m0.667s 00:34:16.930 00:29:03 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:16.930 00:29:03 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:16.930 ************************************ 00:34:16.930 END TEST bdev_bounds 00:34:16.930 ************************************ 00:34:16.930 00:29:03 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:16.930 00:29:03 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd 
nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:34:16.930 00:29:03 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:34:16.930 00:29:03 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:16.930 00:29:03 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:16.930 ************************************ 00:34:16.930 START TEST bdev_nbd 00:34:16.930 ************************************ 00:34:16.930 00:29:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:34:16.930 00:29:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:34:16.930 00:29:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:34:16.930 00:29:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:16.930 00:29:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:34:17.190 00:29:03 
blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=3693105 00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 3693105 /var/tmp/spdk-nbd.sock 00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 3693105 ']' 00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:34:17.190 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
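At this point the trace starts `bdev_svc` and then calls `waitforlisten`, printing "Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...". A simplified sketch of that polling idea (this is not SPDK's actual `waitforlisten` implementation; the function name, retry count, and interval here are assumptions for illustration):

```shell
# Sketch: poll until a UNIX domain socket path appears, as waitforlisten
# does for /var/tmp/spdk-nbd.sock in the log above (simplified).
wait_for_sock() {
    local sock=$1 retries=${2:-100}
    local i
    for ((i = 0; i < retries; i++)); do
        [ -S "$sock" ] && return 0   # -S: path exists and is a socket
        sleep 0.1
    done
    return 1
}

wait_for_sock /var/tmp/spdk-nbd.sock 3 || echo "socket never appeared"
```

The real helper additionally checks that the target PID is still alive while waiting, so a crashed daemon fails fast instead of timing out.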
00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:17.190 00:29:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:34:17.190 [2024-07-16 00:29:03.943027] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:34:17.190 [2024-07-16 00:29:03.943096] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:34:17.190 [2024-07-16 00:29:04.074166] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:17.450 [2024-07-16 00:29:04.176724] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:17.450 [2024-07-16 00:29:04.198047] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:17.450 [2024-07-16 00:29:04.206070] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:17.450 [2024-07-16 00:29:04.214090] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:17.450 [2024-07-16 00:29:04.318380] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:19.984 [2024-07-16 00:29:06.533045] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:19.984 [2024-07-16 00:29:06.533113] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:19.984 [2024-07-16 00:29:06.533128] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:19.984 [2024-07-16 00:29:06.541065] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:19.984 [2024-07-16 00:29:06.541085] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: Malloc1 00:34:19.984 [2024-07-16 00:29:06.541098] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:19.984 [2024-07-16 00:29:06.549086] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:19.984 [2024-07-16 00:29:06.549104] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:19.984 [2024-07-16 00:29:06.549115] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:19.984 [2024-07-16 00:29:06.557107] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:19.984 [2024-07-16 00:29:06.557125] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:19.984 [2024-07-16 00:29:06.557137] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # 
local rpc_server=/var/tmp/spdk-nbd.sock 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:19.984 
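The `waitfornbd` calls traced above loop on `grep -q -w nbdX /proc/partitions` and then `dd` one block from the device to confirm it is usable. A sketch of the grep half of that check, run here against a sample `/proc/partitions` snippet rather than the live file (the block counts and major/minor numbers below are made up for illustration):

```shell
# Simplified version of the waitfornbd presence check seen in the trace:
# the device counts as registered once its name appears in /proc/partitions.
partitions='major minor  #blocks  name

  43        0      65536 nbd0
  43       32      65536 nbd1'

# -w matches the whole word, so "nbd0" does not also match "nbd10".
if printf '%s\n' "$partitions" | grep -q -w nbd0; then
    echo "nbd0 is registered"
fi
```

The word-boundary match matters once higher-numbered devices like `/dev/nbd10` are in play, as they are later in this same test.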
00:29:06 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:19.984 1+0 records in 00:34:19.984 1+0 records out 00:34:19.984 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000300436 s, 13.6 MB/s 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:19.984 00:29:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:34:20.242 00:29:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:34:20.242 00:29:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:34:20.242 00:29:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:34:20.243 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:34:20.243 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:20.243 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:20.243 00:29:07 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:20.243 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:34:20.243 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:20.243 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:20.243 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:20.243 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:20.243 1+0 records in 00:34:20.243 1+0 records out 00:34:20.243 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000210326 s, 19.5 MB/s 00:34:20.243 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:20.243 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:20.243 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:20.243 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:20.243 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:20.243 00:29:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:20.243 00:29:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:20.243 00:29:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- 
# basename /dev/nbd2 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:20.809 1+0 records in 00:34:20.809 1+0 records out 00:34:20.809 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000306535 s, 13.4 MB/s 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # 
(( i++ )) 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:20.809 1+0 records in 00:34:20.809 1+0 records out 00:34:20.809 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000353095 s, 11.6 MB/s 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:34:20.809 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:21.068 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:21.068 00:29:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:21.068 00:29:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:21.068 00:29:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:21.068 00:29:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:21.636 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:34:21.636 { 00:34:21.636 "nbd_device": "/dev/nbd0", 00:34:21.636 "bdev_name": "crypto_ram" 00:34:21.636 }, 00:34:21.636 { 00:34:21.636 "nbd_device": "/dev/nbd1", 00:34:21.636 "bdev_name": "crypto_ram1" 00:34:21.636 }, 00:34:21.636 { 00:34:21.636 "nbd_device": "/dev/nbd2", 00:34:21.636 "bdev_name": "crypto_ram2" 00:34:21.636 }, 00:34:21.636 { 00:34:21.636 "nbd_device": "/dev/nbd3", 00:34:21.636 "bdev_name": "crypto_ram3" 00:34:21.636 } 00:34:21.636 ]' 00:34:21.636 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:34:21.636 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:34:21.636 { 00:34:21.636 "nbd_device": "/dev/nbd0", 00:34:21.636 "bdev_name": "crypto_ram" 00:34:21.636 }, 00:34:21.636 { 00:34:21.636 "nbd_device": "/dev/nbd1", 00:34:21.636 "bdev_name": "crypto_ram1" 00:34:21.636 }, 00:34:21.636 { 00:34:21.636 "nbd_device": "/dev/nbd2", 00:34:21.636 "bdev_name": "crypto_ram2" 00:34:21.636 }, 00:34:21.636 { 00:34:21.636 "nbd_device": "/dev/nbd3", 00:34:21.636 "bdev_name": 
"crypto_ram3" 00:34:21.636 } 00:34:21.636 ]' 00:34:21.636 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:34:21.636 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:34:21.636 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:21.636 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:34:21.636 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:21.636 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:21.636 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:21.637 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:21.896 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:21.896 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:21.896 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:21.896 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:21.896 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:21.896 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:21.896 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:21.896 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:21.896 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:21.896 00:29:08 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:34:22.195 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:22.195 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:22.195 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:22.195 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:22.195 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:22.195 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:22.195 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:22.195 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:22.195 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:22.195 00:29:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:34:22.486 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:34:22.486 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:34:22.486 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:34:22.486 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:22.486 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:22.486 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:34:22.486 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:22.486 00:29:09 blockdev_crypto_qat.bdev_nbd 
-- bdev/nbd_common.sh@45 -- # return 0 00:34:22.486 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:22.486 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:34:22.486 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:34:22.486 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:34:22.486 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:34:22.486 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:22.486 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:22.486 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:34:22.745 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:22.745 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:22.745 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:22.745 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:22.745 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:22.745 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:34:22.745 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@65 -- # echo '' 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:23.004 00:29:09 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:23.004 00:29:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:34:23.263 /dev/nbd0 00:34:23.263 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:34:23.263 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:34:23.263 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:34:23.263 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:23.263 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:23.263 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:23.263 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:34:23.263 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:23.263 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:23.263 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:23.263 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:23.263 1+0 records in 00:34:23.263 1+0 records out 00:34:23.263 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00114935 s, 3.6 MB/s 00:34:23.263 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:23.263 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:23.263 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:23.263 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:23.263 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:23.263 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:23.263 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:23.263 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:34:23.522 /dev/nbd1 00:34:23.522 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:34:23.522 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:34:23.522 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:34:23.522 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:23.522 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:23.522 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:23.522 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 
/proc/partitions 00:34:23.522 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:23.522 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:23.522 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:23.522 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:23.522 1+0 records in 00:34:23.522 1+0 records out 00:34:23.522 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256818 s, 15.9 MB/s 00:34:23.522 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:23.522 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:23.522 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:23.522 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:23.522 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:23.522 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:23.522 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:23.522 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:34:23.781 /dev/nbd10 00:34:23.781 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:34:23.781 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:34:23.781 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd10 00:34:23.781 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:23.781 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:23.781 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:23.781 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:34:23.781 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:23.781 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:23.781 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:23.781 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:23.781 1+0 records in 00:34:23.781 1+0 records out 00:34:23.781 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000331346 s, 12.4 MB/s 00:34:23.782 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:23.782 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:23.782 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:23.782 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:23.782 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:23.782 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:23.782 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:23.782 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:34:24.041 /dev/nbd11 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:24.041 1+0 records in 00:34:24.041 1+0 records out 00:34:24.041 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000358014 s, 11.4 MB/s 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:24.041 00:29:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:24.300 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:34:24.300 { 00:34:24.300 "nbd_device": "/dev/nbd0", 00:34:24.300 "bdev_name": "crypto_ram" 00:34:24.300 }, 00:34:24.300 { 00:34:24.300 "nbd_device": "/dev/nbd1", 00:34:24.300 "bdev_name": "crypto_ram1" 00:34:24.300 }, 00:34:24.300 { 00:34:24.300 "nbd_device": "/dev/nbd10", 00:34:24.300 "bdev_name": "crypto_ram2" 00:34:24.300 }, 00:34:24.300 { 00:34:24.300 "nbd_device": "/dev/nbd11", 00:34:24.300 "bdev_name": "crypto_ram3" 00:34:24.300 } 00:34:24.300 ]' 00:34:24.300 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:34:24.300 { 00:34:24.300 "nbd_device": "/dev/nbd0", 00:34:24.300 "bdev_name": "crypto_ram" 00:34:24.300 }, 00:34:24.300 { 00:34:24.300 "nbd_device": "/dev/nbd1", 00:34:24.300 "bdev_name": "crypto_ram1" 00:34:24.300 }, 00:34:24.300 { 00:34:24.300 "nbd_device": "/dev/nbd10", 00:34:24.300 "bdev_name": "crypto_ram2" 00:34:24.300 }, 00:34:24.300 { 00:34:24.300 "nbd_device": "/dev/nbd11", 00:34:24.300 "bdev_name": "crypto_ram3" 00:34:24.300 } 00:34:24.300 ]' 00:34:24.300 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:24.300 00:29:11 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:34:24.300 /dev/nbd1 00:34:24.300 /dev/nbd10 00:34:24.300 /dev/nbd11' 00:34:24.300 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:34:24.300 /dev/nbd1 00:34:24.300 /dev/nbd10 00:34:24.300 /dev/nbd11' 00:34:24.300 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:24.300 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:34:24.300 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:34:24.300 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:34:24.300 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:34:24.300 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:34:24.300 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:24.300 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:34:24.300 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:34:24.300 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:24.300 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:34:24.300 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:34:24.300 256+0 records in 00:34:24.300 256+0 records out 00:34:24.300 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00646118 s, 162 MB/s 00:34:24.300 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:24.300 
00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:34:24.561 256+0 records in 00:34:24.561 256+0 records out 00:34:24.561 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0834388 s, 12.6 MB/s 00:34:24.561 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:24.561 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:34:24.561 256+0 records in 00:34:24.561 256+0 records out 00:34:24.561 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0654749 s, 16.0 MB/s 00:34:24.561 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:24.561 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:34:24.561 256+0 records in 00:34:24.561 256+0 records out 00:34:24.561 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0570567 s, 18.4 MB/s 00:34:24.561 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:24.561 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:34:24.561 256+0 records in 00:34:24.561 256+0 records out 00:34:24.561 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0502065 s, 20.9 MB/s 00:34:24.561 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:34:24.561 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:24.561 00:29:11 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:34:24.561 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:34:24.561 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:24.561 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:34:24.561 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:34:24.561 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:24.561 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:34:24.561 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:24.561 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:34:24.819 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:24.819 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:34:24.819 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:24.819 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:34:24.819 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:24.820 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 
00:34:24.820 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:24.820 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:24.820 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:24.820 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:24.820 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:24.820 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:25.078 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:25.078 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:25.078 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:25.078 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:25.078 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:25.078 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:25.078 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:25.078 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:25.078 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:25.078 00:29:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:34:25.643 00:29:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:25.643 00:29:12 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:25.643 00:29:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:25.643 00:29:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:25.643 00:29:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:25.643 00:29:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:25.643 00:29:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:25.643 00:29:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:25.643 00:29:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:25.643 00:29:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:34:25.901 00:29:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:34:25.901 00:29:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:34:25.901 00:29:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:34:25.901 00:29:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:25.901 00:29:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:25.901 00:29:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:34:25.901 00:29:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:25.901 00:29:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:25.901 00:29:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:25.901 00:29:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd11 00:34:26.467 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:34:26.467 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:34:26.467 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:34:26.467 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:26.467 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:26.467 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:34:26.467 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:26.467 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:26.467 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:26.467 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:26.467 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:26.467 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:34:26.467 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:34:26.467 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:26.726 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:34:26.726 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:34:26.726 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:26.726 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:34:26.726 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 
00:34:26.726 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:34:26.726 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:34:26.726 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:34:26.726 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:34:26.726 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:26.726 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:26.726 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:26.726 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:34:26.726 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:34:26.726 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:34:26.726 malloc_lvol_verify 00:34:26.726 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:34:26.983 c4d2581d-cf00-4673-ac8c-71e5ab5aafb8 00:34:26.983 00:29:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:34:27.240 830a27e7-424d-49f2-9083-b052d6a5c403 00:34:27.240 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:34:27.498 /dev/nbd0 
00:34:27.498 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:34:27.498 mke2fs 1.46.5 (30-Dec-2021) 00:34:27.498 Discarding device blocks: 0/4096 done 00:34:27.498 Creating filesystem with 4096 1k blocks and 1024 inodes 00:34:27.498 00:34:27.498 Allocating group tables: 0/1 done 00:34:27.498 Writing inode tables: 0/1 done 00:34:27.498 Creating journal (1024 blocks): done 00:34:27.498 Writing superblocks and filesystem accounting information: 0/1 done 00:34:27.498 00:34:27.498 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:34:27.498 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:34:27.498 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:27.498 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:34:27.498 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:27.498 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:27.498 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:27.498 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 3693105 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 3693105 ']' 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 3693105 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3693105 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3693105' 00:34:27.758 killing process with pid 3693105 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 3693105 00:34:27.758 00:29:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 3693105 00:34:28.016 00:29:14 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:34:28.016 00:34:28.016 real 0m11.050s 00:34:28.016 user 0m14.455s 00:34:28.016 sys 0m4.643s 00:34:28.016 00:29:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:28.016 
00:29:14 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:34:28.016 ************************************ 00:34:28.016 END TEST bdev_nbd 00:34:28.016 ************************************ 00:34:28.276 00:29:14 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:28.276 00:29:14 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:34:28.276 00:29:14 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:34:28.276 00:29:14 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:34:28.276 00:29:14 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:34:28.276 00:29:14 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:28.276 00:29:14 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:28.276 00:29:14 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:28.276 ************************************ 00:34:28.276 START TEST bdev_fio 00:34:28.276 ************************************ 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:28.276 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 
00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 
00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:28.276 ************************************ 00:34:28.276 START TEST bdev_fio_rw_verify 00:34:28.276 ************************************ 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:28.276 00:29:15 
blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:28.276 00:29:15 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:28.844 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:28.844 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:28.844 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:28.844 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:28.844 fio-3.35 00:34:28.844 Starting 4 threads 00:34:43.715 00:34:43.715 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=3695251: Tue Jul 16 00:29:28 2024 00:34:43.715 read: IOPS=15.3k, BW=59.9MiB/s (62.8MB/s)(599MiB/10001msec) 00:34:43.715 slat (usec): min=17, max=419, avg=89.71, stdev=47.75 00:34:43.715 clat (usec): min=32, max=2359, avg=476.19, stdev=290.72 00:34:43.715 lat (usec): min=71, max=2439, avg=565.90, stdev=313.76 00:34:43.715 clat percentiles (usec): 00:34:43.715 | 50.000th=[ 404], 99.000th=[ 1352], 99.900th=[ 1631], 99.990th=[ 1795], 00:34:43.715 | 99.999th=[ 1926] 00:34:43.715 write: IOPS=16.9k, BW=66.0MiB/s (69.2MB/s)(644MiB/9751msec); 0 zone resets 00:34:43.715 slat (usec): min=26, max=1550, avg=106.50, stdev=48.01 00:34:43.715 clat (usec): min=33, 
max=2249, avg=536.19, stdev=318.98 00:34:43.715 lat (usec): min=89, max=2358, avg=642.68, stdev=341.73 00:34:43.715 clat percentiles (usec): 00:34:43.715 | 50.000th=[ 465], 99.000th=[ 1467], 99.900th=[ 1762], 99.990th=[ 1975], 00:34:43.715 | 99.999th=[ 2180] 00:34:43.715 bw ( KiB/s): min=52512, max=84141, per=98.45%, avg=66563.53, stdev=2349.05, samples=76 00:34:43.715 iops : min=13128, max=21035, avg=16640.79, stdev=587.27, samples=76 00:34:43.715 lat (usec) : 50=0.01%, 100=1.05%, 250=21.25%, 500=34.85%, 750=22.17% 00:34:43.715 lat (usec) : 1000=12.89% 00:34:43.715 lat (msec) : 2=7.77%, 4=0.01% 00:34:43.715 cpu : usr=99.47%, sys=0.00%, ctx=96, majf=0, minf=310 00:34:43.715 IO depths : 1=8.4%, 2=26.2%, 4=52.4%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:43.715 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:43.715 complete : 0=0.0%, 4=88.4%, 8=11.6%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:43.715 issued rwts: total=153310,164821,0,0 short=0,0,0,0 dropped=0,0,0,0 00:34:43.715 latency : target=0, window=0, percentile=100.00%, depth=8 00:34:43.715 00:34:43.715 Run status group 0 (all jobs): 00:34:43.715 READ: bw=59.9MiB/s (62.8MB/s), 59.9MiB/s-59.9MiB/s (62.8MB/s-62.8MB/s), io=599MiB (628MB), run=10001-10001msec 00:34:43.715 WRITE: bw=66.0MiB/s (69.2MB/s), 66.0MiB/s-66.0MiB/s (69.2MB/s-69.2MB/s), io=644MiB (675MB), run=9751-9751msec 00:34:43.715 00:34:43.715 real 0m13.702s 00:34:43.715 user 0m46.030s 00:34:43.715 sys 0m0.532s 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:34:43.715 ************************************ 00:34:43.715 END TEST bdev_fio_rw_verify 00:34:43.715 ************************************ 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:34:43.715 00:29:28 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:34:43.715 00:29:28 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "5e8c22ad-dfcb-54ec-ba61-073a0b06423d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "5e8c22ad-dfcb-54ec-ba61-073a0b06423d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "85dcdbff-b2a1-5d6d-adf1-311414532d0b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "85dcdbff-b2a1-5d6d-adf1-311414532d0b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' 
"nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "35c9e89b-751c-58d0-be56-c23467f77519"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "35c9e89b-751c-58d0-be56-c23467f77519",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": 
"test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "d184b44e-f8ed-5d29-a520-5a48df216945"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "d184b44e-f8ed-5d29-a520-5a48df216945",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:34:43.715 crypto_ram1 00:34:43.715 crypto_ram2 00:34:43.715 crypto_ram3 ]] 00:34:43.715 00:29:28 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:43.716 00:29:28 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "5e8c22ad-dfcb-54ec-ba61-073a0b06423d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "5e8c22ad-dfcb-54ec-ba61-073a0b06423d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "85dcdbff-b2a1-5d6d-adf1-311414532d0b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "85dcdbff-b2a1-5d6d-adf1-311414532d0b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "35c9e89b-751c-58d0-be56-c23467f77519"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "35c9e89b-751c-58d0-be56-c23467f77519",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "d184b44e-f8ed-5d29-a520-5a48df216945"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "d184b44e-f8ed-5d29-a520-5a48df216945",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' 
"write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:34:43.716 00:29:28 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:43.716 00:29:28 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:34:43.716 00:29:28 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:34:43.716 00:29:28 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:43.716 00:29:28 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:34:43.716 00:29:28 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:34:43.716 00:29:28 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:43.716 00:29:28 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:34:43.716 00:29:28 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 
00:34:43.716 00:29:28 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:43.716 00:29:28 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:34:43.716 00:29:28 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:34:43.716 00:29:28 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:43.716 00:29:28 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:34:43.716 00:29:28 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:43.716 00:29:28 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:43.716 ************************************ 00:34:43.716 START TEST bdev_fio_trim 00:34:43.716 ************************************ 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in 
"${sanitizers[@]}" 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:43.716 00:29:29 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:43.716 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:43.716 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:43.716 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:43.716 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:43.716 fio-3.35 00:34:43.716 Starting 4 threads 00:34:55.953 00:34:55.953 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=3697079: Tue Jul 16 00:29:42 2024 
00:34:55.953 write: IOPS=19.9k, BW=77.6MiB/s (81.4MB/s)(776MiB/10001msec); 0 zone resets 00:34:55.953 slat (usec): min=39, max=531, avg=115.66, stdev=36.79 00:34:55.953 clat (usec): min=47, max=2052, avg=425.18, stdev=204.95 00:34:55.953 lat (usec): min=105, max=2230, avg=540.84, stdev=218.86 00:34:55.953 clat percentiles (usec): 00:34:55.953 | 50.000th=[ 400], 99.000th=[ 955], 99.900th=[ 1074], 99.990th=[ 1205], 00:34:55.953 | 99.999th=[ 1319] 00:34:55.953 bw ( KiB/s): min=69312, max=122878, per=100.00%, avg=79909.37, stdev=3508.20, samples=76 00:34:55.953 iops : min=17328, max=30718, avg=19977.21, stdev=876.99, samples=76 00:34:55.953 trim: IOPS=19.9k, BW=77.6MiB/s (81.4MB/s)(776MiB/10001msec); 0 zone resets 00:34:55.953 slat (usec): min=7, max=1514, avg=33.22, stdev=14.80 00:34:55.953 clat (usec): min=31, max=2231, avg=541.16, stdev=218.90 00:34:55.953 lat (usec): min=50, max=2275, avg=574.38, stdev=224.62 00:34:55.953 clat percentiles (usec): 00:34:55.953 | 50.000th=[ 515], 99.000th=[ 1106], 99.900th=[ 1221], 99.990th=[ 1401], 00:34:55.953 | 99.999th=[ 1516] 00:34:55.953 bw ( KiB/s): min=69312, max=122878, per=100.00%, avg=79909.37, stdev=3508.33, samples=76 00:34:55.953 iops : min=17328, max=30718, avg=19977.21, stdev=877.05, samples=76 00:34:55.953 lat (usec) : 50=0.01%, 100=0.34%, 250=14.52%, 500=42.37%, 750=29.68% 00:34:55.953 lat (usec) : 1000=11.34% 00:34:55.953 lat (msec) : 2=1.75%, 4=0.01% 00:34:55.953 cpu : usr=99.39%, sys=0.00%, ctx=86, majf=0, minf=113 00:34:55.953 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:55.953 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:55.953 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:55.953 issued rwts: total=0,198743,198744,0 short=0,0,0,0 dropped=0,0,0,0 00:34:55.953 latency : target=0, window=0, percentile=100.00%, depth=8 00:34:55.953 00:34:55.953 Run status group 0 (all jobs): 00:34:55.953 WRITE: bw=77.6MiB/s 
(81.4MB/s), 77.6MiB/s-77.6MiB/s (81.4MB/s-81.4MB/s), io=776MiB (814MB), run=10001-10001msec 00:34:55.953 TRIM: bw=77.6MiB/s (81.4MB/s), 77.6MiB/s-77.6MiB/s (81.4MB/s-81.4MB/s), io=776MiB (814MB), run=10001-10001msec 00:34:55.953 00:34:55.953 real 0m13.520s 00:34:55.953 user 0m45.861s 00:34:55.953 sys 0m0.511s 00:34:55.953 00:29:42 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:55.953 00:29:42 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:34:55.953 ************************************ 00:34:55.953 END TEST bdev_fio_trim 00:34:55.953 ************************************ 00:34:55.953 00:29:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:34:55.953 00:29:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:34:55.953 00:29:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:55.953 00:29:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:34:55.953 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:55.953 00:29:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:34:55.954 00:34:55.954 real 0m27.578s 00:34:55.954 user 1m32.069s 00:34:55.954 sys 0m1.245s 00:34:55.954 00:29:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:55.954 00:29:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:55.954 ************************************ 00:34:55.954 END TEST bdev_fio 00:34:55.954 ************************************ 00:34:55.954 00:29:42 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:55.954 00:29:42 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:55.954 00:29:42 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:55.954 00:29:42 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:34:55.954 00:29:42 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:55.954 00:29:42 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:55.954 ************************************ 00:34:55.954 START TEST bdev_verify 00:34:55.954 ************************************ 00:34:55.954 00:29:42 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:55.954 [2024-07-16 00:29:42.732746] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:34:55.954 [2024-07-16 00:29:42.732810] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3698414 ] 00:34:55.954 [2024-07-16 00:29:42.852769] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:56.278 [2024-07-16 00:29:42.960441] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:56.278 [2024-07-16 00:29:42.960446] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:56.278 [2024-07-16 00:29:42.981820] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:56.278 [2024-07-16 00:29:42.989848] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:56.278 [2024-07-16 00:29:42.997880] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:56.278 [2024-07-16 00:29:43.103494] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:58.815 [2024-07-16 00:29:45.318484] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:58.815 [2024-07-16 00:29:45.318573] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:58.815 [2024-07-16 00:29:45.318589] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:58.815 [2024-07-16 00:29:45.326505] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:58.815 [2024-07-16 00:29:45.326524] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:58.815 [2024-07-16 00:29:45.326536] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:58.815 
00:34:58.815 [2024-07-16 00:29:45.334529] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:34:58.816 [2024-07-16 00:29:45.334546] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:34:58.816 [2024-07-16 00:29:45.334558] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:58.816 [2024-07-16 00:29:45.342550] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:34:58.816 [2024-07-16 00:29:45.342568] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:34:58.816 [2024-07-16 00:29:45.342579] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:58.816 Running I/O for 5 seconds...
00:35:04.081
00:35:04.081 Latency(us)
00:35:04.081 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:35:04.081 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:35:04.081 Verification LBA range: start 0x0 length 0x1000
00:35:04.081 crypto_ram : 5.07 476.63 1.86 0.00 0.00 267654.84 5271.37 164124.94
00:35:04.081 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:35:04.081 Verification LBA range: start 0x1000 length 0x1000
00:35:04.081 crypto_ram : 5.08 381.19 1.49 0.00 0.00 333844.09 1467.44 204244.37
00:35:04.081 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:35:04.081 Verification LBA range: start 0x0 length 0x1000
00:35:04.081 crypto_ram1 : 5.07 479.59 1.87 0.00 0.00 265569.82 5356.86 154095.08
00:35:04.081 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:35:04.081 Verification LBA range: start 0x1000 length 0x1000
00:35:04.081 crypto_ram1 : 5.08 384.13 1.50 0.00 0.00 330443.02 933.18 185096.46
00:35:04.081 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:35:04.081 Verification LBA range: start 0x0 length 0x1000
00:35:04.081 crypto_ram2 : 5.05 3675.21 14.36 0.00 0.00 34522.73 6753.06 26898.25
00:35:04.081 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:35:04.081 Verification LBA range: start 0x1000 length 0x1000
00:35:04.081 crypto_ram2 : 5.06 2986.96 11.67 0.00 0.00 42390.16 8776.13 31457.28
00:35:04.081 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:35:04.081 Verification LBA range: start 0x0 length 0x1000
00:35:04.081 crypto_ram3 : 5.05 3673.95 14.35 0.00 0.00 34444.70 6040.71 27012.23
00:35:04.081 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:35:04.081 Verification LBA range: start 0x1000 length 0x1000
00:35:04.081 crypto_ram3 : 5.07 3003.90 11.73 0.00 0.00 42060.10 2535.96 31457.28
00:35:04.081 ===================================================================================================================
00:35:04.081 Total : 15061.56 58.83 0.00 0.00 67524.45 933.18 204244.37
00:35:04.081
00:35:04.081 real 0m8.288s
00:35:04.081 user 0m15.687s
00:35:04.081 sys 0m0.390s
00:35:04.081 00:29:50 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:04.081 00:29:50 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:35:04.081 ************************************
00:35:04.081 END TEST bdev_verify
00:35:04.081 ************************************
00:35:04.081 00:29:50 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:35:04.081 00:29:51 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:29:51 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:29:51 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:35:04.081 00:29:51 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:35:04.340 ************************************
00:35:04.340 START TEST bdev_verify_big_io
00:35:04.340 ************************************
00:29:51 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
[2024-07-16 00:29:51.092437] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization...
00:35:04.340 [2024-07-16 00:29:51.092498] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3699478 ]
00:35:04.340 [2024-07-16 00:29:51.211178] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:35:04.599 [2024-07-16 00:29:51.315045] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:35:04.599 [2024-07-16 00:29:51.315052] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:35:04.599 [2024-07-16 00:29:51.336460] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:35:04.599 [2024-07-16 00:29:51.344474] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:35:04.599 [2024-07-16 00:29:51.352502] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:35:04.599 [2024-07-16 00:29:51.471193] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:35:07.149 [2024-07-16 00:29:53.689677] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:35:07.149 [2024-07-16 00:29:53.689757] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:35:07.149 [2024-07-16 00:29:53.689772] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:07.149 [2024-07-16 00:29:53.697697] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:35:07.149 [2024-07-16 00:29:53.697716] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:35:07.149 [2024-07-16 00:29:53.697728] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:07.149 [2024-07-16 00:29:53.705721] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:35:07.149 [2024-07-16 00:29:53.705738] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:35:07.149 [2024-07-16 00:29:53.705750] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:07.149 [2024-07-16 00:29:53.713742] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:35:07.149 [2024-07-16 00:29:53.713759] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:35:07.149 [2024-07-16 00:29:53.713771] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:07.149 Running I/O for 5 seconds...
00:35:07.716 [2024-07-16 00:29:54.642701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:07.716 [2024-07-16 00:29:54.643249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:07.716 [2024-07-16 00:29:54.643350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:07.981 [2024-07-16 00:29:54.741282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.741348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.741399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.741457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.741846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.741910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.741968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.742020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.742358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.742379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.745154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.745211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.745263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:07.981 [2024-07-16 00:29:54.745316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.745899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.745960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.746020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.746073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.746671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.746693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.749353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.749409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.749460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.749512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.749975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.750030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:07.981 [2024-07-16 00:29:54.750087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.750143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.750485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.750506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.754123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.754184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.754236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.754295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.754686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.754748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.754803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.754855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.755198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:07.981 [2024-07-16 00:29:54.755219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.757828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.757884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.757942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.757994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.758563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.758618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.758674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.758726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.759287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.759310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.762073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.762129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:07.981 [2024-07-16 00:29:54.762180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.762231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.762646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.762700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.762752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.762803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.763144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.763165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.766482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.766545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.766604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.766655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.767088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:07.981 [2024-07-16 00:29:54.767142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.767193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.767244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.767584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.767604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.770071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.770148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.770200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.770252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.770852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.770908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.981 [2024-07-16 00:29:54.770965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.771017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:07.982 [2024-07-16 00:29:54.771503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.771524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.774157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.774215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.774275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.774327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.774778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.774832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.774890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.774949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.775294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.775315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.778434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:07.982 [2024-07-16 00:29:54.778491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.778548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.778600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.779035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.779091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.779141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.779200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.779547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.779567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.781949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.782010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.782061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.782113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:07.982 [2024-07-16 00:29:54.782571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.782626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.782681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.782732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.783310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.783332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.786210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.786266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.786317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.786362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.786747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.786804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.786856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:07.982 [2024-07-16 00:29:54.786909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.787256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.787277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.790558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.791324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.793042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.795010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.796520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.798226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.800185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.802029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.802546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.802568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:07.982 [2024-07-16 00:29:54.807414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.809013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.810853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.812845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.814732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.815231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.815724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.816221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.816595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.816616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.820878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.822847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.823358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:07.982 [2024-07-16 00:29:54.823851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.825061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.826792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.828741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.830687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.831226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.831248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.834292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.834785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.836580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.838429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.840470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.982 [2024-07-16 00:29:54.842315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:07.982 [2024-07-16 00:29:54.844211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.983 [2024-07-16 00:29:54.846181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.983 [2024-07-16 00:29:54.846617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.983 [2024-07-16 00:29:54.846639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.983 [2024-07-16 00:29:54.851910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.983 [2024-07-16 00:29:54.853879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.983 [2024-07-16 00:29:54.855042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.983 [2024-07-16 00:29:54.856964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.983 [2024-07-16 00:29:54.859284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.983 [2024-07-16 00:29:54.859785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.983 [2024-07-16 00:29:54.860283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.983 [2024-07-16 00:29:54.860773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:07.983 [2024-07-16 00:29:54.861290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:07.983 [2024-07-16 00:29:54.861311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[same error repeated for every task between 00:29:54.861 and 00:29:55.181; identical entries omitted]
00:35:08.245 [2024-07-16 00:29:55.181664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:08.245 [2024-07-16 00:29:55.184265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.184336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.184389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.184441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.184979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.185047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.185099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.185152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.185204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.185620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.185640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.187674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.187730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.245 [2024-07-16 00:29:55.187791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.187842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.188183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.188254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.188310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.188363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.188414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.188866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.188887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.191687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.191742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.191793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.191846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.245 [2024-07-16 00:29:55.192190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.192261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.192315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.192369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.192430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.192773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.245 [2024-07-16 00:29:55.192794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.194862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.194948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.195001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.195052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.195607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.195671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.506 [2024-07-16 00:29:55.195723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.195776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.195828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.196404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.196426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.198572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.198627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.198686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.198738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.199165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.199238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.199290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.199349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.506 [2024-07-16 00:29:55.199402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.199738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.199759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.202582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.202638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.202690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.202743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.203299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.203372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.203424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.203476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.203527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.203912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.506 [2024-07-16 00:29:55.203938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.206037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.206092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.206144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.206218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.206559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.206631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.206683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.206734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.206785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.207233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.207254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.210176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.506 [2024-07-16 00:29:55.210241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.210297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.210349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.210690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.210761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.210813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.210865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.210919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.211267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.211288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.213351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.213407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.213469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.506 [2024-07-16 00:29:55.213522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.214120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.214191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.214244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.214296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.214349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.214901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.214922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.217118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.217173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.217226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.217284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.217693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.506 [2024-07-16 00:29:55.217761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.217813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.217865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.217931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.218276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.218301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.221025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.506 [2024-07-16 00:29:55.221080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.221133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.221185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.221720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.221786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.221841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.507 [2024-07-16 00:29:55.221893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.221950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.222372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.222392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.224507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.224563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.224619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.224695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.225040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.225112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.225164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.225215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.225266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.507 [2024-07-16 00:29:55.225722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.225742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.228583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.228663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.228715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.228767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.229116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.229191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.229247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.229299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.229355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.229694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.229714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.507 [2024-07-16 00:29:55.231779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.231835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.231887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.231965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.232571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.232637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.232689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.232743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.232797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.233334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.233356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.235590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.235645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.507 [2024-07-16 00:29:55.235695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.235746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.236257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.236329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.236380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.236432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.236482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.236897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.236917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.239435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.239491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.239547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.507 [2024-07-16 00:29:55.239600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.507 [2024-07-16 00:29:55.240137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 repeated through 2024-07-16 00:29:55.374489; duplicates elided ...]
00:35:08.510 [2024-07-16 00:29:55.374988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.375486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.376079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.376590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.377087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.377586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.378090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.378636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.378658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.382187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.382683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.383183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.383682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.510 [2024-07-16 00:29:55.384282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.384789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.385285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.385786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.386290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.386840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.386862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.390320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.390816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.391316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.391813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.392409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.392932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.510 [2024-07-16 00:29:55.393425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.395065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.395836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.396268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.510 [2024-07-16 00:29:55.396290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.399799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.400311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.400805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.401407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.401785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.403745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.405701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.406862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.511 [2024-07-16 00:29:55.408595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.408943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.408964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.411994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.413715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.415681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.417644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.418167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.419906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.421867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.423828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.424334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.424888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.511 [2024-07-16 00:29:55.424910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.429296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.430469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.432192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.434151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.434497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.435034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.435528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.436038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.437396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.437806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.437826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.441678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.511 [2024-07-16 00:29:55.443459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.443959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.444457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.445046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.446266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.447991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.449945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.451624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.452002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.511 [2024-07-16 00:29:55.452023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.454735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.455243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.457076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.771 [2024-07-16 00:29:55.458955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.459298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.460879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.462718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.464685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.466647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.467106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.467127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.471801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.473667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.475176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.477011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.477355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.771 [2024-07-16 00:29:55.479348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.480555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.481056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.481548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.482106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.482129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.485873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.487826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.489782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.490289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.490829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.491347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.491895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.771 [2024-07-16 00:29:55.493625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.495581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.495924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.495950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.498718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.499221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.499714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.500851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.501237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.503237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.505019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.506169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.507883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.771 [2024-07-16 00:29:55.508237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.508258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.511588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.513330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.515290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.517252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.517710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.519438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.521398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.523259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.523761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.524312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.524337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.771 [2024-07-16 00:29:55.528905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.530166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.531885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.533809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.534159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.534671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.535172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.535671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.537035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.537454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.537475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.541420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.543184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.771 [2024-07-16 00:29:55.543672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.544169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.544727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.546135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.547861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.549741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.551242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.551587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.551608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.554334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.554834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.556645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.558495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.771 [2024-07-16 00:29:55.558844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.560475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.562315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.564294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.566257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.566752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.566773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.571426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.573365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.574804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.576648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.576998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.771 [2024-07-16 00:29:55.578972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.771 [2024-07-16 00:29:55.580281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:09.035 [... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated continuously, timestamps 2024-07-16 00:29:55.580281 through 00:29:55.758329 ...]
00:35:09.035 [2024-07-16 00:29:55.758822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.035 [2024-07-16 00:29:55.758902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.035 [2024-07-16 00:29:55.758989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.035 [2024-07-16 00:29:55.759079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.035 [2024-07-16 00:29:55.759135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.035 [2024-07-16 00:29:55.759617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.035 [2024-07-16 00:29:55.759638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.035 [2024-07-16 00:29:55.762502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.035 [2024-07-16 00:29:55.762558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.035 [2024-07-16 00:29:55.762610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.035 [2024-07-16 00:29:55.762661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.035 [2024-07-16 00:29:55.763199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.035 [2024-07-16 00:29:55.763263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.035 [2024-07-16 00:29:55.763316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.035 [2024-07-16 00:29:55.763371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.035 [2024-07-16 00:29:55.763424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.035 [2024-07-16 00:29:55.763969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.035 [2024-07-16 00:29:55.763991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.035 [2024-07-16 00:29:55.766864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.035 [2024-07-16 00:29:55.766921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.035 [2024-07-16 00:29:55.766985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.767038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.767581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.767655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.767708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.767761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.036 [2024-07-16 00:29:55.767814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.768365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.768386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.771384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.771443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.771500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.771552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.772000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.772075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.772142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.772194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.772273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.772787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.036 [2024-07-16 00:29:55.772808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.775671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.775744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.775804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.775857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.776434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.776502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.776556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.776607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.776663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.777161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.777186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.779984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.036 [2024-07-16 00:29:55.780044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.780097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.780148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.780636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.780709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.780764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.780819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.780871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.781416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.781437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.784343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.784401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.784453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.036 [2024-07-16 00:29:55.784505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.785003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.785077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.785132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.785186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.785252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.785742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.785762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.788779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.788841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.788909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.788981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.789529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.036 [2024-07-16 00:29:55.789600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.789653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.789704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.789762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.790314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.790336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.793420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.793478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.793531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.793584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.794115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.794197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.794252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.036 [2024-07-16 00:29:55.794304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.794355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.794896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.794919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.797802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.797859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.797911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.797970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.798507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.798599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.798653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.798721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.798786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.036 [2024-07-16 00:29:55.799307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.799330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.802439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.802509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.802562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.802615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.803062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.803143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.803212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.803266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.036 [2024-07-16 00:29:55.803318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.803893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.803915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.037 [2024-07-16 00:29:55.806792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.806848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.806905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.806965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.807508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.807571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.807625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.807692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.807759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.808355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.808381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.811218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.811276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.037 [2024-07-16 00:29:55.811328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.811379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.811912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.811986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.812039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.812091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.812155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.812683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.812705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.815730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.815800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.815853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.815935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.037 [2024-07-16 00:29:55.816457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.816537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.816593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.816645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.816698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.817265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.817287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.820114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.820171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.820678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.820737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.821209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.821282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.037 [2024-07-16 00:29:55.821337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.821391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.821447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.821994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.822019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.824875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.824941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.824994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.825489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.825981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.826058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.826111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.037 [2024-07-16 00:29:55.826183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.037 [2024-07-16 00:29:55.826247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:09.300 [2024-07-16 00:29:56.174906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:09.300 (previous message repeated for every entry between 00:29:55.826247 and 00:29:56.174906; identical duplicate log lines omitted)
00:35:09.300 [2024-07-16 00:29:56.175410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.175908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.176412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.176855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.176876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.180666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.182669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.184515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.185021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.185573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.186087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.186590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.187095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.300 [2024-07-16 00:29:56.187613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.188125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.188147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.191545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.192050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.192551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.193059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.193592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.194106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.194607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.195112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.195609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.196154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.300 [2024-07-16 00:29:56.196176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.199800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.200314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.200810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.201312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.201922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.202434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.202935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.203431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.203942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.204518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.204542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.207930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.300 [2024-07-16 00:29:56.208432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.208935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.209436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.209986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.210489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.211016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.211514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.212018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.212558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.212580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.215907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.216417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.216912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.300 [2024-07-16 00:29:56.217416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.217876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.218391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.218882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.219384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.219884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.220434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.220456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.223806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.224312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.224390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.224879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.225332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.300 [2024-07-16 00:29:56.225842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.226369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.226872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.227371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.227915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.227943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.231117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.231620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.232127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.232187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.232530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.233069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.234046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.300 [2024-07-16 00:29:56.235119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.235622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.236109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.236131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.238699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.238756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.238808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.238861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.239412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.239475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.239527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.239591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.300 [2024-07-16 00:29:56.239647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.300 [2024-07-16 00:29:56.240196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.301 [2024-07-16 00:29:56.240217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.301 [2024-07-16 00:29:56.242695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.301 [2024-07-16 00:29:56.242752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.301 [2024-07-16 00:29:56.242803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.301 [2024-07-16 00:29:56.242855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.301 [2024-07-16 00:29:56.243408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.301 [2024-07-16 00:29:56.243474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.301 [2024-07-16 00:29:56.243527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.301 [2024-07-16 00:29:56.243580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.301 [2024-07-16 00:29:56.243643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.301 [2024-07-16 00:29:56.244155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.301 [2024-07-16 00:29:56.244177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.301 [2024-07-16 00:29:56.246759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.301 [2024-07-16 00:29:56.246815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.301 [2024-07-16 00:29:56.246867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.301 [2024-07-16 00:29:56.246919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.247469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.247531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.247585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.247650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.247703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.248214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.248236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.250753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.250809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.562 [2024-07-16 00:29:56.250862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.250914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.251442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.251505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.251558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.251610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.251674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.252220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.252243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.254815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.254872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.254935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.254988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.562 [2024-07-16 00:29:56.255524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.255591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.255645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.255698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.255761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.256281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.256302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.258846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.258913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.258971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.259023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.259554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.259617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.562 [2024-07-16 00:29:56.259670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.259722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.259777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.260252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.260274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.262874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.262944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.262997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.263048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.263584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.263647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.263701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.562 [2024-07-16 00:29:56.263754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.563 [2024-07-16 00:29:56.263834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.563 [2024-07-16 00:29:56.264327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.563 [2024-07-16 00:29:56.264348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.563 [2024-07-16 00:29:56.266810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.563 [2024-07-16 00:29:56.266878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.563 [2024-07-16 00:29:56.266951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.563 [2024-07-16 00:29:56.267003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.563 [2024-07-16 00:29:56.267559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.563 [2024-07-16 00:29:56.267621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.563 [2024-07-16 00:29:56.267674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.563 [2024-07-16 00:29:56.267726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.563 [2024-07-16 00:29:56.267779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.563 [2024-07-16 00:29:56.268270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.563 [2024-07-16 00:29:56.268292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:09.565 [2024-07-16 00:29:56.377457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:09.565 [2024-07-16 00:29:56.378329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.565 [2024-07-16 00:29:56.378896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.565 [2024-07-16 00:29:56.378917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.565 [2024-07-16 00:29:56.383389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.565 [2024-07-16 00:29:56.384577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.565 [2024-07-16 00:29:56.386312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.565 [2024-07-16 00:29:56.388266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.565 [2024-07-16 00:29:56.388610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.565 [2024-07-16 00:29:56.389499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.565 [2024-07-16 00:29:56.389999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.565 [2024-07-16 00:29:56.390492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.565 [2024-07-16 00:29:56.391216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.565 [2024-07-16 00:29:56.391626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.565 [2024-07-16 00:29:56.391647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.395754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.397713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.398223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.398712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.399262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.400005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.401727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.403694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.405653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.406133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.406153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.408946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.566 [2024-07-16 00:29:56.409439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.411064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.412880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.413231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.415061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.416866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.418709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.420666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.421073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.421095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.425741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.427746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.429099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.566 [2024-07-16 00:29:56.430825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.431174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.433160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.434282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.434769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.435266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.435806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.435828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.439662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.441625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.443593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.444098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.444651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.566 [2024-07-16 00:29:56.445157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.446158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.447872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.449837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.450187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.450208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.452808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.453311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.453803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.455640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.455988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.457854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.459353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.566 [2024-07-16 00:29:56.461190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.463148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.463490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.463510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.467536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.469273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.471159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.472763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.473115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.474847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.476800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.478232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.478754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.566 [2024-07-16 00:29:56.479249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.479271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.482863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.484600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.486569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.488539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.489006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.489513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.490011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.490659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.492373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.492717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.492737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.566 [2024-07-16 00:29:56.496796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.497311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.497804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.566 [2024-07-16 00:29:56.498297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.567 [2024-07-16 00:29:56.498729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.567 [2024-07-16 00:29:56.500467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.567 [2024-07-16 00:29:56.502414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.567 [2024-07-16 00:29:56.504256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.567 [2024-07-16 00:29:56.505911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.567 [2024-07-16 00:29:56.506336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.567 [2024-07-16 00:29:56.506357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.567 [2024-07-16 00:29:56.509468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.511171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.828 [2024-07-16 00:29:56.513013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.514968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.515381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.517237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.519114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.521116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.521999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.522559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.522580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.527071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.528258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.529992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.531957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.828 [2024-07-16 00:29:56.532302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.532981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.533474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.533971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.535170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.535547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.535568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.539473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.540999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.541493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.541989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.542543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.544222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.828 [2024-07-16 00:29:56.546063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.548022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.549264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.549607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.549628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.552548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.553093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.554816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.556777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.557125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.558320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.560050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.562004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.828 [2024-07-16 00:29:56.563965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.564482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.564503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.569031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.828 [2024-07-16 00:29:56.570213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.571933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.573756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.574293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.576035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.576529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.578225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.578725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.579186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.829 [2024-07-16 00:29:56.579207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.583051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.584994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.586408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.586917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.587402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.587907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.589670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.591511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.593490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.593909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.593938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.829 [2024-07-16 00:29:56.596522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.829 [2024-07-16 00:29:56.597026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:10.093 [... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev_task_alloc_resources (accel_dpdk_cryptodev.c:468) repeats continuously from 00:29:56.597519 through 00:29:56.784215 ...]
00:35:10.093 [2024-07-16 00:29:56.784236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.786872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.786935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.786988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.787041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.787521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.787589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.787640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.787697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.787748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.788123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.788145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.790215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.093 [2024-07-16 00:29:56.790271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.790322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.790373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.790707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.790778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.790830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.790881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.790946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.791514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.791534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.794258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.794314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.794369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.093 [2024-07-16 00:29:56.794429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.794766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.794854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.794908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.794964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.795016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.795507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.795528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.797735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.797791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.797843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.797896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.798439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.093 [2024-07-16 00:29:56.798516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.798568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.798621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.798673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.799235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.799257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.801416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.801476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.801528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.801580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.801971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.802042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.802094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.093 [2024-07-16 00:29:56.802145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.802196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.802533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.802554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.805422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.805479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.805531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.805583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.805920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.806000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.806055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.806112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.806164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.093 [2024-07-16 00:29:56.806507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.806528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.808697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.808752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.808821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.808876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.809239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.809310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.809363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.809415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.809469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.810065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.810087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.093 [2024-07-16 00:29:56.814980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.815041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.815093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.815143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.815516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.815592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.815644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.815695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.815762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.816106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.816127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.820735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.093 [2024-07-16 00:29:56.820797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.094 [2024-07-16 00:29:56.820848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.820899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.821240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.821310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.821362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.821415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.821474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.821882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.821902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.826668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.826729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.826783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.826838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.094 [2024-07-16 00:29:56.827198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.827290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.827344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.827395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.827446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.827785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.827805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.834238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.834300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.834357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.834410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.834920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.834990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.094 [2024-07-16 00:29:56.835043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.835096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.835148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.835699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.835721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.840624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.840687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.840754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.840806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.841191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.841268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.841323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.841396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.094 [2024-07-16 00:29:56.841455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.842050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.842073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.846961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.847026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.847076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.847128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.847506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.847576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.847628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.847689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.847745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.848094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.094 [2024-07-16 00:29:56.848128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.852636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.852700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.852751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.852802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.853146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.853219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.853271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.853329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.853380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.853795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.853817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.858609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.094 [2024-07-16 00:29:56.858673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.858745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.858801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.859192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.859264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.859322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.859374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.859426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.859763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.859784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.866213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.866275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.094 [2024-07-16 00:29:56.866329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.094 [2024-07-16 00:29:56.866381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:10.094 [... last message repeated continuously through 00:29:57.158671 ...]
00:35:10.357 [2024-07-16 00:29:57.158671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:10.357 [2024-07-16 00:29:57.158686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.163748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.165368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.166982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.168514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.170432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.171308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.172640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.174395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.174670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.174686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.174701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.174715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.357 [2024-07-16 00:29:57.177058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.177454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.177848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.179335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.181449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.183164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.184693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.185432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.185708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.185725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.185740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.185754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.190761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.357 [2024-07-16 00:29:57.191164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.191555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.192013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.193487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.195026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.196576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.198336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.198783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.198800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.198814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.198829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.202096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.202501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.357 [2024-07-16 00:29:57.202890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.203293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.204115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.204715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.206330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.207916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.208197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.208214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.208228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.208242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.214335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.215704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.216104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.357 [2024-07-16 00:29:57.216494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.217340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.217729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.218973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.220191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.220467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.220484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.220499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.220513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.223368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.225024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.226779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.228438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.357 [2024-07-16 00:29:57.229242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.229630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.230029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.230419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.230882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.230900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.230915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.230937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.237665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.239025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.240555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.242073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.242737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.357 [2024-07-16 00:29:57.243134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.243524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.243910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.244359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.244378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.357 [2024-07-16 00:29:57.244393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.244408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.247545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.248961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.250047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.251264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.253071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.254601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.358 [2024-07-16 00:29:57.255802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.256197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.256649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.256666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.256682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.256697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.262756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.264536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.265009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.266757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.268590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.270129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.271880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.358 [2024-07-16 00:29:57.272415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.272691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.272709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.272723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.272738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.275313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.275705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.277157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.278376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.280172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.280842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.282606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.283842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.358 [2024-07-16 00:29:57.284130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.284148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.284162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.284177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.288583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.289338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.290791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.292554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.294051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.295168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.296386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.298063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.298342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.358 [2024-07-16 00:29:57.298358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.298373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.298387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.300704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.301106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.301498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.301890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.302704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.303104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.303491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.303879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.304325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.304343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.358 [2024-07-16 00:29:57.304359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.358 [2024-07-16 00:29:57.304374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.619 [2024-07-16 00:29:57.307949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.619 [2024-07-16 00:29:57.308349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.619 [2024-07-16 00:29:57.308750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.619 [2024-07-16 00:29:57.309155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.619 [2024-07-16 00:29:57.309964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.619 [2024-07-16 00:29:57.310356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.619 [2024-07-16 00:29:57.310744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.619 [2024-07-16 00:29:57.311140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.619 [2024-07-16 00:29:57.311484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.619 [2024-07-16 00:29:57.311501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.619 [2024-07-16 00:29:57.311516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.619 [2024-07-16 00:29:57.311531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.619 [2024-07-16 00:29:57.314292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.619 [2024-07-16 00:29:57.314686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.619 [2024-07-16 00:29:57.315091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.619 [2024-07-16 00:29:57.315488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.620 [2024-07-16 00:29:57.316374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.620 [2024-07-16 00:29:57.316764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.620 [2024-07-16 00:29:57.316812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.620 [2024-07-16 00:29:57.317208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.620 [2024-07-16 00:29:57.317634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.620 [2024-07-16 00:29:57.317652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.620 [2024-07-16 00:29:57.317667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.620 [2024-07-16 00:29:57.317683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.620 [2024-07-16 00:29:57.321166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:10.623 [last message repeated ~270 times between 2024-07-16 00:29:57.321166 and 00:29:57.422199; identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 elided]
00:35:10.623 [2024-07-16 00:29:57.422215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.422229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.422244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.422257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.426450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.426500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.426541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.426582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.427027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.427081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.427123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.427171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.427212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.623 [2024-07-16 00:29:57.427566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.427582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.427596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.427610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.427624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.429172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.429218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.429258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.429298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.429618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.429678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.429722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.429763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.623 [2024-07-16 00:29:57.429803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.430081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.430098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.430112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.430127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.430140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.435019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.435071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.435115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.435159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.435601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.435656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.435697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.623 [2024-07-16 00:29:57.435755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.435798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.436303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.436320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.436336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.436351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.436366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.438244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.438293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.438337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.438377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.438647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.438703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.623 [2024-07-16 00:29:57.438750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.438794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.438834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.439111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.439127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.439142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.439156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.439171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.443190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.443242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.443283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.443326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.443730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.623 [2024-07-16 00:29:57.443783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.443824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.443869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.443910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.444372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.444391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.444406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.444426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.444443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.446578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.446631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.446672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.446739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.623 [2024-07-16 00:29:57.447014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.447066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.447106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.447147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.623 [2024-07-16 00:29:57.447196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.447467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.447483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.447498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.447512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.447526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.451964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.452016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.452066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.624 [2024-07-16 00:29:57.452111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.452384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.452434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.452475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.452525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.452568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.452969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.452987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.453002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.453016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.453031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.455839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.455886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.624 [2024-07-16 00:29:57.455939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.455994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.456265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.456316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.456357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.456401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.456449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.456742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.456759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.456774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.456788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.456802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.460558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.624 [2024-07-16 00:29:57.460608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.460648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.460688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.460964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.461024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.461065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.461106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.461146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.461411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.461427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.461441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.461456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.461470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.624 [2024-07-16 00:29:57.463754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.463799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.463839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.463883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.464322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.464375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.464418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.464461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.464502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.464855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.464871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.464885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.464900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.624 [2024-07-16 00:29:57.464915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.469336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.469386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.469426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.469466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.469792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.469851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.469892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.469939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.469980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.624 [2024-07-16 00:29:57.470255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.625 [2024-07-16 00:29:57.470271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.625 [2024-07-16 00:29:57.470286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.625 [2024-07-16 00:29:57.470300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.625 [2024-07-16 00:29:57.470314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.625 [2024-07-16 00:29:57.472311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.625 [2024-07-16 00:29:57.472370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.625 [2024-07-16 00:29:57.473248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.625 [2024-07-16 00:29:57.473297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.625 [2024-07-16 00:29:57.473339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.625 [2024-07-16 00:29:57.473791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.625 [2024-07-16 00:29:57.473808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.625 [2024-07-16 00:29:57.473824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.625 [2024-07-16 00:29:57.473839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.625 [2024-07-16 00:29:57.478656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.625 [2024-07-16 00:29:57.480069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.625 [2024-07-16 00:29:57.480117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [last message repeated through 2024-07-16 00:29:57.781760] 
00:35:10.887 [2024-07-16 00:29:57.782969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.784491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.786038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.786314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.786331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.786345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.786360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.786373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.790580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.790989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.791381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.791787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.792988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.887 [2024-07-16 00:29:57.794428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.796181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.797701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.797982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.797998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.798013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.798027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.798041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.803647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.804050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.804440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.804828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.805706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.887 [2024-07-16 00:29:57.806574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.808116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.809686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.809970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.809986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.810001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.810015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.810029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.814123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.814522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.814914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.815308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.816306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.887 [2024-07-16 00:29:57.817969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.818360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.820045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.820394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.820410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.820424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.820439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.820453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.825062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.825467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.825860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.826256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.827109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.887 [2024-07-16 00:29:57.827498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.827889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.828288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.828683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.828700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.828715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.828733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.828747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.832210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.832614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.833009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.833057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.833833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.887 [2024-07-16 00:29:57.833879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.887 [2024-07-16 00:29:57.834277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.148 [2024-07-16 00:29:57.834673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.148 [2024-07-16 00:29:57.835080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.148 [2024-07-16 00:29:57.835097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.148 [2024-07-16 00:29:57.835112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.148 [2024-07-16 00:29:57.835127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.148 [2024-07-16 00:29:57.835141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.148 [2024-07-16 00:29:57.838643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.148 [2024-07-16 00:29:57.839047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.148 [2024-07-16 00:29:57.839440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.148 [2024-07-16 00:29:57.839829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.840731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.149 [2024-07-16 00:29:57.840781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.841177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.841571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.841967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.841984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.841999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.842013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.842028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.845456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.845855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.846254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.846643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.847152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.149 [2024-07-16 00:29:57.847549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.847951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.848001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.848392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.848409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.848424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.848439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.848453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.851866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.852271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.852661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.853054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.853897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.149 [2024-07-16 00:29:57.854318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.854368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.854758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.855186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.855203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.855218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.855233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.855249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.859667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.860075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.861828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.861867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.863108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.149 [2024-07-16 00:29:57.863158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.863547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.863946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.864332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.864350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.864365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.864382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.864398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.867444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.867841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.867886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.868280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.868766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.149 [2024-07-16 00:29:57.869173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.869568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.869615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.870035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.870053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.870068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.870083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.870098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.873104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.873504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.873548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.873945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.874710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.149 [2024-07-16 00:29:57.875107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.875153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.875548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.875886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.875903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.875918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.875947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.875961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.879128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.879528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.879577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.879972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.880797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.149 [2024-07-16 00:29:57.880849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.881247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.881635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.882002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.882021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.882035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.882050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.882065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.149 [2024-07-16 00:29:57.885573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.150 [2024-07-16 00:29:57.885979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.150 [2024-07-16 00:29:57.886029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.150 [2024-07-16 00:29:57.886422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.150 [2024-07-16 00:29:57.886857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.150 [2024-07-16 00:29:57.887254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
...
00:35:11.152 [2024-07-16 00:29:57.994206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:11.152 [2024-07-16 00:29:57.994250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.152 [2024-07-16 00:29:57.994290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.152 [2024-07-16 00:29:57.994332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.152 [2024-07-16 00:29:57.994605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.152 [2024-07-16 00:29:57.994622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.152 [2024-07-16 00:29:57.994636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.152 [2024-07-16 00:29:57.994650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.152 [2024-07-16 00:29:57.994664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.152 [2024-07-16 00:29:57.998952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.152 [2024-07-16 00:29:57.999005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:57.999045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:57.999086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:57.999393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.153 [2024-07-16 00:29:57.999448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:57.999490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:57.999538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:57.999810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:57.999826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:57.999840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:57.999854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:57.999868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.004490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.004546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.004587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.004627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.004935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.153 [2024-07-16 00:29:58.004986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.005032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.005072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.005342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.005358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.005373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.005387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.005401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.008343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.008397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.008787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.008830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.009293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.153 [2024-07-16 00:29:58.011039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.011088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.011130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.011406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.011423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.011441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.011455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.011469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.015860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.015919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.015971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.016012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.016322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.153 [2024-07-16 00:29:58.017909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.017961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.018001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.018445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.018462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.018476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.018492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.018506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.021454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.021504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.021545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.021586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.023495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.153 [2024-07-16 00:29:58.023541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.023582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.025321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.025745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.025762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.025777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.025792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.025806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.030762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.030817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.030858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.030891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.031224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.153 [2024-07-16 00:29:58.031269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.031658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.031702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.032156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.032175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.032191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.032205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.032219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.037411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.037491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.153 [2024-07-16 00:29:58.037535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.039243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.040993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.154 [2024-07-16 00:29:58.041487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.041566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.043319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.043369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.043411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.043687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.043704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.043718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.043733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.043747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.047713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.048116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.048161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.154 [2024-07-16 00:29:58.048549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.048828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.050056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.050103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.050143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.051651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.051924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.051946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.051960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.051975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.051990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.056980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.058525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.154 [2024-07-16 00:29:58.058570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.059531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.059808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.059869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.059911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.060307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.060351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.060807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.060828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.060843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.060858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.060873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.065458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.154 [2024-07-16 00:29:58.066456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.066505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.067529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.067914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.067973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.068366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.068411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.068454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.068878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.068895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.068910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.068931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.068948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.154 [2024-07-16 00:29:58.072603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.074365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.074420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.075273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.075548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.077200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.077246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.077287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.079012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.079289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.079306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.079320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.079335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.154 [2024-07-16 00:29:58.079350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.082384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.082868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.082914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.083308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.083645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.083699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.083740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.084942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.084989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.085263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.085280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.154 [2024-07-16 00:29:58.085294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.154 [2024-07-16 00:29:58.085309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:11.416 [previous *ERROR* line repeated, timestamps 00:29:58.085323 through 00:29:58.336200]
00:35:11.416 [2024-07-16 00:29:58.336649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.337044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.337432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.337819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.338286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.338303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.338319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.338334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.338349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.341333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.342456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.343677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.345348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.416 [2024-07-16 00:29:58.345621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.347164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.348275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.349512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.350495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.350966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.350983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.350998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.351014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.351029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.353910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.355350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.357102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.416 [2024-07-16 00:29:58.358632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.358982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.360175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.361392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.416 [2024-07-16 00:29:58.362977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.364713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.364996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.365013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.365027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.365042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.365056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.367754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.367803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.677 [2024-07-16 00:29:58.368203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.368613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.369025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.370058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.370108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.371125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.371515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.371974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.371992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.372007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.372021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.372038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.374576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.677 [2024-07-16 00:29:58.374982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.375372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.375761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.376173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.376572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.376620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.377016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.377426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.377815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.377832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.377852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.377867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.377881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.677 [2024-07-16 00:29:58.381638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.382040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.382430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.382822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.383241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.383306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.383711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.384107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.384153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.384607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.384624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.384639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.384655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.677 [2024-07-16 00:29:58.384671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.387440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.389198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.389627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.390021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.390379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.391981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.392376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.392421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.677 [2024-07-16 00:29:58.392807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.393162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.393179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.393195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.678 [2024-07-16 00:29:58.393213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.393227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.396040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.396433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.396479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.396521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.396870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.397284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.397343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.397905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.399563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.400108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.400127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.678 [2024-07-16 00:29:58.400142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.400158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.400175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.402699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.402759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.403170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.403219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.403621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.403684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.404079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.404470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.404514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.404911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.678 [2024-07-16 00:29:58.404933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.404948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.404962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.404976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.408143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.408204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.409293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.409337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.409792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.410196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.411826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.411873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.412309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.678 [2024-07-16 00:29:58.412753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.412771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.412787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.412802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.412816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.415553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.415604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.415998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.416043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.416507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.416909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.416976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.418723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.678 [2024-07-16 00:29:58.419148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.419425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.419442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.419456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.419471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.419485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.421805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.421871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.423622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.423668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.424165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.424224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.424614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.678 [2024-07-16 00:29:58.425739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.425785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.426126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.426143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.426157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.426172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.426186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.429052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.429102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.430624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.430668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.430945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.678 [2024-07-16 00:29:58.431351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.678 [2024-07-16 00:29:58.431741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:11.680 [identical "Failed to get src_mbufs!" error lines repeated from 00:29:58.431785 through 00:29:58.507160; duplicates elided]
00:35:11.680 [2024-07-16 00:29:58.507175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.509490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.509536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.509584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.509654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.510028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.510094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.510137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.510178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.510226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.510501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.510517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.510532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.680 [2024-07-16 00:29:58.510546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.510560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.512638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.512690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.512731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.512772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.513256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.513309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.513351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.513393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.513435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.513870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.513888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.680 [2024-07-16 00:29:58.513902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.513916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.513938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.516189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.516237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.516278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.516319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.516700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.516750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.516791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.516833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.516874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.517325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.680 [2024-07-16 00:29:58.517344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.517360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.517375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.517390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.519373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.520581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.520630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.520674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.521045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.521103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.521493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.521537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.521578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.680 [2024-07-16 00:29:58.521964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.521982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.521996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.522011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.522025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.524150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.524195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.524235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.524275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.524546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.524607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.526131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.526176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.680 [2024-07-16 00:29:58.526217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.526673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.526691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.526705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.526720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.526733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.528571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.528618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.528659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.528701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.529151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.530231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.530282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.680 [2024-07-16 00:29:58.530324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.531128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.531572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.531590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.531606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.531621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.531636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.533319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.533363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.533404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.533444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.533717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.533775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.680 [2024-07-16 00:29:58.533816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.535066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.535109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.535429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.535445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.535459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.535474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.535489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.537058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.537103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.537980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.539429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.539765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.680 [2024-07-16 00:29:58.539826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.540223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.540268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.540313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.540721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.540738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.680 [2024-07-16 00:29:58.540752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.540766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.540780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.542874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.544097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.544145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.545677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.681 [2024-07-16 00:29:58.545961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.547505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.547554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.547595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.548077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.548354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.548370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.548385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.548399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.548413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.549994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.550455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.550501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.681 [2024-07-16 00:29:58.552249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.552726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.552785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.552828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.553222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.553266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.553562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.553578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.553596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.553611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.553625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.555673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.556896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.681 [2024-07-16 00:29:58.556950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.558471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.558744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.558805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.560485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.560529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.560571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.560944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.560961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.560976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.560990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.561004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.562548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.681 [2024-07-16 00:29:58.563863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.563908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.565087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.565364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.565770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.565817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.565862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.566258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.566530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.566547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.566561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.566576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.681 [2024-07-16 00:29:58.566593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.681 [2024-07-16 00:29:58.568702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [... identical "Failed to get src_mbufs!" error repeated through 2024-07-16 00:29:58.776092 ...]
00:35:11.943 [2024-07-16 00:29:58.777731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.779493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.779768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.779785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.779799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.779814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.779828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.783279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.784826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.785468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.787141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.787519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.787935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.943 [2024-07-16 00:29:58.788359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.790111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.790508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.790974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.790995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.791011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.791026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.791040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.794127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.795308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.796578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.797791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.798074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.943 [2024-07-16 00:29:58.799632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.801150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.802081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.803460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.803776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.803794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.803812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.803827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.803841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.806318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.807161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.808528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.810119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.943 [2024-07-16 00:29:58.810394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.811947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.813471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.814385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.815667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.815949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.815967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.815982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.815997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.816011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.819208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.819259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.819649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.943 [2024-07-16 00:29:58.820044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.820317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.821385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.821431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.821819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.822532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.822807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.822825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.822839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.822854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.822868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.824867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.826505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.943 [2024-07-16 00:29:58.828106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.829632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.829909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.831539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.831586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.832458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.833800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.943 [2024-07-16 00:29:58.834303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.834322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.834337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.834353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.834368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.836845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.944 [2024-07-16 00:29:58.838605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.840053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.841587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.841865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.841939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.843473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.844398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.844446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.844734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.844751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.844766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.844782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.844796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.944 [2024-07-16 00:29:58.846660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.847063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.847458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.849213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.849649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.850062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.850781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.850828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.852137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.852418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.852435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.852449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.852464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.944 [2024-07-16 00:29:58.852478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.855513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.857281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.857330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.857381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.857655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.859198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.859245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.860089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.860476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.860935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.860956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.860971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.944 [2024-07-16 00:29:58.860987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.861002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.864395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.864448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.865661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.865708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.865985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.866052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.867798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.869545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.869588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.870089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.870107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.944 [2024-07-16 00:29:58.870121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.870136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.870150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.873415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.873462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.873857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.873904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.874404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.874806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.876115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.876161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.876906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.877349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.944 [2024-07-16 00:29:58.877367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.877385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.877401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.877416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.880565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.880615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.882240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.882284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.882706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.884362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.884406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.886170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.887685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.944 [2024-07-16 00:29:58.887968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.887985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.887999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.888014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.944 [2024-07-16 00:29:58.888028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.208 [2024-07-16 00:29:58.891950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.208 [2024-07-16 00:29:58.892004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.208 [2024-07-16 00:29:58.892399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.208 [2024-07-16 00:29:58.892443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.208 [2024-07-16 00:29:58.892875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.208 [2024-07-16 00:29:58.892938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.208 [2024-07-16 00:29:58.894128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.208 [2024-07-16 00:29:58.895351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.208 [2024-07-16 00:29:58.895400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:12.210 (last message repeated continuously through [2024-07-16 00:29:58.974514]; identical repeats omitted)
00:35:12.210 [2024-07-16 00:29:58.974560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.974602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.974643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.975056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.975112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.975154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.975198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.975241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.975682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.975699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.975714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.975730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.975745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.210 [2024-07-16 00:29:58.977834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.977891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.977936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.977993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.978459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.978539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.978594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.978636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.978690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.979164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.979181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.979195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.979209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.210 [2024-07-16 00:29:58.979223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.981584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.981629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.981673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.981715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.982168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.982220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.982263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.982305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.982346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.982713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.982730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.982745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.210 [2024-07-16 00:29:58.982759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.982774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.985218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.985266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.985306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.985347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.985619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.985678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.985722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.985767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.985807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.986307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.986325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.210 [2024-07-16 00:29:58.986340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.986355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.986372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.988515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.988909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.988959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.989001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.989341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.989395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.990789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.990832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.990873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.991344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.210 [2024-07-16 00:29:58.991365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.991380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.991396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.991413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.993545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.993589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.993630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.993670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.994104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.994159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.995274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.995320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.995360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.210 [2024-07-16 00:29:58.995703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.995723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.995738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.995752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.995767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.998078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.998124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.998165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.998210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.998658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.999071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.999116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:58.999159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.210 [2024-07-16 00:29:59.000827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.001236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.001253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.001269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.001284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.001298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.003655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.003700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.003757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.003803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.004329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.004385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.004428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.210 [2024-07-16 00:29:59.004818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.004863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.005147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.005164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.005178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.005193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.005211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.007524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.007579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.009323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.011010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.011412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.011476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.210 [2024-07-16 00:29:59.012714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.012762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.012802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.013082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.013099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.013113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.013128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.210 [2024-07-16 00:29:59.013142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.015050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.015681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.015726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.017151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.017688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.211 [2024-07-16 00:29:59.018093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.018138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.018181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.019634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.020002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.020019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.020033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.020048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.020062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.021767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.022988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.023034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.024551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.211 [2024-07-16 00:29:59.024827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.024889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.024936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.025326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.025372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.025849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.025866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.025881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.025896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.025911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.028044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.028445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.028494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.211 [2024-07-16 00:29:59.028884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.029281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.029340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.029729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.029773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.029815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.030110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.030127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.030141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.030156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.030170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.032376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.211 [2024-07-16 00:29:59.032782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.211 [2024-07-16 00:29:59.032831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical *ERROR* line repeated continuously from 00:29:59.032831 through 00:29:59.211653; duplicate entries omitted ...]
00:35:12.524 [2024-07-16 00:29:59.211653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:12.524 [2024-07-16 00:29:59.211945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.524 [2024-07-16 00:29:59.211962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.524 [2024-07-16 00:29:59.211977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.525 [2024-07-16 00:29:59.211994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.525 [2024-07-16 00:29:59.212012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.525 [2024-07-16 00:29:59.213854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.525 [2024-07-16 00:29:59.214256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.525 [2024-07-16 00:29:59.214660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.525 [2024-07-16 00:29:59.215057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.525 [2024-07-16 00:29:59.215334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.525 [2024-07-16 00:29:59.216551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.525 [2024-07-16 00:29:59.218073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.525 [2024-07-16 00:29:59.219729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.525 [2024-07-16 00:29:59.221486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.525 [2024-07-16 00:29:59.221975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.525 [2024-07-16 00:29:59.221992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.525 [2024-07-16 00:29:59.222006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.525 [2024-07-16 00:29:59.222021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.525 [2024-07-16 00:29:59.222035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.525 [2024-07-16 00:29:59.226332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.525 [2024-07-16 00:29:59.226647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.525 [2024-07-16 00:29:59.226664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:13.089
00:35:13.089 Latency(us)
00:35:13.089 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:35:13.089 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:13.089 Verification LBA range: start 0x0 length 0x100
00:35:13.089 crypto_ram : 6.07 42.17 2.64 0.00 0.00 2950501.95 320955.44 2392576.89
00:35:13.089 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:13.089 Verification LBA range: start 0x100 length 0x100
00:35:13.089 crypto_ram : 6.00 32.99 2.06 0.00 0.00 3553248.52 99386.77 3005310.00
00:35:13.089 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:13.089 Verification LBA range: start 0x0 length 0x100
00:35:13.089 crypto_ram1 : 6.07 42.16 2.64 0.00 0.00 2854320.75 320955.44 2202921.41
00:35:13.089 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:13.089 Verification LBA range: start 0x100 length 0x100
00:35:13.089 crypto_ram1 : 6.07 37.55 2.35 0.00 0.00 3097211.36 108504.82 2757298.98
00:35:13.089 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:13.089 Verification LBA range: start 0x0 length 0x100
00:35:13.089 crypto_ram2 : 5.60 266.06 16.63 0.00 0.00 431548.23 60179.14 568966.46
00:35:13.089 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:13.089 Verification LBA range: start 0x100 length 0x100
00:35:13.089 crypto_ram2 : 5.68 216.50 13.53 0.00 0.00 518526.25 34192.70 638263.65
00:35:13.089 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:13.089 Verification LBA range: start 0x0 length 0x100
00:35:13.089 crypto_ram3 : 5.68 273.85 17.12 0.00 0.00 406777.09 12708.29 455902.61
00:35:13.089 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:13.089 Verification LBA range: start 0x100 length 0x100
00:35:13.089 crypto_ram3 : 5.84 233.41 14.59 0.00 0.00 468655.51 18236.10 620027.55
00:35:13.089 ===================================================================================================================
00:35:13.089 Total : 1144.68 71.54 0.00 0.00 828437.06 12708.29 3005310.00
00:35:13.347
00:35:13.347 real 0m9.258s
00:35:13.347 user 0m17.546s
00:35:13.347 sys 0m0.474s
00:35:13.347 00:30:00 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:13.347 00:30:00 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:35:13.347 ************************************
00:35:13.347 END TEST bdev_verify_big_io
00:35:13.347 ************************************
00:35:13.606 00:30:00 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:35:13.606 00:30:00 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:13.606 00:30:00 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:35:13.606 00:30:00 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:35:13.606 00:30:00 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:35:13.606 ************************************
00:35:13.606 START TEST bdev_write_zeroes
00:35:13.606 ************************************
00:35:13.606 00:30:00 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:13.606 [2024-07-16 00:30:00.488947] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization...
00:35:13.606 [2024-07-16 00:30:00.489076] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3700738 ]
00:35:13.865 [2024-07-16 00:30:00.687589] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:35:13.865 [2024-07-16 00:30:00.791209] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:35:13.865 [2024-07-16 00:30:00.812514] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:35:14.123 [2024-07-16 00:30:00.820542] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:35:14.123 [2024-07-16 00:30:00.828560] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:35:14.123 [2024-07-16 00:30:00.935136] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:35:16.655 [2024-07-16 00:30:03.157481] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:35:16.655 [2024-07-16 00:30:03.157545] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:35:16.655 [2024-07-16 00:30:03.157561] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:16.655 [2024-07-16 00:30:03.165499] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:35:16.655 [2024-07-16 00:30:03.165518] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:35:16.655 [2024-07-16 00:30:03.165530] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:16.655 [2024-07-16 00:30:03.173520] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:35:16.655 [2024-07-16 00:30:03.173537] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:35:16.655 [2024-07-16 00:30:03.173553] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:16.655 [2024-07-16 00:30:03.181540] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:35:16.655 [2024-07-16 00:30:03.181557] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:35:16.655 [2024-07-16 00:30:03.181569] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:16.655 Running I/O for 1 seconds...
00:35:17.590
00:35:17.590 Latency(us)
00:35:17.590 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:35:17.590 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:17.590 crypto_ram : 1.02 2006.56 7.84 0.00 0.00 63303.62 5670.29 76591.64
00:35:17.590 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:17.590 crypto_ram1 : 1.03 2019.68 7.89 0.00 0.00 62586.23 5613.30 70664.90
00:35:17.590 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:17.590 crypto_ram2 : 1.02 15448.23 60.34 0.00 0.00 8156.73 2464.72 10713.71
00:35:17.590 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:17.590 crypto_ram3 : 1.02 15480.28 60.47 0.00 0.00 8113.74 2450.48 8491.19
00:35:17.590 ===================================================================================================================
00:35:17.590 Total : 34954.75 136.54 0.00 0.00 14476.25 2450.48 76591.64
00:35:17.849
00:35:17.849 real 0m4.345s
00:35:17.849 user 0m3.844s
00:35:17.849 sys 0m0.454s
00:35:17.849 00:30:04 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:17.849 00:30:04
blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:35:17.849 ************************************
00:35:17.849 END TEST bdev_write_zeroes
00:35:17.849 ************************************
00:35:17.849 00:30:04 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:35:17.849 00:30:04 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:17.849 00:30:04 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:35:17.849 00:30:04 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:35:17.849 00:30:04 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:35:18.108 ************************************
00:35:18.108 START TEST bdev_json_nonenclosed
00:35:18.108 ************************************
00:35:18.108 00:30:04 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:18.108 [2024-07-16 00:30:04.868595] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization...
00:35:18.108 [2024-07-16 00:30:04.868660] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3701502 ]
00:35:18.108 [2024-07-16 00:30:04.995504] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:35:18.367 [2024-07-16 00:30:05.103890] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:35:18.367 [2024-07-16 00:30:05.103965] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:35:18.367 [2024-07-16 00:30:05.103987] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:35:18.367 [2024-07-16 00:30:05.104000] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:35:18.367
00:35:18.367 real 0m0.403s
00:35:18.367 user 0m0.241s
00:35:18.367 sys 0m0.159s
00:35:18.367 00:30:05 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:35:18.367 00:30:05 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:18.367 00:30:05 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:35:18.367 ************************************
00:35:18.367 END TEST bdev_json_nonenclosed
00:35:18.367 ************************************
00:35:18.367 00:30:05 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234
00:35:18.367 00:30:05 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # true
00:35:18.367 00:30:05 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:18.367 00:30:05 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:35:18.367 00:30:05 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:35:18.367 00:30:05 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:35:18.367 ************************************
00:35:18.367 START TEST bdev_json_nonarray
00:35:18.367 ************************************
00:35:18.367 00:30:05 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:18.626 [2024-07-16 00:30:05.345011] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization...
00:35:18.626 [2024-07-16 00:30:05.345074] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3701604 ]
00:35:18.626 [2024-07-16 00:30:05.463937] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:35:18.626 [2024-07-16 00:30:05.564970] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:35:18.626 [2024-07-16 00:30:05.565049] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:35:18.626 [2024-07-16 00:30:05.565071] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:35:18.626 [2024-07-16 00:30:05.565085] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:35:18.885
00:35:18.885 real 0m0.375s
00:35:18.885 user 0m0.222s
00:35:18.885 sys 0m0.150s
00:35:18.885 00:30:05 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234
00:35:18.885 00:30:05 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:18.885 00:30:05 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:35:18.885 ************************************
00:35:18.885 END TEST bdev_json_nonarray
00:35:18.885 ************************************
00:35:18.885 00:30:05 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234
00:35:18.885 00:30:05 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # true
00:35:18.885 00:30:05 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]]
00:35:18.885 00:30:05 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]]
00:35:18.885 00:30:05 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]]
00:35:18.885 00:30:05 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT
00:35:18.885 00:30:05 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup
00:35:18.885 00:30:05 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile
00:35:18.885 00:30:05 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:35:18.885 00:30:05 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]]
00:35:18.885 00:30:05 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]]
00:35:18.885 00:30:05 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]]
00:35:18.885 00:30:05 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]]
00:35:18.885
00:35:18.885 real 1m13.907s
00:35:18.885 user 2m42.716s
00:35:18.885 sys 0m9.857s
00:35:18.885 00:30:05 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:18.885 00:30:05 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:35:18.885 ************************************
00:35:18.885 END TEST blockdev_crypto_qat
00:35:18.885 ************************************
00:35:18.885 00:30:05 -- common/autotest_common.sh@1142 -- # return 0
00:35:18.885 00:30:05 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh
00:35:18.885 00:30:05 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:35:18.885 00:30:05 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:35:18.885 00:30:05 -- common/autotest_common.sh@10 -- # set +x
00:35:18.885 ************************************
00:35:18.885 START TEST chaining
00:35:18.885 ************************************
00:35:18.885 00:30:05 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh
00:35:19.145 * Looking for test storage...
00:35:19.145 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:35:19.145 00:30:05 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@7 -- # uname -s 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:35:19.145 00:30:05 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:35:19.145 00:30:05 chaining -- scripts/common.sh@516 -- # 
[[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:35:19.145 00:30:05 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:35:19.145 00:30:05 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:19.145 00:30:05 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:19.145 00:30:05 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:19.145 00:30:05 chaining -- paths/export.sh@5 -- # export PATH 00:35:19.145 00:30:05 chaining -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@47 -- # : 0 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:35:19.145 00:30:05 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:35:19.145 00:30:05 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:35:19.145 00:30:05 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:35:19.145 00:30:05 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:35:19.145 00:30:05 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:35:19.145 00:30:05 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@410 -- # local 
-g is_hw=no 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:19.145 00:30:05 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:19.145 00:30:05 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:35:19.145 00:30:05 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:35:19.145 00:30:05 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@296 -- # e810=() 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@297 -- # x722=() 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@298 -- # mlx=() 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@302 -- # 
e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:35:27.266 00:30:13 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@336 -- # return 1 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:35:27.267 WARNING: No supported devices were found, fallback requested for tcp test 00:35:27.267 00:30:13 chaining -- 
nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:35:27.267 Cannot find device "nvmf_tgt_br" 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@155 -- # true 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:35:27.267 Cannot find device "nvmf_tgt_br2" 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@156 -- # true 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:35:27.267 Cannot find device "nvmf_tgt_br" 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@158 -- # 
true 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:35:27.267 Cannot find device "nvmf_tgt_br2" 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@159 -- # true 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:35:27.267 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@162 -- # true 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:35:27.267 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@163 -- # true 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:35:27.267 00:30:13 chaining -- nvmf/common.sh@183 -- # ip link set 
nvmf_init_if up 00:35:27.267 00:30:14 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:35:27.267 00:30:14 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:35:27.267 00:30:14 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:35:27.267 00:30:14 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:35:27.267 00:30:14 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:35:27.267 00:30:14 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:35:27.267 00:30:14 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:35:27.267 00:30:14 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:35:27.267 00:30:14 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:35:27.267 00:30:14 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:35:27.527 00:30:14 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:35:27.527 00:30:14 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:35:27.527 00:30:14 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:35:27.527 00:30:14 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:35:27.527 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:35:27.527 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.113 ms 00:35:27.527 00:35:27.527 --- 10.0.0.2 ping statistics --- 00:35:27.527 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:27.527 rtt min/avg/max/mdev = 0.113/0.113/0.113/0.000 ms 00:35:27.527 00:30:14 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:35:27.527 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 
00:35:27.527 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.079 ms 00:35:27.527 00:35:27.527 --- 10.0.0.3 ping statistics --- 00:35:27.527 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:27.527 rtt min/avg/max/mdev = 0.079/0.079/0.079/0.000 ms 00:35:27.527 00:30:14 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:35:27.527 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:35:27.527 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.040 ms 00:35:27.527 00:35:27.527 --- 10.0.0.1 ping statistics --- 00:35:27.527 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:27.527 rtt min/avg/max/mdev = 0.040/0.040/0.040/0.000 ms 00:35:27.527 00:30:14 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:35:27.527 00:30:14 chaining -- nvmf/common.sh@433 -- # return 0 00:35:27.527 00:30:14 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:35:27.527 00:30:14 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:35:27.527 00:30:14 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:35:27.527 00:30:14 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:35:27.527 00:30:14 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:35:27.527 00:30:14 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:35:27.527 00:30:14 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:35:27.787 00:30:14 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:35:27.787 00:30:14 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:35:27.787 00:30:14 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:35:27.787 00:30:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:27.787 00:30:14 chaining -- nvmf/common.sh@481 -- # nvmfpid=3705634 00:35:27.787 00:30:14 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 
-e 0xFFFF -m 0x2 00:35:27.787 00:30:14 chaining -- nvmf/common.sh@482 -- # waitforlisten 3705634 00:35:27.787 00:30:14 chaining -- common/autotest_common.sh@829 -- # '[' -z 3705634 ']' 00:35:27.787 00:30:14 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:27.787 00:30:14 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:27.787 00:30:14 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:27.787 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:27.787 00:30:14 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:27.787 00:30:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:27.787 [2024-07-16 00:30:14.566262] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:35:27.787 [2024-07-16 00:30:14.566335] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:35:27.787 [2024-07-16 00:30:14.711401] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:28.046 [2024-07-16 00:30:14.848456] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:35:28.046 [2024-07-16 00:30:14.848521] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:35:28.046 [2024-07-16 00:30:14.848540] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:35:28.046 [2024-07-16 00:30:14.848557] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:35:28.046 [2024-07-16 00:30:14.848571] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:35:28.046 [2024-07-16 00:30:14.848619] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:28.615 00:30:15 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:28.615 00:30:15 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:28.615 00:30:15 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:35:28.615 00:30:15 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:35:28.615 00:30:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.615 00:30:15 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:35:28.615 00:30:15 chaining -- bdev/chaining.sh@69 -- # mktemp 00:35:28.615 00:30:15 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.nfxx6uVQ2T 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@69 -- # mktemp 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.y1V4JoyVVr 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:35:28.872 00:30:15 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:28.872 00:30:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.872 malloc0 00:35:28.872 true 00:35:28.872 true 00:35:28.872 [2024-07-16 00:30:15.612324] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:28.872 crypto0 00:35:28.872 [2024-07-16 00:30:15.620357] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:35:28.872 crypto1 00:35:28.872 [2024-07-16 00:30:15.628504] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:35:28.872 [2024-07-16 00:30:15.644854] tcp.c: 981:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:35:28.872 00:30:15 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@85 -- # 
update_stats 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:28.872 00:30:15 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:28.872 00:30:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.872 00:30:15 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:28.872 00:30:15 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:28.872 00:30:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.872 00:30:15 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:28.872 00:30:15 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:35:28.872 00:30:15 chaining -- 
bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:28.873 00:30:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:28.873 00:30:15 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:28.873 00:30:15 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:28.873 00:30:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:28.873 00:30:15 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:28.873 00:30:15 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:28.873 00:30:15 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:28.873 00:30:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.873 00:30:15 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:28.873 00:30:15 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:28.873 00:30:15 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:35:28.873 00:30:15 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:28.873 00:30:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:28.873 00:30:15 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:28.873 00:30:15 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:28.873 00:30:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:28.873 00:30:15 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:28.873 00:30:15 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:28.873 00:30:15 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:28.873 00:30:15 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:28.873 00:30:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:29.129 00:30:15 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:29.129 00:30:15 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:29.129 00:30:15 chaining -- bdev/chaining.sh@88 -- # 
dd if=/dev/urandom of=/tmp/tmp.nfxx6uVQ2T bs=1K count=64 00:35:29.129 64+0 records in 00:35:29.129 64+0 records out 00:35:29.129 65536 bytes (66 kB, 64 KiB) copied, 0.0010582 s, 61.9 MB/s 00:35:29.129 00:30:15 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.nfxx6uVQ2T --ob Nvme0n1 --bs 65536 --count 1 00:35:29.129 00:30:15 chaining -- bdev/chaining.sh@25 -- # local config 00:35:29.129 00:30:15 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:29.129 00:30:15 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:29.129 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:29.129 00:30:15 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:29.129 "subsystems": [ 00:35:29.129 { 00:35:29.129 "subsystem": "bdev", 00:35:29.129 "config": [ 00:35:29.129 { 00:35:29.129 "method": "bdev_nvme_attach_controller", 00:35:29.129 "params": { 00:35:29.129 "trtype": "tcp", 00:35:29.129 "adrfam": "IPv4", 00:35:29.129 "name": "Nvme0", 00:35:29.129 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:29.129 "traddr": "10.0.0.2", 00:35:29.129 "trsvcid": "4420" 00:35:29.129 } 00:35:29.129 }, 00:35:29.129 { 00:35:29.129 "method": "bdev_set_options", 00:35:29.129 "params": { 00:35:29.129 "bdev_auto_examine": false 00:35:29.129 } 00:35:29.129 } 00:35:29.129 ] 00:35:29.129 } 00:35:29.129 ] 00:35:29.129 }' 00:35:29.129 00:30:15 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.nfxx6uVQ2T --ob Nvme0n1 --bs 65536 --count 1 00:35:29.129 00:30:15 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:29.129 "subsystems": [ 00:35:29.129 { 00:35:29.129 "subsystem": "bdev", 00:35:29.130 "config": [ 00:35:29.130 { 00:35:29.130 "method": "bdev_nvme_attach_controller", 00:35:29.130 "params": { 
00:35:29.130 "trtype": "tcp", 00:35:29.130 "adrfam": "IPv4", 00:35:29.130 "name": "Nvme0", 00:35:29.130 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:29.130 "traddr": "10.0.0.2", 00:35:29.130 "trsvcid": "4420" 00:35:29.130 } 00:35:29.130 }, 00:35:29.130 { 00:35:29.130 "method": "bdev_set_options", 00:35:29.130 "params": { 00:35:29.130 "bdev_auto_examine": false 00:35:29.130 } 00:35:29.130 } 00:35:29.130 ] 00:35:29.130 } 00:35:29.130 ] 00:35:29.130 }' 00:35:29.130 [2024-07-16 00:30:15.964904] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:35:29.130 [2024-07-16 00:30:15.964982] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3705858 ] 00:35:29.388 [2024-07-16 00:30:16.098977] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:29.388 [2024-07-16 00:30:16.200659] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:29.905  Copying: 64/64 [kB] (average 20 MBps) 00:35:29.905 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:29.905 00:30:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:29.905 00:30:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:29.905 00:30:16 chaining -- common/autotest_common.sh@587 -- 
# [[ 0 == 0 ]] 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:29.905 00:30:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:29.905 00:30:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:29.905 00:30:16 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:29.905 00:30:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:29.905 00:30:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:29.905 00:30:16 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:29.905 00:30:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:29.905 00:30:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:29.905 00:30:16 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@96 -- # update_stats 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:29.905 00:30:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:29.905 00:30:16 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:29.905 00:30:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:29.905 
00:30:16 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:30.165 00:30:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:30.165 00:30:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:30.165 00:30:16 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:30.165 00:30:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:30.165 00:30:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:30.165 00:30:16 
chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:30.165 00:30:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:30.165 00:30:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:30.165 00:30:16 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.y1V4JoyVVr --ib Nvme0n1 --bs 65536 --count 1 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@25 -- # local config 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:30.165 00:30:16 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:30.165 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:30.165 00:30:17 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:30.165 "subsystems": [ 00:35:30.165 { 00:35:30.165 "subsystem": "bdev", 00:35:30.165 "config": [ 00:35:30.165 { 00:35:30.165 "method": "bdev_nvme_attach_controller", 00:35:30.165 
"params": { 00:35:30.165 "trtype": "tcp", 00:35:30.165 "adrfam": "IPv4", 00:35:30.165 "name": "Nvme0", 00:35:30.165 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:30.165 "traddr": "10.0.0.2", 00:35:30.165 "trsvcid": "4420" 00:35:30.165 } 00:35:30.165 }, 00:35:30.165 { 00:35:30.165 "method": "bdev_set_options", 00:35:30.165 "params": { 00:35:30.165 "bdev_auto_examine": false 00:35:30.165 } 00:35:30.165 } 00:35:30.165 ] 00:35:30.165 } 00:35:30.165 ] 00:35:30.165 }' 00:35:30.165 00:30:17 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.y1V4JoyVVr --ib Nvme0n1 --bs 65536 --count 1 00:35:30.165 00:30:17 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:30.165 "subsystems": [ 00:35:30.165 { 00:35:30.165 "subsystem": "bdev", 00:35:30.165 "config": [ 00:35:30.165 { 00:35:30.165 "method": "bdev_nvme_attach_controller", 00:35:30.165 "params": { 00:35:30.165 "trtype": "tcp", 00:35:30.165 "adrfam": "IPv4", 00:35:30.165 "name": "Nvme0", 00:35:30.165 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:30.165 "traddr": "10.0.0.2", 00:35:30.165 "trsvcid": "4420" 00:35:30.165 } 00:35:30.165 }, 00:35:30.165 { 00:35:30.165 "method": "bdev_set_options", 00:35:30.165 "params": { 00:35:30.165 "bdev_auto_examine": false 00:35:30.165 } 00:35:30.165 } 00:35:30.165 ] 00:35:30.165 } 00:35:30.165 ] 00:35:30.165 }' 00:35:30.165 [2024-07-16 00:30:17.104854] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:35:30.165 [2024-07-16 00:30:17.104937] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3705994 ] 00:35:30.424 [2024-07-16 00:30:17.235924] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:30.424 [2024-07-16 00:30:17.336489] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:30.941  Copying: 64/64 [kB] (average 10 MBps) 00:35:30.941 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:30.941 00:30:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:30.941 00:30:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:30.941 00:30:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@40 -- # [[ -z 
encrypt ]] 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:30.941 00:30:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:30.941 00:30:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:30.941 00:30:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:30.941 00:30:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:30.941 00:30:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:30.941 00:30:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:30.941 00:30:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:31.200 00:30:17 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:35:31.200 00:30:17 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:35:31.200 00:30:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:31.200 00:30:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:31.200 00:30:17 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:31.200 00:30:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:31.200 00:30:17 chaining -- bdev/chaining.sh@40 -- # 
[[ -z copy ]] 00:35:31.200 00:30:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:31.200 00:30:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:31.200 00:30:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:31.200 00:30:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:31.200 00:30:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:31.200 00:30:17 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:35:31.200 00:30:17 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.nfxx6uVQ2T /tmp/tmp.y1V4JoyVVr 00:35:31.200 00:30:17 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:35:31.200 00:30:17 chaining -- bdev/chaining.sh@25 -- # local config 00:35:31.200 00:30:17 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:31.200 00:30:17 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:31.200 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:31.200 00:30:17 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:31.200 "subsystems": [ 00:35:31.200 { 00:35:31.200 "subsystem": "bdev", 00:35:31.200 "config": [ 00:35:31.200 { 00:35:31.200 "method": "bdev_nvme_attach_controller", 00:35:31.200 "params": { 00:35:31.200 "trtype": "tcp", 00:35:31.200 "adrfam": "IPv4", 00:35:31.200 "name": "Nvme0", 00:35:31.200 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:31.200 "traddr": "10.0.0.2", 00:35:31.200 "trsvcid": "4420" 00:35:31.200 } 00:35:31.200 }, 00:35:31.200 { 00:35:31.200 "method": "bdev_set_options", 00:35:31.200 "params": { 00:35:31.200 "bdev_auto_examine": false 00:35:31.200 } 00:35:31.200 } 00:35:31.200 ] 00:35:31.200 } 00:35:31.200 ] 00:35:31.200 }' 00:35:31.200 
00:30:17 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:35:31.200 00:30:17 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:31.200 "subsystems": [ 00:35:31.200 { 00:35:31.200 "subsystem": "bdev", 00:35:31.200 "config": [ 00:35:31.200 { 00:35:31.200 "method": "bdev_nvme_attach_controller", 00:35:31.200 "params": { 00:35:31.200 "trtype": "tcp", 00:35:31.200 "adrfam": "IPv4", 00:35:31.200 "name": "Nvme0", 00:35:31.200 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:31.200 "traddr": "10.0.0.2", 00:35:31.200 "trsvcid": "4420" 00:35:31.200 } 00:35:31.200 }, 00:35:31.200 { 00:35:31.200 "method": "bdev_set_options", 00:35:31.200 "params": { 00:35:31.200 "bdev_auto_examine": false 00:35:31.200 } 00:35:31.200 } 00:35:31.200 ] 00:35:31.200 } 00:35:31.200 ] 00:35:31.200 }' 00:35:31.200 [2024-07-16 00:30:18.057034] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
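The config handoff above — `gen_nvme.sh ... | jq ... | spdk_dd -c /dev/fd/62` — avoids a temp file by piping the JSON through a file descriptor. A minimal sketch of the mechanism, with a placeholder config string and a stand-in `read_config` function instead of the real `gen_nvme.sh`/`spdk_dd` binaries:

```shell
#!/usr/bin/env bash
# Sketch (assumptions: the real script's spdk_dd accepts -c <config-file>;
# gen_nvme.sh emits the bdev subsystem JSON shown in the log). Process
# substitution <(...) exposes the string as /dev/fd/NN, which the consumer
# reads exactly like an on-disk config file -- no temp file to clean up.
config='{"subsystems":[]}'      # placeholder for: gen_nvme.sh ... | jq ...
read_config() { cat "$1"; }     # stand-in for: spdk_dd -c "$1" --if ... --ob ...
read_config <(echo "$config")   # consumer sees a path like /dev/fd/63
```

The `jq` filter in the log (`.subsystems[0].config[.subsystems[0].config | length] |= {...}`) appends the `bdev_set_options` entry by update-assigning the element one past the end of the config array, which is how `bdev_auto_examine: false` gets injected after the generated `bdev_nvme_attach_controller` entry.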
00:35:31.200 [2024-07-16 00:30:18.057099] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3706105 ] 00:35:31.461 [2024-07-16 00:30:18.186401] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:31.461 [2024-07-16 00:30:18.282934] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:31.977  Copying: 64/64 [kB] (average 12 MBps) 00:35:31.977 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@106 -- # update_stats 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:31.977 00:30:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:31.977 00:30:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:31.977 00:30:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:31.977 00:30:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:31.977 00:30:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:31.977 00:30:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:31.977 00:30:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:31.977 00:30:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:31.977 00:30:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:31.977 
00:30:18 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:31.977 00:30:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:31.977 00:30:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:31.977 00:30:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.nfxx6uVQ2T --ob Nvme0n1 --bs 4096 --count 16 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@25 -- # local config 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:31.977 00:30:18 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:31.977 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:32.235 00:30:18 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:32.235 "subsystems": [ 00:35:32.235 { 00:35:32.235 "subsystem": "bdev", 00:35:32.235 "config": [ 00:35:32.235 { 00:35:32.235 "method": "bdev_nvme_attach_controller", 00:35:32.235 "params": { 00:35:32.235 "trtype": "tcp", 00:35:32.235 "adrfam": "IPv4", 00:35:32.235 "name": "Nvme0", 00:35:32.235 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:32.235 "traddr": "10.0.0.2", 00:35:32.235 "trsvcid": "4420" 00:35:32.235 } 00:35:32.235 }, 00:35:32.235 { 00:35:32.235 "method": "bdev_set_options", 00:35:32.235 "params": { 00:35:32.235 "bdev_auto_examine": false 00:35:32.235 } 00:35:32.235 } 00:35:32.235 ] 00:35:32.235 } 00:35:32.235 ] 00:35:32.235 }' 00:35:32.235 00:30:18 chaining -- bdev/chaining.sh@33 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.nfxx6uVQ2T --ob Nvme0n1 --bs 4096 --count 16 00:35:32.235 00:30:18 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:32.235 "subsystems": [ 00:35:32.235 { 00:35:32.235 "subsystem": "bdev", 00:35:32.235 "config": [ 00:35:32.235 { 00:35:32.235 "method": "bdev_nvme_attach_controller", 00:35:32.235 "params": { 00:35:32.235 "trtype": "tcp", 00:35:32.235 "adrfam": "IPv4", 00:35:32.235 "name": "Nvme0", 00:35:32.235 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:32.235 "traddr": "10.0.0.2", 00:35:32.235 "trsvcid": "4420" 00:35:32.235 } 00:35:32.235 }, 00:35:32.235 { 00:35:32.235 "method": "bdev_set_options", 00:35:32.235 "params": { 00:35:32.235 "bdev_auto_examine": false 00:35:32.235 } 00:35:32.235 } 00:35:32.235 ] 00:35:32.235 } 00:35:32.235 ] 00:35:32.235 }' 00:35:32.235 [2024-07-16 00:30:19.025674] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:35:32.235 [2024-07-16 00:30:19.025744] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3706298 ] 00:35:32.235 [2024-07-16 00:30:19.156468] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:32.493 [2024-07-16 00:30:19.261064] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:32.749  Copying: 64/64 [kB] (average 10 MBps) 00:35:32.749 00:35:32.749 00:30:19 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:35:32.749 00:30:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:32.749 00:30:19 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:32.749 00:30:19 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:32.749 00:30:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:32.749 00:30:19 chaining -- bdev/chaining.sh@40 -- # 
[[ -z '' ]] 00:35:32.749 00:30:19 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:32.749 00:30:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:32.749 00:30:19 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:32.749 00:30:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:33.007 00:30:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:33.007 00:30:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:33.007 00:30:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:33.007 00:30:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 
00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:33.007 00:30:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:33.007 00:30:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:33.007 00:30:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:33.007 00:30:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:33.007 00:30:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:33.007 00:30:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@114 -- # update_stats 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
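The `update_stats`/`get_stat` calls above cache a baseline of each accel counter in an associative array, then assert exact deltas after each I/O pass (e.g. `(( 36 == stats[encrypt_executed] + 32 ))`). A minimal sketch of that bookkeeping pattern, with a hypothetical `fake_rpc` replacing the real `rpc_cmd accel_get_stats | jq` pipeline:

```shell
#!/usr/bin/env bash
# Sketch of the stat-delta checks (assumption: accel_get_stats reports a
# per-opcode "executed" counter; fake_rpc stands in for the RPC + jq query).
declare -A stats
fake_rpc() { echo "$1"; }                    # stand-in for rpc_cmd ... | jq -r ...
stats[encrypt_executed]=$(fake_rpc 4)        # baseline snapshot (update_stats)
later=$(fake_rpc 36)                         # reading after the next I/O pass
(( later == stats[encrypt_executed] + 32 )) && echo delta-ok
```

Checking exact deltas rather than absolute values is what lets the test tolerate whatever operations earlier test steps already accumulated on the shared accel layer.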
00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:33.007 00:30:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:33.007 00:30:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:33.007 00:30:19 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:33.007 00:30:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:33.267 00:30:19 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:35:33.267 00:30:19 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:33.267 00:30:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:33.267 00:30:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:33.267 00:30:19 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:33.267 00:30:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:33.267 00:30:19 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:33.267 00:30:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:33.267 00:30:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:33.267 00:30:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:33.267 00:30:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:33.267 00:30:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:33.267 00:30:20 chaining -- 
bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:33.267 00:30:20 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:33.267 00:30:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:33.267 00:30:20 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:33.267 00:30:20 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:33.267 00:30:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:33.267 00:30:20 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@117 -- # : 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.y1V4JoyVVr --ib Nvme0n1 --bs 4096 --count 16 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@25 -- # local config 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems 
--trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:33.267 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:33.267 00:30:20 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:33.267 "subsystems": [ 00:35:33.267 { 00:35:33.267 "subsystem": "bdev", 00:35:33.267 "config": [ 00:35:33.267 { 00:35:33.267 "method": "bdev_nvme_attach_controller", 00:35:33.268 "params": { 00:35:33.268 "trtype": "tcp", 00:35:33.268 "adrfam": "IPv4", 00:35:33.268 "name": "Nvme0", 00:35:33.268 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:33.268 "traddr": "10.0.0.2", 00:35:33.268 "trsvcid": "4420" 00:35:33.268 } 00:35:33.268 }, 00:35:33.268 { 00:35:33.268 "method": "bdev_set_options", 00:35:33.268 "params": { 00:35:33.268 "bdev_auto_examine": false 00:35:33.268 } 00:35:33.268 } 00:35:33.268 ] 00:35:33.268 } 00:35:33.268 ] 00:35:33.268 }' 00:35:33.268 00:30:20 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.y1V4JoyVVr --ib Nvme0n1 --bs 4096 --count 16 00:35:33.268 00:30:20 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:33.268 "subsystems": [ 00:35:33.268 { 00:35:33.268 "subsystem": "bdev", 00:35:33.268 "config": [ 00:35:33.268 { 00:35:33.268 "method": "bdev_nvme_attach_controller", 00:35:33.268 "params": { 00:35:33.268 "trtype": "tcp", 00:35:33.268 "adrfam": "IPv4", 00:35:33.268 "name": "Nvme0", 00:35:33.268 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:33.268 "traddr": "10.0.0.2", 00:35:33.268 "trsvcid": "4420" 00:35:33.268 } 00:35:33.268 }, 00:35:33.268 { 00:35:33.268 "method": "bdev_set_options", 00:35:33.268 "params": { 00:35:33.268 "bdev_auto_examine": false 00:35:33.268 } 00:35:33.268 } 00:35:33.268 ] 00:35:33.268 } 00:35:33.268 ] 00:35:33.268 }' 00:35:33.555 [2024-07-16 00:30:20.233949] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 
initialization... 00:35:33.555 [2024-07-16 00:30:20.234016] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3706506 ] 00:35:33.555 [2024-07-16 00:30:20.364579] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:33.555 [2024-07-16 00:30:20.461555] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:34.071  Copying: 64/64 [kB] (average 1361 kBps) 00:35:34.071 00:35:34.071 00:30:20 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:35:34.071 00:30:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:34.071 00:30:20 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:34.071 00:30:20 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:34.071 00:30:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:34.071 00:30:20 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:34.071 00:30:20 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:34.071 00:30:20 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:34.071 00:30:20 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:34.071 00:30:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:34.071 00:30:20 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:34.071 00:30:20 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:35:34.071 00:30:20 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:35:34.071 00:30:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:34.071 00:30:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:34.071 00:30:20 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:34.071 00:30:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:34.071 00:30:20 chaining -- 
bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:34.071 00:30:20 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:34.071 00:30:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:34.071 00:30:20 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:34.071 00:30:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:34.071 00:30:21 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:34.329 00:30:21 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:34.329 00:30:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:34.329 00:30:21 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:34.329 00:30:21 
chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:34.329 00:30:21 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:34.329 00:30:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:34.329 00:30:21 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.nfxx6uVQ2T /tmp/tmp.y1V4JoyVVr 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.nfxx6uVQ2T /tmp/tmp.y1V4JoyVVr 00:35:34.329 00:30:21 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:35:34.329 00:30:21 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:35:34.329 00:30:21 chaining -- nvmf/common.sh@117 -- # sync 00:35:34.329 00:30:21 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:35:34.329 00:30:21 chaining -- nvmf/common.sh@120 -- # set +e 00:35:34.329 00:30:21 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:35:34.329 00:30:21 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:35:34.329 rmmod nvme_tcp 00:35:34.329 rmmod nvme_fabrics 00:35:34.329 rmmod nvme_keyring 00:35:34.329 00:30:21 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:35:34.329 00:30:21 chaining -- nvmf/common.sh@124 -- # set -e 00:35:34.329 00:30:21 chaining -- nvmf/common.sh@125 -- # return 0 00:35:34.329 00:30:21 chaining -- nvmf/common.sh@489 -- # '[' -n 3705634 ']' 00:35:34.329 00:30:21 chaining -- nvmf/common.sh@490 -- # killprocess 3705634 00:35:34.329 00:30:21 chaining -- common/autotest_common.sh@948 -- # '[' -z 
3705634 ']' 00:35:34.329 00:30:21 chaining -- common/autotest_common.sh@952 -- # kill -0 3705634 00:35:34.329 00:30:21 chaining -- common/autotest_common.sh@953 -- # uname 00:35:34.329 00:30:21 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:34.329 00:30:21 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3705634 00:35:34.329 00:30:21 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:35:34.329 00:30:21 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:35:34.329 00:30:21 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3705634' 00:35:34.329 killing process with pid 3705634 00:35:34.329 00:30:21 chaining -- common/autotest_common.sh@967 -- # kill 3705634 00:35:34.329 00:30:21 chaining -- common/autotest_common.sh@972 -- # wait 3705634 00:35:34.900 00:30:21 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:35:34.900 00:30:21 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:35:34.900 00:30:21 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:35:34.900 00:30:21 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:35:34.900 00:30:21 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:35:34.900 00:30:21 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:34.900 00:30:21 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:34.900 00:30:21 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:34.900 00:30:21 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:35:34.900 00:30:21 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:35:34.900 00:30:21 chaining -- bdev/chaining.sh@132 -- # bperfpid=3706727 00:35:34.900 00:30:21 chaining -- bdev/chaining.sh@134 -- # waitforlisten 3706727 00:35:34.900 00:30:21 chaining -- bdev/chaining.sh@131 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:35:34.900 00:30:21 chaining -- common/autotest_common.sh@829 -- # '[' -z 3706727 ']' 00:35:34.900 00:30:21 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:34.900 00:30:21 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:34.900 00:30:21 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:34.900 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:34.900 00:30:21 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:34.900 00:30:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:34.900 [2024-07-16 00:30:21.716963] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:35:34.900 [2024-07-16 00:30:21.717031] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3706727 ] 00:35:34.900 [2024-07-16 00:30:21.847759] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:35.157 [2024-07-16 00:30:21.956998] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:35.722 00:30:22 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:35.722 00:30:22 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:35.722 00:30:22 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:35:35.722 00:30:22 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:35.722 00:30:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:35.979 malloc0 00:35:35.979 true 00:35:35.979 true 00:35:35.979 [2024-07-16 00:30:22.798686] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: 
Found key "key0" 00:35:35.979 crypto0 00:35:35.979 [2024-07-16 00:30:22.806712] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:35:35.979 crypto1 00:35:35.979 00:30:22 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:35.979 00:30:22 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:35:35.979 Running I/O for 5 seconds... 00:35:41.261 00:35:41.261 Latency(us) 00:35:41.261 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:41.261 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:41.261 Verification LBA range: start 0x0 length 0x2000 00:35:41.261 crypto1 : 5.02 11433.26 44.66 0.00 0.00 22329.58 6468.12 14189.97 00:35:41.261 =================================================================================================================== 00:35:41.261 Total : 11433.26 44.66 0.00 0.00 22329.58 6468.12 14189.97 00:35:41.261 0 00:35:41.261 00:30:27 chaining -- bdev/chaining.sh@146 -- # killprocess 3706727 00:35:41.261 00:30:27 chaining -- common/autotest_common.sh@948 -- # '[' -z 3706727 ']' 00:35:41.261 00:30:27 chaining -- common/autotest_common.sh@952 -- # kill -0 3706727 00:35:41.261 00:30:27 chaining -- common/autotest_common.sh@953 -- # uname 00:35:41.261 00:30:27 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:41.261 00:30:27 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3706727 00:35:41.261 00:30:28 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:41.261 00:30:28 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:41.261 00:30:28 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3706727' 00:35:41.261 killing process with pid 3706727 00:35:41.261 00:30:28 chaining -- common/autotest_common.sh@967 -- # kill 3706727 00:35:41.261 Received shutdown 
signal, test time was about 5.000000 seconds 00:35:41.261 00:35:41.261 Latency(us) 00:35:41.261 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:41.261 =================================================================================================================== 00:35:41.261 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:41.261 00:30:28 chaining -- common/autotest_common.sh@972 -- # wait 3706727 00:35:41.520 00:30:28 chaining -- bdev/chaining.sh@152 -- # bperfpid=3707522 00:35:41.520 00:30:28 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:35:41.520 00:30:28 chaining -- bdev/chaining.sh@154 -- # waitforlisten 3707522 00:35:41.520 00:30:28 chaining -- common/autotest_common.sh@829 -- # '[' -z 3707522 ']' 00:35:41.520 00:30:28 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:41.520 00:30:28 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:41.520 00:30:28 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:41.520 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:41.520 00:30:28 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:41.520 00:30:28 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:41.520 [2024-07-16 00:30:28.310519] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:35:41.520 [2024-07-16 00:30:28.310590] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3707522 ] 00:35:41.520 [2024-07-16 00:30:28.438467] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:41.778 [2024-07-16 00:30:28.545790] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:42.344 00:30:29 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:42.344 00:30:29 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:42.344 00:30:29 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:35:42.344 00:30:29 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:42.344 00:30:29 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:42.601 malloc0 00:35:42.601 true 00:35:42.601 true 00:35:42.601 [2024-07-16 00:30:29.371981] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:35:42.601 [2024-07-16 00:30:29.372030] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:42.601 [2024-07-16 00:30:29.372058] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc3d730 00:35:42.601 [2024-07-16 00:30:29.372076] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:42.601 [2024-07-16 00:30:29.373246] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:42.602 [2024-07-16 00:30:29.373276] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:35:42.602 pt0 00:35:42.602 [2024-07-16 00:30:29.380013] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:42.602 crypto0 00:35:42.602 [2024-07-16 00:30:29.388030] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:35:42.602 crypto1 00:35:42.602 00:30:29 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:42.602 00:30:29 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:35:42.602 Running I/O for 5 seconds... 00:35:47.871 00:35:47.871 Latency(us) 00:35:47.871 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:47.871 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:47.871 Verification LBA range: start 0x0 length 0x2000 00:35:47.871 crypto1 : 5.02 8914.15 34.82 0.00 0.00 28629.87 6553.60 17210.32 00:35:47.871 =================================================================================================================== 00:35:47.871 Total : 8914.15 34.82 0.00 0.00 28629.87 6553.60 17210.32 00:35:47.871 0 00:35:47.871 00:30:34 chaining -- bdev/chaining.sh@167 -- # killprocess 3707522 00:35:47.871 00:30:34 chaining -- common/autotest_common.sh@948 -- # '[' -z 3707522 ']' 00:35:47.871 00:30:34 chaining -- common/autotest_common.sh@952 -- # kill -0 3707522 00:35:47.871 00:30:34 chaining -- common/autotest_common.sh@953 -- # uname 00:35:47.871 00:30:34 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:47.871 00:30:34 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3707522 00:35:47.871 00:30:34 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:47.871 00:30:34 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:47.871 00:30:34 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3707522' 00:35:47.871 killing process with pid 3707522 00:35:47.871 00:30:34 chaining -- common/autotest_common.sh@967 -- # kill 3707522 00:35:47.871 Received shutdown signal, test time was about 5.000000 seconds 00:35:47.871 00:35:47.871 Latency(us) 00:35:47.871 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:47.871 
=================================================================================================================== 00:35:47.871 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:47.871 00:30:34 chaining -- common/autotest_common.sh@972 -- # wait 3707522 00:35:47.871 00:30:34 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:35:47.871 00:30:34 chaining -- bdev/chaining.sh@170 -- # killprocess 3707522 00:35:47.871 00:30:34 chaining -- common/autotest_common.sh@948 -- # '[' -z 3707522 ']' 00:35:47.871 00:30:34 chaining -- common/autotest_common.sh@952 -- # kill -0 3707522 00:35:47.871 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (3707522) - No such process 00:35:47.871 00:30:34 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 3707522 is not found' 00:35:47.871 Process with pid 3707522 is not found 00:35:47.871 00:30:34 chaining -- bdev/chaining.sh@171 -- # wait 3707522 00:35:47.871 00:30:34 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:35:47.871 00:30:34 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:35:47.872 00:30:34 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:35:47.872 00:30:34 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:35:47.872 00:30:34 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:35:47.872 00:30:34 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:35:47.872 00:30:34 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:47.872 00:30:34 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:47.872 00:30:34 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:48.133 00:30:34 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:35:48.133 00:30:34 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:35:48.133 00:30:34 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:35:48.134 00:30:34 chaining 
-- common/autotest_common.sh@10 -- # set +x 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@296 -- # e810=() 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@297 -- # x722=() 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@298 -- # mlx=() 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:35:48.134 00:30:34 
chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@336 -- # return 1 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:35:48.134 WARNING: No supported devices were found, fallback requested for tcp test 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:35:48.134 00:30:34 chaining -- 
nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:35:48.134 Cannot find device "nvmf_tgt_br" 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@155 -- # true 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:35:48.134 Cannot find device "nvmf_tgt_br2" 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@156 -- # true 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:35:48.134 Cannot find device "nvmf_tgt_br" 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@158 -- # true 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:35:48.134 Cannot find device "nvmf_tgt_br2" 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@159 -- # true 00:35:48.134 00:30:34 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:35:48.134 00:30:35 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:35:48.134 00:30:35 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:35:48.134 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:48.134 00:30:35 chaining -- nvmf/common.sh@162 -- # true 00:35:48.134 00:30:35 
chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:35:48.134 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:48.134 00:30:35 chaining -- nvmf/common.sh@163 -- # true 00:35:48.134 00:30:35 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:35:48.134 00:30:35 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:35:48.134 00:30:35 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:35:48.134 00:30:35 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:35:48.411 00:30:35 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:35:48.411 00:30:35 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:35:48.411 00:30:35 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:35:48.411 00:30:35 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:35:48.411 00:30:35 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:35:48.411 00:30:35 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:35:48.411 00:30:35 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:35:48.411 00:30:35 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:35:48.411 00:30:35 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:35:48.411 00:30:35 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:35:48.411 00:30:35 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:35:48.411 00:30:35 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:35:48.411 00:30:35 chaining -- nvmf/common.sh@192 -- # ip link 
add nvmf_br type bridge 00:35:48.411 00:30:35 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:35:48.411 00:30:35 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:35:48.411 00:30:35 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:35:48.670 00:30:35 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:35:48.670 00:30:35 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:35:48.670 00:30:35 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:35:48.670 00:30:35 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:35:48.670 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:35:48.670 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.148 ms 00:35:48.670 00:35:48.670 --- 10.0.0.2 ping statistics --- 00:35:48.670 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:48.670 rtt min/avg/max/mdev = 0.148/0.148/0.148/0.000 ms 00:35:48.670 00:30:35 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:35:48.670 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:35:48.670 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.080 ms 00:35:48.670 00:35:48.670 --- 10.0.0.3 ping statistics --- 00:35:48.670 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:48.670 rtt min/avg/max/mdev = 0.080/0.080/0.080/0.000 ms 00:35:48.670 00:30:35 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:35:48.670 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:35:48.670 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.059 ms 00:35:48.670 00:35:48.670 --- 10.0.0.1 ping statistics --- 00:35:48.670 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:48.670 rtt min/avg/max/mdev = 0.059/0.059/0.059/0.000 ms 00:35:48.670 00:30:35 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:35:48.670 00:30:35 chaining -- nvmf/common.sh@433 -- # return 0 00:35:48.670 00:30:35 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:35:48.670 00:30:35 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:35:48.670 00:30:35 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:35:48.670 00:30:35 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:35:48.670 00:30:35 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:35:48.670 00:30:35 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:35:48.670 00:30:35 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:35:48.928 00:30:35 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:35:48.928 00:30:35 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:35:48.928 00:30:35 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:35:48.928 00:30:35 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:48.928 00:30:35 chaining -- nvmf/common.sh@481 -- # nvmfpid=3708735 00:35:48.928 00:30:35 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:35:48.928 00:30:35 chaining -- nvmf/common.sh@482 -- # waitforlisten 3708735 00:35:48.928 00:30:35 chaining -- common/autotest_common.sh@829 -- # '[' -z 3708735 ']' 00:35:48.928 00:30:35 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:48.928 00:30:35 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:48.928 00:30:35 
chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:48.928 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:48.928 00:30:35 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:48.928 00:30:35 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:48.928 [2024-07-16 00:30:35.740313] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:35:48.928 [2024-07-16 00:30:35.740390] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:35:49.188 [2024-07-16 00:30:35.883095] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:49.188 [2024-07-16 00:30:36.022702] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:35:49.188 [2024-07-16 00:30:36.022762] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:35:49.188 [2024-07-16 00:30:36.022780] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:35:49.188 [2024-07-16 00:30:36.022797] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:35:49.188 [2024-07-16 00:30:36.022811] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:35:49.188 [2024-07-16 00:30:36.022847] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:49.446 00:30:36 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:49.446 00:30:36 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:49.446 00:30:36 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:35:49.446 00:30:36 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:35:49.446 00:30:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:49.446 00:30:36 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:35:49.446 00:30:36 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:35:49.446 00:30:36 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:49.446 00:30:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:49.446 malloc0 00:35:49.446 [2024-07-16 00:30:36.263999] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:35:49.446 [2024-07-16 00:30:36.280274] tcp.c: 981:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:35:49.446 00:30:36 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:49.446 00:30:36 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:35:49.446 00:30:36 chaining -- bdev/chaining.sh@189 -- # bperfpid=3708770 00:35:49.447 00:30:36 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:35:49.447 00:30:36 chaining -- bdev/chaining.sh@191 -- # waitforlisten 3708770 /var/tmp/bperf.sock 00:35:49.447 00:30:36 chaining -- common/autotest_common.sh@829 -- # '[' -z 3708770 ']' 00:35:49.447 00:30:36 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:35:49.447 00:30:36 chaining -- common/autotest_common.sh@834 -- # 
local max_retries=100 00:35:49.447 00:30:36 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:35:49.447 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:35:49.447 00:30:36 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:49.447 00:30:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:49.447 [2024-07-16 00:30:36.355056] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 00:35:49.447 [2024-07-16 00:30:36.355123] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3708770 ] 00:35:49.704 [2024-07-16 00:30:36.485392] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:49.704 [2024-07-16 00:30:36.587204] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:50.673 00:30:37 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:50.673 00:30:37 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:50.673 00:30:37 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:35:50.673 00:30:37 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:35:50.931 [2024-07-16 00:30:37.692838] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:50.931 nvme0n1 00:35:50.931 true 00:35:50.931 crypto0 00:35:50.931 00:30:37 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:35:50.931 Running I/O for 5 seconds... 
00:35:56.234 00:35:56.234 Latency(us) 00:35:56.234 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:56.234 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:56.234 Verification LBA range: start 0x0 length 0x2000 00:35:56.234 crypto0 : 5.03 6860.25 26.80 0.00 0.00 37183.78 3675.71 27240.18 00:35:56.234 =================================================================================================================== 00:35:56.234 Total : 6860.25 26.80 0.00 0.00 37183.78 3675.71 27240.18 00:35:56.234 0 00:35:56.234 00:30:42 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:35:56.234 00:30:42 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:35:56.234 00:30:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:56.234 00:30:42 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:56.234 00:30:42 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:56.234 00:30:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:56.234 00:30:42 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:56.234 00:30:42 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:35:56.234 00:30:42 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:56.234 00:30:42 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:56.234 00:30:43 chaining -- bdev/chaining.sh@205 -- # sequence=68964 00:35:56.234 00:30:43 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:35:56.234 00:30:43 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:35:56.234 00:30:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:56.234 00:30:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:56.234 00:30:43 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:56.234 00:30:43 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:56.234 00:30:43 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:56.234 00:30:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:56.234 00:30:43 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:56.234 00:30:43 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:56.500 00:30:43 chaining -- bdev/chaining.sh@206 -- # encrypt=34482 00:35:56.500 00:30:43 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:35:56.500 00:30:43 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:35:56.500 00:30:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:56.500 00:30:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:56.500 00:30:43 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:56.500 00:30:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:56.500 00:30:43 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:56.500 00:30:43 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:56.500 00:30:43 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:56.500 00:30:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:56.761 00:30:43 chaining -- bdev/chaining.sh@207 -- # decrypt=34482 00:35:56.761 00:30:43 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:35:56.761 00:30:43 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:35:56.761 00:30:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:56.761 00:30:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:56.761 00:30:43 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:35:56.761 00:30:43 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:56.761 00:30:43 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:35:56.761 00:30:43 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:56.761 00:30:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:35:56.761 00:30:43 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:57.019 00:30:43 chaining -- bdev/chaining.sh@208 -- # crc32c=68964 00:35:57.019 00:30:43 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:35:57.019 00:30:43 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:35:57.019 00:30:43 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:35:57.019 00:30:43 chaining -- bdev/chaining.sh@214 -- # killprocess 3708770 00:35:57.019 00:30:43 chaining -- common/autotest_common.sh@948 -- # '[' -z 3708770 ']' 00:35:57.019 00:30:43 chaining -- common/autotest_common.sh@952 -- # kill -0 3708770 00:35:57.019 00:30:43 chaining -- common/autotest_common.sh@953 -- # uname 00:35:57.019 00:30:43 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:57.020 00:30:43 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3708770 00:35:57.020 00:30:43 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:57.020 00:30:43 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:57.020 00:30:43 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3708770' 00:35:57.020 killing process with pid 3708770 00:35:57.020 00:30:43 chaining -- common/autotest_common.sh@967 -- # kill 3708770 00:35:57.020 Received shutdown signal, test time was about 5.000000 seconds 00:35:57.020 00:35:57.020 Latency(us) 00:35:57.020 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:57.020 
=================================================================================================================== 00:35:57.020 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:57.020 00:30:43 chaining -- common/autotest_common.sh@972 -- # wait 3708770 00:35:57.277 00:30:44 chaining -- bdev/chaining.sh@219 -- # bperfpid=3709829 00:35:57.277 00:30:44 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:35:57.277 00:30:44 chaining -- bdev/chaining.sh@221 -- # waitforlisten 3709829 /var/tmp/bperf.sock 00:35:57.278 00:30:44 chaining -- common/autotest_common.sh@829 -- # '[' -z 3709829 ']' 00:35:57.278 00:30:44 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:35:57.278 00:30:44 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:57.278 00:30:44 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:35:57.278 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:35:57.278 00:30:44 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:57.278 00:30:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:57.278 [2024-07-16 00:30:44.223649] Starting SPDK v24.09-pre git sha1 406b3b1b5 / DPDK 24.03.0 initialization... 
00:35:57.278 [2024-07-16 00:30:44.223718] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3709829 ] 00:35:57.536 [2024-07-16 00:30:44.351137] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:57.536 [2024-07-16 00:30:44.448145] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:58.472 00:30:45 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:58.472 00:30:45 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:58.472 00:30:45 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:35:58.472 00:30:45 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:35:58.730 [2024-07-16 00:30:45.561720] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:58.730 nvme0n1 00:35:58.730 true 00:35:58.730 crypto0 00:35:58.730 00:30:45 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:35:58.989 Running I/O for 5 seconds... 
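The bdevperf process above is started with `--wait-for-rpc` and the harness blocks in `waitforlisten` until the tool is accepting RPCs on its UNIX socket before `perform_tests` is sent. A minimal sketch of that wait loop, hedged: the function name mirrors the log but the body is a simplification, and a plain file existence check stands in for probing a live `/var/tmp/bperf.sock` socket.

```shell
#!/bin/bash
# Sketch of the waitforlisten pattern: poll until the server's RPC
# endpoint appears, giving up after max_retries attempts.
# (A regular file stands in for the real UNIX domain socket here.)
waitforlisten() {
    local sock=$1 max_retries=${2:-100} i=0
    while [ ! -e "$sock" ]; do
        i=$((i + 1))
        if [ "$i" -ge "$max_retries" ]; then
            echo "timed out waiting for $sock" >&2
            return 1
        fi
        sleep 0.1
    done
    echo "listening on $sock"
}

# Demo: create the "socket" in the background after a short delay,
# the way bdevperf opens its socket some time after being spawned.
sock=$(mktemp -u)
( sleep 0.3; touch "$sock" ) &
waitforlisten "$sock"
rm -f "$sock"
```

Once the function returns 0, the harness can safely drive the process over the socket with `rpc.py -s /var/tmp/bperf.sock`.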
00:36:04.259 00:36:04.259 Latency(us) 00:36:04.259 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:04.259 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:36:04.259 Verification LBA range: start 0x0 length 0x200 00:36:04.259 crypto0 : 5.01 1643.54 102.72 0.00 0.00 19092.29 1367.71 20515.62 00:36:04.259 =================================================================================================================== 00:36:04.259 Total : 1643.54 102.72 0.00 0.00 19092.29 1367.71 20515.62 00:36:04.259 0 00:36:04.259 00:30:50 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:36:04.259 00:30:50 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:36:04.259 00:30:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:04.259 00:30:50 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:04.259 00:30:50 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:04.259 00:30:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:04.259 00:30:50 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:04.259 00:30:50 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:36:04.259 00:30:50 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:04.259 00:30:50 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:04.518 00:30:51 chaining -- bdev/chaining.sh@233 -- # sequence=16452 00:36:04.518 00:30:51 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:36:04.518 00:30:51 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:36:04.518 00:30:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:04.518 00:30:51 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:04.518 00:30:51 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:04.518 00:30:51 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:04.518 00:30:51 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:04.518 00:30:51 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:04.518 00:30:51 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:04.518 00:30:51 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:04.777 00:30:51 chaining -- bdev/chaining.sh@234 -- # encrypt=8226 00:36:04.777 00:30:51 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:36:04.777 00:30:51 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:36:04.777 00:30:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:04.777 00:30:51 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:04.777 00:30:51 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:04.777 00:30:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:04.777 00:30:51 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:04.777 00:30:51 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:04.777 00:30:51 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:04.777 00:30:51 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:05.036 00:30:51 chaining -- bdev/chaining.sh@235 -- # decrypt=8226 00:36:05.036 00:30:51 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:36:05.036 00:30:51 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:36:05.036 00:30:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:05.036 00:30:51 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:05.036 00:30:51 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:36:05.036 00:30:51 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:05.036 00:30:51 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:36:05.036 00:30:51 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:05.036 00:30:51 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:05.036 00:30:51 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:36:05.295 00:30:52 chaining -- bdev/chaining.sh@236 -- # crc32c=16452 00:36:05.295 00:30:52 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:36:05.295 00:30:52 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:36:05.295 00:30:52 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:36:05.295 00:30:52 chaining -- bdev/chaining.sh@242 -- # killprocess 3709829 00:36:05.295 00:30:52 chaining -- common/autotest_common.sh@948 -- # '[' -z 3709829 ']' 00:36:05.295 00:30:52 chaining -- common/autotest_common.sh@952 -- # kill -0 3709829 00:36:05.295 00:30:52 chaining -- common/autotest_common.sh@953 -- # uname 00:36:05.295 00:30:52 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:05.295 00:30:52 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3709829 00:36:05.295 00:30:52 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:36:05.295 00:30:52 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:36:05.295 00:30:52 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3709829' 00:36:05.295 killing process with pid 3709829 00:36:05.295 00:30:52 chaining -- common/autotest_common.sh@967 -- # kill 3709829 00:36:05.295 Received shutdown signal, test time was about 5.000000 seconds 00:36:05.295 00:36:05.295 Latency(us) 00:36:05.295 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:05.295 
=================================================================================================================== 00:36:05.295 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:36:05.295 00:30:52 chaining -- common/autotest_common.sh@972 -- # wait 3709829 00:36:05.554 00:30:52 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:36:05.554 00:30:52 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:36:05.554 00:30:52 chaining -- nvmf/common.sh@117 -- # sync 00:36:05.554 00:30:52 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:36:05.554 00:30:52 chaining -- nvmf/common.sh@120 -- # set +e 00:36:05.554 00:30:52 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:36:05.554 00:30:52 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:36:05.554 rmmod nvme_tcp 00:36:05.554 rmmod nvme_fabrics 00:36:05.554 rmmod nvme_keyring 00:36:05.554 00:30:52 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:36:05.554 00:30:52 chaining -- nvmf/common.sh@124 -- # set -e 00:36:05.554 00:30:52 chaining -- nvmf/common.sh@125 -- # return 0 00:36:05.554 00:30:52 chaining -- nvmf/common.sh@489 -- # '[' -n 3708735 ']' 00:36:05.554 00:30:52 chaining -- nvmf/common.sh@490 -- # killprocess 3708735 00:36:05.554 00:30:52 chaining -- common/autotest_common.sh@948 -- # '[' -z 3708735 ']' 00:36:05.554 00:30:52 chaining -- common/autotest_common.sh@952 -- # kill -0 3708735 00:36:05.554 00:30:52 chaining -- common/autotest_common.sh@953 -- # uname 00:36:05.554 00:30:52 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:05.554 00:30:52 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3708735 00:36:05.554 00:30:52 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:36:05.554 00:30:52 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:36:05.554 00:30:52 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3708735' 00:36:05.554 killing process with pid 
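The chaining test above pulls `accel_get_stats` over the bperf socket and asserts that every executed sequence accounts for exactly one encrypt-or-decrypt plus one crc32c operation (8226 + 8226 == 16452 in this run). A sketch of that consistency check follows; the values are hardcoded from this log, whereas a live run would extract them with `rpc.py -s /var/tmp/bperf.sock accel_get_stats` piped through `jq`.

```shell
#!/bin/bash
# Sketch of the chaining.sh stat checks (bdev/chaining.sh@238-240):
# sequences must have run, and encrypt+decrypt must match both the
# sequence count and the crc32c count. Values taken from the log.
sequence=16452
encrypt=8226
decrypt=8226
crc32c=16452

check_chaining_stats() {
    (( sequence > 0 ))                   || { echo "no sequences executed"; return 1; }
    (( encrypt + decrypt == sequence ))  || { echo "encrypt/decrypt mismatch"; return 1; }
    (( encrypt + decrypt == crc32c ))    || { echo "crc32c mismatch"; return 1; }
    echo "stats consistent"
}
check_chaining_stats
```

If any of the three arithmetic checks fails, the surrounding `run_test` wrapper records a nonzero status for the chaining suite.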
3708735 00:36:05.554 00:30:52 chaining -- common/autotest_common.sh@967 -- # kill 3708735 00:36:05.554 00:30:52 chaining -- common/autotest_common.sh@972 -- # wait 3708735 00:36:06.122 00:30:52 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:36:06.122 00:30:52 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:36:06.122 00:30:52 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:36:06.122 00:30:52 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:36:06.122 00:30:52 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:36:06.122 00:30:52 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:36:06.122 00:30:52 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:36:06.122 00:30:52 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:36:06.122 00:30:52 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:36:06.122 00:30:52 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:36:06.122 00:36:06.122 real 0m47.020s 00:36:06.122 user 1m0.657s 00:36:06.122 sys 0m14.114s 00:36:06.122 00:30:52 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:06.122 00:30:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:06.122 ************************************ 00:36:06.122 END TEST chaining 00:36:06.122 ************************************ 00:36:06.122 00:30:52 -- common/autotest_common.sh@1142 -- # return 0 00:36:06.122 00:30:52 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:36:06.122 00:30:52 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:36:06.122 00:30:52 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:36:06.122 00:30:52 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:36:06.122 00:30:52 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:36:06.122 00:30:52 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:36:06.122 00:30:52 -- common/autotest_common.sh@722 -- # xtrace_disable 00:36:06.122 
00:30:52 -- common/autotest_common.sh@10 -- # set +x 00:36:06.122 00:30:52 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:36:06.122 00:30:52 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:36:06.122 00:30:52 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:36:06.122 00:30:52 -- common/autotest_common.sh@10 -- # set +x 00:36:11.391 INFO: APP EXITING 00:36:11.391 INFO: killing all VMs 00:36:11.391 INFO: killing vhost app 00:36:11.391 INFO: EXIT DONE 00:36:14.687 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:36:14.687 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:36:14.687 Waiting for block devices as requested 00:36:14.687 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:36:14.687 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:36:14.687 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:36:14.946 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:36:14.946 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:36:14.946 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:36:15.205 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:36:15.205 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:36:15.205 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:36:15.464 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:36:15.464 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:36:15.464 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:36:15.723 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:36:15.723 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:36:15.723 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:36:15.983 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:36:15.983 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:36:20.241 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:36:20.241 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:36:20.241 Cleaning 00:36:20.241 Removing: /var/run/dpdk/spdk0/config 00:36:20.241 Removing: 
/var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:36:20.241 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:36:20.241 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:36:20.241 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:36:20.241 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:36:20.241 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:36:20.241 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:36:20.241 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:36:20.241 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:36:20.241 Removing: /var/run/dpdk/spdk0/hugepage_info 00:36:20.241 Removing: /dev/shm/nvmf_trace.0 00:36:20.241 Removing: /dev/shm/spdk_tgt_trace.pid3447063 00:36:20.241 Removing: /var/run/dpdk/spdk0 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3446203 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3447063 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3447597 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3448332 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3448514 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3449280 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3449455 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3449742 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3452371 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3453886 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3454115 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3454357 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3454766 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3455007 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3455204 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3455404 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3455672 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3456382 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3459073 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3459274 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3459505 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3459867 00:36:20.241 Removing: 
/var/run/dpdk/spdk_pid3459917 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3460139 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3460341 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3460539 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3460736 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3460964 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3461266 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3461492 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3461683 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3461889 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3462086 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3462363 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3462637 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3462833 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3463034 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3463237 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3463454 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3463787 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3463987 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3464193 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3464385 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3464584 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3464959 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3465356 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3465736 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3466391 00:36:20.241 Removing: /var/run/dpdk/spdk_pid3466756 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3466963 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3467325 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3467696 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3467760 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3468179 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3468576 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3468858 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3469053 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3473353 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3475060 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3476692 
00:36:20.501 Removing: /var/run/dpdk/spdk_pid3477551 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3478724 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3479042 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3479114 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3479142 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3483082 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3483469 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3484472 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3484736 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3490403 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3492503 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3493696 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3497944 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3499575 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3500546 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3504611 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3507040 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3507972 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3517755 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3520485 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3521464 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3531188 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3533234 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3534217 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3544479 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3548939 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3549993 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3560996 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3563396 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3564580 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3576865 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3579463 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3580780 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3591694 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3595544 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3596528 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3597675 00:36:20.501 Removing: 
/var/run/dpdk/spdk_pid3601214 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3606563 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3609251 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3613734 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3617132 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3622548 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3625724 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3632530 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3634778 00:36:20.501 Removing: /var/run/dpdk/spdk_pid3640890 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3643304 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3649596 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3652866 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3657356 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3657711 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3658070 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3658423 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3659000 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3659632 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3660473 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3660910 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3662528 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3664285 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3666050 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3667361 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3668957 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3670561 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3672160 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3673639 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3674192 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3674563 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3677050 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3679100 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3680944 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3682008 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3683238 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3683780 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3683811 
00:36:20.760 Removing: /var/run/dpdk/spdk_pid3684035 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3684241 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3684428 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3685667 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3687180 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3688678 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3689399 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3690274 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3690475 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3690505 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3690688 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3691631 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3692174 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3692544 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3694778 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3696603 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3698414 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3699478 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3700738 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3701502 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3701604 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3705858 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3705994 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3706105 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3706298 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3706506 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3706727 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3707522 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3708770 00:36:20.760 Removing: /var/run/dpdk/spdk_pid3709829 00:36:21.019 Clean 00:36:21.019 00:31:07 -- common/autotest_common.sh@1451 -- # return 0 00:36:21.019 00:31:07 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup 00:36:21.019 00:31:07 -- common/autotest_common.sh@728 -- # xtrace_disable 00:36:21.019 00:31:07 -- common/autotest_common.sh@10 -- # set +x 00:36:21.019 00:31:07 -- spdk/autotest.sh@386 -- # timing_exit autotest 00:36:21.019 00:31:07 -- 
common/autotest_common.sh@728 -- # xtrace_disable 00:36:21.019 00:31:07 -- common/autotest_common.sh@10 -- # set +x 00:36:21.019 00:31:07 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:36:21.019 00:31:07 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:36:21.019 00:31:07 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:36:21.019 00:31:07 -- spdk/autotest.sh@391 -- # hash lcov 00:36:21.019 00:31:07 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:36:21.019 00:31:07 -- spdk/autotest.sh@393 -- # hostname 00:36:21.019 00:31:07 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-50 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:36:21.278 geninfo: WARNING: invalid characters removed from testname! 
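The coverage post-processing that follows captures a per-host test info file, merges it with the baseline, then strips vendored and system paths from the total. Since `lcov` itself is not assumed to be installed here, this is a dry-run sketch that only prints the command sequence; the flag set and filter patterns are the ones visible in this log, the `run` wrapper is illustrative.

```shell
#!/bin/bash
# Dry-run sketch of the lcov flow in this log: capture test coverage,
# append base+test into a total, then remove dpdk/, /usr and example
# code. `run` echoes instead of executing so the flow is visible
# without lcov being present.
run() { echo "lcov $*"; }

out=output
common="--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --no-external -q"

run $common -c -d spdk -t spdk-wfp-50 -o "$out/cov_test.info"
run $common -a "$out/cov_base.info" -a "$out/cov_test.info" -o "$out/cov_total.info"
for pat in '*/dpdk/*' '/usr/*' '*/examples/vmd/*'; do
    run $common -r "$out/cov_total.info" "$pat" -o "$out/cov_total.info"
done
```

Filtering in place with `-r ... -o` on the same file, as the log does, keeps only first-party SPDK sources in the final `cov_total.info`.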
00:36:43.221 00:31:27 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:44.597 00:31:31 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:47.885 00:31:34 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:49.790 00:31:36 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:53.077 00:31:39 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:55.605 00:31:41 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:58.159 00:31:44 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:36:58.159 00:31:44 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:36:58.159 00:31:44 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:36:58.159 00:31:44 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:36:58.159 00:31:44 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:36:58.159 00:31:44 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:58.159 00:31:44 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:58.159 00:31:44 -- paths/export.sh@4 -- $ 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:58.159 00:31:44 -- paths/export.sh@5 -- $ export PATH 00:36:58.159 00:31:44 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:58.159 00:31:44 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:58.159 00:31:44 -- common/autobuild_common.sh@444 -- $ date +%s 00:36:58.159 00:31:44 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1721082704.XXXXXX 00:36:58.159 00:31:44 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1721082704.9d7wLf 00:36:58.159 00:31:44 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:36:58.159 00:31:44 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']' 00:36:58.159 00:31:44 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:36:58.159 00:31:44 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:36:58.159 00:31:44 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude 
/var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:36:58.159 00:31:44 -- common/autobuild_common.sh@460 -- $ get_config_params
00:36:58.159 00:31:44 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:36:58.159 00:31:44 -- common/autotest_common.sh@10 -- $ set +x
00:36:58.160 00:31:44 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:36:58.160 00:31:44 -- common/autobuild_common.sh@462 -- $ start_monitor_resources
00:36:58.160 00:31:44 -- pm/common@17 -- $ local monitor
00:36:58.160 00:31:44 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:58.160 00:31:44 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:58.160 00:31:44 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:58.160 00:31:44 -- pm/common@21 -- $ date +%s
00:36:58.160 00:31:44 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:58.160 00:31:44 -- pm/common@21 -- $ date +%s
00:36:58.160 00:31:44 -- pm/common@25 -- $ sleep 1
00:36:58.160 00:31:44 -- pm/common@21 -- $ date +%s
00:36:58.160 00:31:44 -- pm/common@21 -- $ date +%s
00:36:58.160 00:31:44 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721082704
00:36:58.160 00:31:44 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721082704
00:36:58.160 00:31:44 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721082704
00:36:58.160 00:31:44 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721082704
00:36:58.160 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721082704_collect-cpu-load.pm.log
00:36:58.160 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721082704_collect-vmstat.pm.log
00:36:58.160 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721082704_collect-cpu-temp.pm.log
00:36:58.160 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721082704_collect-bmc-pm.bmc.pm.log
00:36:59.098 00:31:45 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT
00:36:59.098 00:31:45 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72
00:36:59.098 00:31:45 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:36:59.098 00:31:45 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:36:59.098 00:31:45 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:36:59.098 00:31:45 -- spdk/autopackage.sh@19 -- $ timing_finish
00:36:59.098 00:31:45 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:36:59.098 00:31:45 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:36:59.098 00:31:45 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:36:59.098 00:31:45 -- spdk/autopackage.sh@20 -- $ exit 0
00:36:59.098 00:31:45 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources
00:36:59.098 00:31:45 -- pm/common@29 -- $ signal_monitor_resources TERM
00:36:59.098 00:31:45 -- pm/common@40 -- $ local monitor pid pids signal=TERM
00:36:59.098 00:31:45 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:59.098 00:31:45 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]]
00:36:59.098 00:31:45 -- pm/common@44 -- $ pid=3720727
00:36:59.098 00:31:45 -- pm/common@50 -- $ kill -TERM 3720727
00:36:59.098 00:31:45 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:59.098 00:31:45 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]]
00:36:59.098 00:31:45 -- pm/common@44 -- $ pid=3720729
00:36:59.098 00:31:45 -- pm/common@50 -- $ kill -TERM 3720729
00:36:59.098 00:31:45 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:59.098 00:31:45 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]]
00:36:59.098 00:31:45 -- pm/common@44 -- $ pid=3720731
00:36:59.098 00:31:45 -- pm/common@50 -- $ kill -TERM 3720731
00:36:59.098 00:31:45 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:59.098 00:31:45 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]]
00:36:59.098 00:31:45 -- pm/common@44 -- $ pid=3720755
00:36:59.098 00:31:45 -- pm/common@50 -- $ sudo -E kill -TERM 3720755
00:36:59.098 + [[ -n 3330766 ]]
00:36:59.098 + sudo kill 3330766
00:36:59.108 [Pipeline] }
00:36:59.133 [Pipeline] // stage
00:36:59.140 [Pipeline] }
00:36:59.159 [Pipeline] // timeout
00:36:59.166 [Pipeline] }
00:36:59.185 [Pipeline] // catchError
00:36:59.193 [Pipeline] }
00:36:59.211 [Pipeline] // wrap
00:36:59.219 [Pipeline] }
00:36:59.237 [Pipeline] // catchError
00:36:59.247 [Pipeline] stage
00:36:59.249 [Pipeline] { (Epilogue)
00:36:59.262 [Pipeline] catchError
00:36:59.263 [Pipeline] {
00:36:59.305 [Pipeline] echo
00:36:59.308 Cleanup processes
00:36:59.317 [Pipeline] sh
00:36:59.600 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:36:59.600 3720830 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache
00:36:59.600 3721051 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:36:59.615 [Pipeline] sh
00:36:59.898 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:36:59.898 ++ grep -v 'sudo pgrep'
00:36:59.898 ++ awk '{print $1}'
00:36:59.898 + sudo kill -9 3720830
00:36:59.909 [Pipeline] sh
00:37:00.194 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:37:12.417 [Pipeline] sh
00:37:12.700 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:37:12.959 Artifacts sizes are good
00:37:12.974 [Pipeline] archiveArtifacts
00:37:12.981 Archiving artifacts
00:37:13.144 [Pipeline] sh
00:37:13.427 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest
00:37:13.441 [Pipeline] cleanWs
00:37:13.451 [WS-CLEANUP] Deleting project workspace...
00:37:13.451 [WS-CLEANUP] Deferred wipeout is used...
00:37:13.458 [WS-CLEANUP] done
00:37:13.460 [Pipeline] }
00:37:13.482 [Pipeline] // catchError
00:37:13.494 [Pipeline] sh
00:37:13.774 + logger -p user.info -t JENKINS-CI
00:37:13.783 [Pipeline] }
00:37:13.800 [Pipeline] // stage
00:37:13.806 [Pipeline] }
00:37:13.826 [Pipeline] // node
00:37:13.832 [Pipeline] End of Pipeline
00:37:13.866 Finished: SUCCESS